- cross-posted to:
- [email protected]
- [email protected]
- [email protected]
- [email protected]
“We have a technical debt that stretches back many decades.”
They adopted the system in 1998, when floppies were already obsolete. Oof.
It’s called proven technology sweaty.
CDs were released in 1982 and were pretty damn stable. One year after 3.5" floppies… six years after 5.25" floppies.
ISO 9660 wasn’t around until '88, and even then, its read-only capability paired with high costs wouldn’t make it viable until maybe a decade later … ironically, around the time the system was deployed.
I mean, 1989 was the last time I used a 5.25 in elementary school, before everything was switched to 3.5 with the IBM Model 30.
I know I was still using 5¼" floppies at least a bit into the early '90s, though it’s been long enough that exact years elude me.
I was also still developing technology that used 3½" diskettes well into the first decade of the new millennium - though I finally managed to migrate newer systems to CD-R around the end of that decade.
That’s a measure of success to me. Systems that have run for so long and continue to run effectively are to be commended, not ridiculed.
The article quotes the agency as saying the floppies aren't the biggest worry. I wish it went into more detail about what shortcomings the current system has and what improvements a shift to more modern infra would bring.
I like the part of the quote you omitted:
“We have to maintain programmers who are experts in the programming languages of the '90s in order to keep running our current system, so we have a technical debt that stretches back many decades,” Tumlin told San Francisco’s KQED in February 2023.
They say that as if the most popular languages of the '90s aren't still in common use today. I guess what he really means is that they managed to pick something that was obscure proprietary garbage even back then, and should've known better.
Well, it’s increasingly difficult to find specific C experts.
Edit: downvotes from people who obviously don’t work in my field
FORTH
What, it's not like they're near a technical mecca of any kind out there in Cali; of course they'd be decades out of date!
I'm 20 minutes away from multiple population centers in the Bay Area, California, and on a good night I get 4 MB/s download. We need public energy and data ASAP; private oligarchs are fucking us over so hard.
There aren’t even decent broadband options in Silicon Valley. None of the “innovations” actually make it out of the plush offices into the community.
Sonic is offered around the Bay Area and Silicon Valley. It's fantastic. I sadly don't have it in my current place, but previously had their gigabit fiber — symmetric, uncapped, reliable, and north of 900Mbps on iperf (fast.com would claim 1.0Gbps).
Yeah, finally, but obviously not ubiquitously. They never offered it in any of the places I lived there, either.
Or you could just virtualize the machine in an emulator and run it from disk images. 🤷‍♂️
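For what it's worth, this is a sketch of what that could look like, assuming the legacy software boots on commodity PC hardware and QEMU is installed; the device path and filenames are just examples:

```shell
# Image a physical 3.5" floppy (1.44 MB) into a raw file,
# assuming the drive appears as /dev/fd0 on Linux
dd if=/dev/fd0 of=legacy.img bs=512 conv=noerror,sync

# Boot that image in QEMU's i386 system emulator,
# telling the BIOS to boot from the floppy ("a" drive)
qemu-system-i386 -fda legacy.img -boot a
```

From there the image file can be backed up, versioned, and copied like any other file, which is most of the point of virtualizing it.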
Too hard for the "fake it 'til you make it"ers of old
Yeah, that’s what I want in safety-critical infrastructure: more abstractions and points of failure. Let’s slap that on a RasPi while we’re at it
Cloud workstation
AWS lambda