Ken wrote: ↑Fri Feb 09, 2024 6:20 pm
- Maybe this is because government often doesn't pay as much as the private sector and the best tech people don't want to work for government agencies.
This isn’t it. State-level actors are widely known to deploy the most sophisticated and novel attacks. Although maybe it’s that anyone in government who shows promise is recruited over into the NSA.
- Maybe this is because government budgets don't provide enough money to keep their systems modern and updated. I have walked into some offices before and it looked like they were still using Windows 95. Sometimes even running DOS programs.
- Maybe it is because tech people aren't highly ranked within government agencies. All the bosses and administrators are completely non-techie types and tech people are sort of at the bottom of the totem pole. That is definitely how it is at school districts. A tech person is never going to be made a top administrator at any school district. It will be some random EdD person. Even though education is becoming increasingly technology based. But I don't expect that is the case at places like Google or Apple where the top people, and especially mid level people tend to be tech engineers. And of course actual tech companies don't tend to get taken down by ransomware either.
Probably it is a combination of all three. I'm sure Josh has opinions.
I think these are a bit closer. One reason governments are slow to change is that they have critical systems running that can't be allowed to break, because people rely on them. Some may even be required by law to stay available. When something can't break, you just don't touch it if you don't have to. (To be fair, any large codebase probably has some parts "we just don't touch.")
Developing a replacement that meets that standard costs a lot more than regular software development. An example of this is that the computer chips in NASA's satellites are very expensive for very little processing power—an equivalent consumer chip would cost a few dollars or less. However, they also have to have a ridiculously low crash rate, because when hardware is fifty million miles from Earth your options for recovering from a problem are pretty limited. I wish I could find the article I read this in, but here's a different one that touches on some similar things.
Another factor is that older things are often more secure simply by virtue of their age. If something has had its weaknesses probed for twenty-five years and held up, it's often better to go with what's old and tested rather than something new, which might have a catastrophic weakness that no one has discovered yet—or no one except a hostile state that's keeping it in their back pocket. Which would you trust the nuclear codes to?
Using old tech has an additional advantage. I recently saw an article about a malware attack that failed because the systems it tried to attack were so old that some of the things the attackers were counting on just weren't available.