It’s not the hacking problem
According to data-breach reports, 95% of all attacks are simple things, like phishing, SQL injection, and bad passwords – none of which relates to software quality. The other 5% happen because victims are running old, unpatched software. When exploits are used at all, it's overwhelmingly against software that has gone unpatched for a year.
In other words, CyberUL addresses less than 0.1% of real-world attacks.
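To make concrete how "simple" these attacks are: SQL injection is defeated by parameterized queries, essentially a one-line fix that has nothing to do with deep software quality. A minimal sketch using Python's built-in sqlite3 module (the table and login functions are illustrative, not from any real product):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, pw TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_unsafe(name, pw):
    # Vulnerable: attacker input is pasted straight into the SQL string.
    # name = "' OR '1'='1' --" logs in without knowing any password.
    q = f"SELECT * FROM users WHERE name = '{name}' AND pw = '{pw}'"
    return conn.execute(q).fetchone() is not None

def login_safe(name, pw):
    # Parameterized: the driver treats input as data, never as SQL.
    q = "SELECT * FROM users WHERE name = ? AND pw = ?"
    return conn.execute(q, (name, pw)).fetchone() is not None
```

The injection string succeeds against `login_unsafe` but is harmlessly treated as a literal username by `login_safe`.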
It’s not the same quality problem
UL is about accidental failures in electronics. CyberUL would be about intentional attacks against software. These are unrelated issues. Stopping accidental failures is a solved problem in many fields. Stopping attacks is something nobody has solved in any field.
In other words, the UL model of accidents is totally unrelated to the cyber problem of attacks.
Security is a tradeoff
Security experts ignore the costs of fixing security. They assume insecurity is due to moral weakness, and that getting tough is all that's needed.
That’s not true. Improving security comes at great cost, in terms of price, functionality, or usability. Insecurity happens not because people are weak, but because the tradeoffs aren’t worth it. That’s why you have an iPhone, which can get hacked, instead of a 1980s-era feature phone that can do little more than make phone calls – you find the added risk worth the tradeoffs.
The premise of a CyberUL is that people are wrong, that more tradeoffs must be imposed against their will in order to improve cybersecurity, such as increasing the price, removing features, or making products hard to use.
Rules have a cost
Government already has the “Common Criteria” rules. They are all for obviously good things, like masking a password with **** when users type it in. But here’s the thing: while the actual criteria are easy and straightforward, they are buried in layers of bureaucracy. It costs at least $1 million to get a product certified under Common Criteria.
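The password-masking criterion illustrates the mismatch: the requirement itself is a few lines of code, while the certification paperwork around it costs the million dollars. A minimal sketch (the `mask` helper is a hypothetical illustration, not part of any standard):

```python
def mask(secret: str) -> str:
    # Replace every character with '*' so the password
    # never appears on screen or in logs.
    return "*" * len(secret)

print(mask("hunter2"))  # → *******
```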
OPM invested millions in dealing with similar bureaucratic regulations. It’s not that OPM had no security – it’s that its security people spent all their time on bureaucracy. They ignored basic problems like SQLi, phishing, bad passwords, and patching because compliance consumed all their budget and time.
Do you even government?
People believe that wise CyberUL administrators will define what’s right based on their own expertise. This is nonsense – the rules will be designed by whoever spends the most on lobbyists. It’s the same thing that happens in every industry.
As soon as the White House starts a CyberUL, Oracle, Microsoft, and Cisco will show up offering to help. Whatever rules are created will be those that favor those three companies at the expense of smaller ones.
Government doesn’t follow the rules anyway
Government agencies don’t follow the rules anyway. There are so many impossibly onerous rules in government that complaining and getting an exception is the norm. That’s why, for example, the Navy just gave Microsoft $20 million to continue supporting WinXP – a 15-year-old operating system that the rules otherwise forbid.
A CyberUL is an absurd idea, unrelated to the problem it purports to solve. The only reason people take it seriously is that they are secretly fascist at heart. They aren’t interested in solving the problem of cybersecurity, because that’s hard. Instead, they want to tell other people what to do, because that’s easy.
SQLi, phishing, bad passwords, and lack of patches are the Four Horsemen of the cybersecurity apocalypse, not software quality. Unless you are addressing those four things, you are doing essentially nothing to solve the problem.