Tuesday, June 30, 2015

CyberUL is a dumb idea

Peiter “mudge” Zatko is leaving Google, asked by the White House to create a sort of cyber “Underwriters Laboratories” (UL) for the government. UL is the organization that certifies electrical devices so that they don’t short out and zap you to death. But here’s the thing: a CyberUL is a dumb idea. It’s the Vogon approach to the problem. It imagines that insecurity comes from a moral weakness that can be solved by getting “serious” about the problem.

It’s not the hacking problem

According to data-breach reports, 95% of all attacks are simple things like phishing, SQL injection (SQLi), and bad passwords – nothing related to software quality. The other 5% happen because victims are running old, unpatched software. When exploits are used, it’s overwhelmingly against software that has remained unpatched for a year.

In other words, CyberUL addresses less than 0.1% of real-world attacks.
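
To make the biggest of those “simple things” concrete, here is what SQLi looks like – a minimal, self-contained Python sketch against an in-memory database, with a hypothetical users table; the vulnerable query and the fix differ by one line:

```python
import sqlite3

# Hypothetical users table, for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'hunter2')")

attacker_input = "' OR '1'='1"

# Vulnerable: attacker input is spliced into the SQL string, so the
# injected OR clause matches every row and bypasses the password check.
rows = conn.execute(
    "SELECT name FROM users WHERE password = '" + attacker_input + "'"
).fetchall()
print(rows)  # [('alice',)] -- login bypassed

# Fixed: a parameterized query treats the input as data, not as SQL.
rows = conn.execute(
    "SELECT name FROM users WHERE password = ?", (attacker_input,)
).fetchall()
print(rows)  # [] -- no match
```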

It’s not the same quality problem

UL is about accidental failures in electronics. CyberUL would be about intentional attacks against software. These are unrelated issues. Stopping accidental failures is a solved problem in many fields. Stopping attacks is something nobody has solved in any field.

In other words, the UL model of accidents is totally unrelated to the cyber problem of attacks.

Security is a tradeoff

Security experts ignore the costs of fixing security. They assume insecurity is due to moral weakness, and that getting tough is all that’s needed.

That’s not true. Improving security comes at great cost in terms of price, functionality, or usability. Insecurity happens not because people are weak, but because the tradeoffs aren’t worth it. That’s why you have an iPhone, which can get hacked, instead of a 1980s-era feature phone that can do little more than make phone calls – you find the added risk worth the tradeoffs.

The premise of a CyberUL is that people are wrong, that more tradeoffs must be imposed against their will in order to improve cybersecurity, such as increasing the price, removing features, or making products hard to use.

Rules have a cost

Government already has the “Common Criteria” rules. They are all for obviously good things, like masking a password with **** as the user types it in. But here’s the thing: while the actual criteria are easy and straightforward, they’re buried in layers of bureaucracy. It costs at least $1 million to get a product certified under the Common Criteria.
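
For a sense of how trivial the individual criteria are: masking a password is a few lines in most languages. A sketch in Python – whose standard-library getpass suppresses echo entirely rather than printing asterisks, meeting the same “don’t display the password” bar:

```python
import getpass

# Read a password from the terminal without echoing it back.
password = getpass.getpass("Password: ")
print("Read", len(password), "characters; nothing was echoed.")
```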

OPM (the Office of Personnel Management) invested millions in dealing with similar bureaucratic regulations. It’s not that they had no security – it’s that their security people spent all their time on bureaucracy. They ignored basic problems like SQLi, phishing, bad passwords, and patches because compliance consumed all their budget and time.

Do you even government?

People believe that wise CyberUL administrators will define what’s right based on their own expertise. This is nonsense – the rules will be designed according to whoever spends the most on lobbyists. It’s the same thing that happens in every industry.

As soon as the White House starts a CyberUL, Oracle, Microsoft, and Cisco will show up offering to help. Whatever rules are created will be those that favor those three companies at the expense of smaller ones.

Government doesn’t follow the rules, anyway

Government agencies don’t follow the rules anyway. There are so many impossibly onerous rules in government that complaining and getting an exception is the norm. That’s why, for example, the Navy just gave Microsoft $20 million to continue supporting WinXP – a 15-year-old operating system – which is otherwise against the rules.

Conclusion

A CyberUL is an absurd idea, unrelated to the problem it purports to solve. The only reason people take it seriously is that they are secretly fascist at heart. They aren’t interested in solving the problem of cybersecurity, because that’s hard. Instead, they want to tell other people what to do, because that’s easy.

SQLi, phishing, bad passwords, and lack of patches are the Four Horsemen of the cybersecurity apocalypse – not software quality. Unless you are addressing those four things, you are doing essentially nothing to solve the problem.

4 comments:

ThingFish said...

Although I agree with your assessment of CyberUL as not cost-effective, and not even trying to solve the correct problem, there's at least one other Great Big reason for something like "CyberUL" - getting something "certified" is a barrier to entry. As you've observed, Common Criteria certification is financially onerous. This keeps out smaller organizations that can't afford the investment, whether they're proprietary startups or open source.

Taken at face value, "CyberUL" is not going to have an effect. But face value isn't where this is aimed. It's aimed at routing government contracts to already established players in the market.

Unknown said...

I like your line of thinking on the broader ramifications of CyberUL (it will become like other govt efforts), but doesn't the idea of the CyberUL affect two of your horsemen: SQLi and patching? If CyberUL has tested a product against SQLi, doesn't that address the issue? Also, a more stringent testing plan by CyberUL could reduce the number of times one would expect to patch, no?

Unknown said...

How will a CyberUL certification be any different from Common Criteria? The difficult issue with Common Criteria has been that the OS, IOS, firmware, and/or feature sets get updated before the initial certification is even completed. Customers want the latest and greatest features built into the latest build while the vendor is chanting "this product is certified". The vendor doesn't mention that the product is only certified running yesterday's feature set.

Rob Fielding said...

SQLi and patching are totally "software quality issues". I don't see how regulation will really help, though. Security will improve quite a bit when it becomes cheaper to make software speak verifiable protocols, and cheaper to use programming tools that rule out large classes of bugs, than to have cowboys write unsafe code (with bug cleanup treated as a separate pipeline stage handled by a different person).
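
One minimal illustration of what "ruling out a large class of bugs" can mean in practice - a hypothetical wrapper (not a real library) whose only query path is parameterized, so string-spliced SQL can't be expressed through it:

```python
import sqlite3
from typing import Any, Sequence

# Hypothetical wrapper, for illustration only: the sole way to run a
# query is a fixed template plus bound parameters, so untrusted input
# never becomes part of the SQL text.
class SafeDB:
    def __init__(self, path: str = ":memory:") -> None:
        self._conn = sqlite3.connect(path)

    def query(self, template: str, params: Sequence[Any] = ()) -> list:
        # Reject templates containing quoted literals; values must be
        # passed through params, where the driver handles escaping.
        if "'" in template or '"' in template:
            raise ValueError("put values in params, not in the template")
        return self._conn.execute(template, params).fetchall()

db = SafeDB()
db.query("CREATE TABLE users (name TEXT)")
db.query("INSERT INTO users VALUES (?)", ("alice",))
print(db.query("SELECT name FROM users WHERE name = ?", ("alice",)))
```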