Airplanes carry flight recorders made of a nearly indestructible material that can survive a crash. A common joke is to ask "Why can't the entire airplane be made from the same material?" The answer is, of course, that this would make the airplane too heavy to lift off the ground. Planes need to be dangerously flimsy to fly.
Cybersecurity has the same issue. People point out "obvious" solutions to cybersecurity problems while ignoring the costs those solutions would entail.
A good example is this story at The Register, where former cyberczar Richard Clarke outlines his five-point plan for securing the Internet. All his points make about as much sense as building airplanes from cast iron. Consider this quote:
"We should look, as an industry, at improving the quality of secure code, so that we don't need to issue software patches, so there aren't trap doors - intentional or otherwise. This is not a revolutionary idea. We put this in place a long time ago for electrical appliances."
Is this as cost-free and uncontroversial as Clarke pretends? No, of course not, or else such laws would already have been passed.
"Safety" is not "security". Safety protects against ACCIDENTAL problems; security protects against INTENTIONAL problems. Regulations are designed to keep your gas stove from accidentally exploding, but they can't protect you if somebody intentionally rigs your stove to explode. In much the same way, automobile safety standards protect against accidental problems, but do nothing to stop somebody from slashing your tires or cutting your brake lines. Product safety protects against the INANIMATE effects of bad design, bad construction, and wear-and-tear. Security protects against an ANIMATE and unpredictable adversary who is likely more clever than the engineers who designed the product.
In other words, the analogy between "product safety" and "software security" is faulty.
Not only would regulations work less well, they would cost more. Government regulations have hidden costs, which is why economists and politicians spend so much time trying to get rid of them. While you can't see the costs directly, you can see their effects. For example, what appliances do you own that were NOT made by a huge multinational corporation? Small companies cannot afford the heavy costs of regulation. Countries that regulate more innovate less. Regulation favors large companies over small ones.
Think of this another way. Right now, you can start your own software company and (hopefully) make a million dollars. That's because you don't have to worry about government regulations. If Clarke were to get his way, it would take several full-time employees and high-priced lawyers just to deal with the regulations, leaving nobody to actually create your software. Software innovation would grind to a halt in the name of cybersecurity. Only megacorporations would then be able to ship software (which is why companies like Microsoft support such regulation). A good example of this is the "Common Criteria" certification for government software, where it takes at least a million dollars to get certified (and which still fails to produce secure software).
Richard Clarke's remaining proposals are even worse. Clarke's solution to cybersecurity is to convert our free society into a totalitarian police state. He's right, of course, doing so WILL improve cybersecurity, but at a huge cost to civil liberties. Even today, we live in a slight police state: you are more likely to be arrested falsely by the police than you are to be attacked by terrorists. There are many of us who believe that the slight improvement in security is not worth the huge cost in liberty.
Anybody can pass themselves off as a security expert by proposing extreme solutions and silencing their opponents with the accusation that they aren't taking security "seriously" enough. That's not what an expert is. An expert is somebody who can find solutions WITHIN a reasonable set of costs and sacrifices.
Very well said. I love your quote: "Safety protects against ACCIDENTAL problems, security protects against INTENTIONAL problems."
A pity to see that highly paid and widely listened-to people like Richard Clarke are still clowns who talk about stuff they have absolutely no clue about. On the other hand, it does guarantee that people like you and me will have work for as long as we want :P
You must admit though that in software we don't really even know how to protect against the accidental problems, much less the intentional ones. Maybe that would be a good first step?
"...and silencing their opponents with the accusations that they aren't taking security 'seriously' enough."
Of course, that's how many politically-inclined people argue, so not all that surprising considering the source.