Today was the 135th and final launch of the space shuttle. Many are mourning the end of an era, but the program has been a boondoggle from the start, sucking the life out of space exploration. At roughly $1 billion per launch, it costs ten times as much to put a payload into orbit with the Shuttle as with a disposable rocket, which is why we buy so many launches from the Russians these days. Over the program’s 40-year life, NASA has spent $211 billion (inflation-adjusted) with no notable accomplishments to show for it.
The problem with the Shuttle has always been that it rests on a moral argument. Everyone knows that “reusable” is morally superior to “disposable”; therefore, a reusable spacecraft must be better than a disposable one. The moral appeal of this argument has blinded people for 40 years to the fact that it simply doesn’t work.
The flaw in the program can be seen in the two Shuttle disasters: the Challenger, which exploded after liftoff, and the Columbia, which burned up on re-entry. The cause of both was the Shuttle’s complexity. Disposable spacecraft are simple and harder to get wrong; a reusable space plane is horribly complex and nearly impossible to get right. Had we ever achieved the thousands of launches envisioned (rather than a mere 135), we would have had many more disasters.
As a risk expert, I was horrified by the finger-pointing after the Columbia tragedy. What everyone points to as the “cause” was foam falling off the tank and striking the heat-resistant tiles; the enormous heat of re-entry burned through the damaged tiles and destroyed the spacecraft. But the real “cause” was the complexity of the tile system itself: more than 20,000 tiles, no two alike, each of which had to be individually inspected, removed, repaired or replaced, and glued back on after every flight.
The tiles alone made the Shuttle too expensive and too risky to operate, and they are just a small part of the Shuttle’s complexity.
Worse than the tiles themselves was the blame game that followed the Columbia disaster. As the news reported, there were people within NASA who had warned management of the risk, but management covered up the problem. Of course this was the case. By any rational risk analysis, the Shuttle was too risky to fly, but NASA was told by Congress and the American people to make it fly. Management therefore had to decide which risks it was willing to live with. If the Shuttle were to keep flying after today, there would be another disaster soon, and its cause would be one of the many other risks management knows about but “ignores”.
This is a useful lesson for cybersecurity. Today’s networks are too complex to secure; getting hacked is as inevitable as a Shuttle blowing up. Despite this, corporations are convinced that they can master the complexity. They believe their firewalls will have no holes hackers can get through. They believe they can control their website code to prevent all SQL injection and cross-site scripting. They believe enough anti-virus software will keep users from infecting themselves with viruses. They believe they can keep all patches up to date all the time. They believe they can isolate critical systems from the Internet so that hackers can’t reach them. When they get hacked, they can always point backwards at the patch they failed to apply, the Web 2.0 code they failed to inspect, or the virus their AV failed to catch. They believe that was the only problem that allowed the hack, and that once it is fixed, they will be secure from then on. They believe that if they “just take security seriously enough”, such problems won’t happen. But they are wrong, as wrong as Shuttle engineers who thought “if we just take safety seriously enough, disasters won’t happen”.
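To make the point concrete, here is a minimal sketch of one of those individual defenses: preventing SQL injection with a parameterized query. The table and function names are purely illustrative (they appear nowhere in the original post), and the point is the opposite of reassurance: this closes exactly one hole, one of thousands a real codebase must get right every time.

```python
import sqlite3

# Illustrative toy database; the schema and data are assumptions
# for the sake of the example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user(name):
    # Vulnerable pattern (commented out): string interpolation lets
    # attacker-supplied input rewrite the query itself.
    #   conn.execute(f"SELECT role FROM users WHERE name = '{name}'")
    # Safer pattern: the ? placeholder keeps the input as data, never as SQL.
    cur = conn.execute("SELECT role FROM users WHERE name = ?", (name,))
    return cur.fetchall()

print(find_user("alice"))        # normal lookup succeeds
print(find_user("' OR '1'='1"))  # classic injection attempt matches nothing
```

Even done perfectly, this is one control on one query; the post’s argument is that every such control must hold simultaneously, forever, across the whole system.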
4 comments:
I apologize if this is an obvious question or one that has been answered previously, but I am new to this blog. You compare security to the re-usable space shuttle, too complex to properly deal with. I do agree with this, yet one part of your equation bothers me. You suggest that disposable rockets are the solution to the complexity of the reusable shuttle, but you do not suggest an analogue in the security industry. What is our alternative to complexity?
This is and has been the exact battle I have been fighting for the last 10 years. My solution is: strive for elegance.
As every universal truth is elegant, so should information security be. If one looks at the [proposed] solution / implementation, it should shine, and its minimalism should be painfully obvious.
Reusable or not.
I applaud your guts, Robert, to address this.
@calcipher: here's my suggestion regarding the alternative to complexity/how to deal with security - promote rational morality and rational enfranchisement of all people; instead of focusing on how to make systems impenetrable, focus on humanity so that individuals will cease in the effort to penetrate; instead of 'us versus them', it may be understood that it's all just 'us' -- how's that idea?
Amen to that.
You also have to factor in that people get excited by complexity, and there is a lot of psychology going on.