Bruce Schneier describes his African safari. He tells how the staff explained that when threatened by different animals, people need to respond differently. You stare down some animals; you avert your gaze from others. You run from some, stand your ground with others. You can climb trees to avoid some animals, but other animals are better climbers than you are. Using the wrong strategy for a given animal can get you killed, such as staring down an animal that takes the stare as a provocation to attack.
Schneier doesn't draw what I think is the obvious parallel to cybersecurity, namely that the details of each threat matter. In order to defend yourself, you need to pay attention to the details.
The Slammer worm drove a lot of sales of anti-virus products. However, anti-virus products do not protect against in-memory worms like Slammer; anti-virus is simply the wrong defense here. The correct defenses include intrusion-prevention systems (IPS), tighter firewall rules, better patching, and better vulnerability scanning. In other words, unless you pay attention to the details of Slammer, you are unlikely to defend yourself well against it.
Corporations have a culture of ignoring the details of cyber-threats. They build their security policies at a high level, from the top down. Of course these policies state that the details must be taken care of eventually, but much of the time their processes never advance that far. When confronted with a cybersecurity failure caused by this lack of attention to detail, they will often go back and redesign their policies from scratch, again at a high level, and again ignoring the low-level details.
A good example of this is the recent fad of "secure development lifecycles", where corporations now pay attention to securing their internal software projects. The low-level details they need to solve are things like "SQL injection" and "cross-site scripting". However, when companies approach the problem from a high level, they often never get as far as these details. Thus, the uptake of "secure development" has not solved the widespread problem of "SQL injection".
It's interesting reading documents on the Internet about the security lifecycle, such as this one. It's full of high-level platitudes but empty of any low-level details. For example, it stresses the importance of teaching software security in universities, but its approach is largely meaningless. A better approach to the problem, one that is focused on the details, is to tell universities to stop teaching students to use "strcpy()" and to start teaching them how hackers "smash the stack".
The successful security lifecycle projects I've seen are those that start with the details, then build upward from there. In other words, a process that focuses first on a detail like "SQL injection" is likely both to solve that problem and to extend to other problems. An example of this is Microsoft: while they do describe the security lifecycle from the high level, their processes were created from low-level details, and the documents they produce show that they pay attention to those details.
You would think that the technical staff would be the leaders in getting their employers to focus on security details, but you'd be wrong. Technical people ignore the details of the business to the same degree that business people ignore the technical details. It's a rare company that has a leader who understands both the business and the security details. In addition, technical staff like to focus on exciting problems, like stopping 0day worms, but the biggest problems are the boring ones (like guessable passwords), and companies are more likely to be hacked through the boring problems than the exciting ones.
So what is the solution? My recommendation is to change the culture so that technical details matter. Any high-level document used in the process should include "use-cases" or something else that points to a low-level detail. This keeps the project anchored in reality and discourages it from drifting off course.
1 comment:
Robert, many (if not most) technical people don't understand the security implications themselves. They still view security as having a firewall and AV, maybe an IDS/IPS thrown in and a few ACLs on routers and folders. We have to start by ensuring that ALL technical people are aware of the security issues and know how to at least do basic configuration and remediation to reduce the risk. Also, having a true Security Guru on staff who can review configs, code, etc., plus make architectural/infrastructural recommendations, will go a long way.