ME: "Hi, you left your headlights on."
NEIGHBOR: "WHO SENT YOU? DID MY EX-WIFE SEND YOU? ARE YOU SLEEPING WITH HER?"

The intent of this is to show how badly people react to "friendly" disclosure of vulns. However, vuln disclosure isn't friendly. It is an inherently rude act. It's more like writing "WASH ME" in the dirt covering a person's car. Your standard of cleanliness may differ from your neighbor's. Pointing this out will always get a hostile reaction.
What vuln researchers expect is something like the following scenario:
ME: "Your hairstyle is ugly."
NEIGHBOR: "Thank you for telling me. I'll go to the hairdresser and fix it immediately."

That's not going to happen. I mean, try it with your neighbor. I'd bet that you'll get a hostile reaction rather than the desired one above.
The reason vuln disclosure is rude is because most people don't care about fixing security problems. This includes not only the vendor, but also the customer. Sure, the customers care when vulns cause a problem, but frankly, most vulns don't cause problems. For the average small software vendor, no hacker will ever exploit even obvious vulns in their products. I know this because people hire us all the time to find vulns. We find the most shockingly obvious things that have been in their products for years, yet going back through logs we fail to find any evidence that malicious hackers have found and exploited those problems.
Security is a tradeoff. When a small company fixes a vuln, it comes at a huge cost. It means fixing something that their customers largely do not care about (and would not pay for) at the expense of adding new features that customers would want (and pay for). When a researcher publishes a vuln in a product, a previously harmless vuln becomes harmful, forcing a tradeoff that neither the vendor nor the customer wanted.
It's hard for vuln researchers to grasp the enormity of the tradeoff. For us, security is easy. We can pentest a product and come up with an obvious vuln within seconds. Here's a real scenario: we were asked to look at a product, so we put a packet-sniffer on the wire and within seconds found a secret password being sent in the clear that would allow us to own all their customers. Yet, for the small software vendor, this is black magic. Nobody in the company knows how to use a packet-sniffer, nor is there any budget to hire somebody who can. Nobody really understands how to fix the problem in a way that isn't equally insecure. Their development process is so crappy that they can't track this bug and make sure it's fixed.
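To illustrate how trivial this kind of finding is: once the traffic is captured, spotting a cleartext password is little more than a string search. Here's a minimal sketch in Python. The payload and the `USER`/`PASS` markers below are fabricated for illustration (they mimic FTP-style logins); a real assessment would capture live traffic with a tool like tcpdump or Wireshark.

```python
# Minimal sketch: scanning a captured packet payload for cleartext credentials.
# The payload below is fabricated; in practice it would come from a packet
# capture (e.g. tcpdump/Wireshark), not a hardcoded byte string.

def find_cleartext_credentials(payload: bytes) -> list[str]:
    """Return any lines in the payload that look like cleartext logins."""
    markers = (b"USER ", b"PASS ", b"password=")  # hypothetical patterns
    hits = []
    for line in payload.splitlines():
        if any(m in line for m in markers):
            hits.append(line.decode(errors="replace"))
    return hits

# Fabricated example of what a sniffer might show on the wire:
captured = b"220 service ready\r\nUSER admin\r\nPASS s3cret\r\n230 ok\r\n"
print(find_cleartext_credentials(captured))  # -> ['USER admin', 'PASS s3cret']
```

The point isn't the code, it's the asymmetry: for a researcher this is a one-minute job, while for a vendor with no security staff even reproducing the finding is out of reach.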
Vuln disclosers believe that such problems wouldn't exist "if only people took security seriously". That's not the issue. I could hold a gun to your head and tell you to solve a third order differential equation. No matter how seriously you took my threat, you still would not have the ability to solve it. Sure it'd be straightforward for a mathematician to solve, but essentially impossible for everyone else. Vulns are straightforward for us, but still impossibly difficult for most vendors.
The Attrition blogpost linked above focused on the idea of "responsible disclosure". Vuln researchers can be extremely rude while cloaking themselves in "responsibility". For example, you often see vuln reports where the researcher claims that the vendor ignored their vuln. That's not precisely true. What really happens is that researchers act in passive-aggressive ways to make sure that the disclosure process goes awry. Sure, the immediate response of any vendor is to deny a problem (especially small vendors that have never dealt with a vuln before). Polite researchers try to overcome this hurdle in a positive fashion; passive-aggressive researchers exploit it to show why the vendor is evil or stupid. The Attrition article is a good example of the passive-aggressive attitude: when disclosure goes awry, it's always the vendor's fault and never the researcher's.
While disclosure is always "rude", it is never "irresponsible". Take Internet worms, for example. Regardless of how responsibly the underlying vulns were disclosed, they resulted in devastating Internet worms. The result was that vendors learned a lesson and stopped shipping product with such a large attack surface. While it's impossible to justify the value of disclosing a single vuln that caused a billion dollars in damage, overall, the disclosure of such bugs clearly improved cybersecurity.
It's like how there is no such thing as "irresponsible speech". It's hard to say why particular speech is valuable, such as burning a Koran, but overall, any laws that restrict "irresponsible" speech end up restricting good speech. That's why laws banning "insulting" end up banning legitimate criticism of political leaders.
Therefore, we should not delude ourselves that we are helping the vendor (or their customers) with a disclosed vuln: we usually aren't. We should not justify our actions by calling it "responsible disclosure". We should understand the inherent rudeness of our actions and behave in a humble manner. Yet, while the vendor will hate us, we should still disclose such vulns, because unfettered security research serves the greater good.
UPDATE by David Maynor (5:08pm EST): The more people involved, the more likely it'll go awry. Vulnerability Disclosure is like sex: it should be between 2 people, 3 max. Any more participants will result in a big mess of hurt feelings, bruised egos, NDAs and a weird feeling in the pit of your stomach anytime it is mentioned.
> We've learned to ship product and configure networks so that they are less susceptible to worms.
Saying that 10 months after the big Conficker bang is ridiculous. Sure, it could be worse, but a six-figure number of bots doesn't show we've learned much.
Even if nobody in the "real" world cares about vulns, the black hats do. And if you truly don't care about your data, you can just as well stop making backups and not fix vulns.
And the reason why the small company isn't able to fix it is the way stuff like this gets handled by a vendor (no simple upgrade/migration plans, no simple update mechanisms, weak customer communication etc.).
The comparison with the hairstyle does not fit at all (nor do many of the other metaphors). This is not a style problem. To me, ignoring vulnerabilities is a kind of negligence (no matter whether you're a vendor or a user).
How about this metaphor: it's like driving without brake lights. You can drive many miles, but someday somebody will hit you from behind...
I respectfully disagree. There is such a thing as irresponsible free speech, and "irresponsible disclosure" has been the stated official disclosure policy of the CAU for a number of years now:
Do you not also consider the Vendors to be rude by introducing vulnerabilities into their customers' systems and networks? In fact, that may go beyond being rude, to at best being negligent. One could even argue that the Vendor is being overtly malicious if they (and/or the public) are aware of the vulnerabilities but are ignoring them or denying that they even exist while still shipping them within the Vendor's product.
I completely disagree.
If researchers don't post anything so vendors can fix ... don't you think that there would be more exploitable vulns used by bad guys?
... seriously ... openness is the way to go ...
As I concluded: "Yet, while the vendor will hate us, we should still disclose such vulns, because unfettered security research serves the greater good."
Openness is the only way to go. But don't delude yourself into thinking that the vendor (or its customers) will like you for doing it.
It's like how nobody thanks the doctor for telling them the bad news. When the doctor tells the soldier "Sorry son, we are going to have to amputate the leg", the soldier doesn't thank the doctor, but more likely will curse the doctor and ask for a second opinion. The same is true of vuln disclosure: the vendors you report bugs to will always respond with hostility.
"I could hold a gun to your head and tell you to solve a third order differential equation"
Pfft, man I eat those in my sleep.
No just kidding. I'd probably crap myself if I saw one of those during a Differential Equation final.
Nice article. I agree with you if you approach a company, vendor, or programmer that is using the product for themselves, and if the weakness does not affect others (e.g. the possibility of spamming, cross-site scripting, etc.).
But I have to disagree with you if we are talking about companies which are selling a commercial product. In this case, as a current or future customer, I demand a minimum level of security (which will prevent harm to me and my users). And whether my disclosure is understood as "rude" is only a secondary concern for me.