The story so far
tl;dr: hackers drop 0day on medical device company hoping to profit by shorting their stock
St Jude Medical (STJ) is one of the largest providers of pacemakers (a.k.a. cardiac devices) in the country, with around $2.5 billion in pacemaker revenue, about half their business. They provide "smart" pacemakers with an on-board computer that talks via radio waves to a nearby monitor that records the functioning of the device (and health data). That monitor, "Merlin@Home", then talks back up to St Jude (via phone lines, 3G cell phone, or wifi). Pretty much all pacemakers work that way (my father's does, although his is from a different vendor).

MedSec is a bunch of cybersecurity researchers (white-hat hackers) who have been investigating medical devices. In theory, their primary business is selling their services to medical device companies, helping them secure their devices. Their CEO is Justine Bone, a long-time white-hat hacker. Despite Muddy Waters garbling the research, there's no reason to doubt that there's quality research underlying all this.
Muddy Waters is an investment company known for investigating companies, finding problems like accounting fraud, and profiting by shorting the stock of misbehaving companies.
Apparently, MedSec did a survey of many pacemaker manufacturers, chose the one with the most cybersecurity problems, and went to Muddy Waters with their findings, asking for a share of the profits Muddy Waters would make from shorting the stock.
Muddy Waters published their findings in [1] above. St Jude published their response in [2] above. Both are highly dishonest. I point that out because people want to discuss the ethics of using 0day to short stock, when we should instead be talking about the ethics of lying.
"Why you should sell the stock" [finance issues]
In this section, I briefly summarize Muddy Waters' argument for why St Jude's stock will drop. I'm not an expert in this area (though I do a fair amount of investing), but their arguments seem flimsy to me.
Muddy Waters' argument is that these pacemakers are half of St Jude's business, that fixing them will first require recalling them all, and that it will then take another two years to fix the problems, during which time St Jude can't sell pacemakers. Much of the Muddy Waters paper is taken up explaining this, citing similar medical cases, and so on.
If true, and if the cybersecurity claims hold up, then yes, this would be a good reason to short the stock. However, I suspect the financial claims aren't true -- Muddy Waters is simply trying to scare people about long-term consequences in order to profit in the short term.
@selenakyle on Twitter suggests this interesting document [4] about market solutions to vulnerability disclosure, if you are interested in that angle of things.
Update from @lippard: Abbott Labs agreed in April to buy St Jude at $85 a share (when St Jude's stock was $60/share). Presumably, for this Muddy Waters attack on St Jude's stock price to profit from anything more than a really short-term stock drop (like dumping their short position today), Muddy Waters would have to believe this effort will cause Abbott Labs to walk away from the deal. Normally, there are penalties for doing so, but material things like massive vulnerabilities in a product should allow Abbott Labs to walk away without penalties.
The 0day being dropped
Well, they didn't actually drop 0day as such, just claims that 0day exists -- that it's been "demonstrated". Reading through their document a few times, I've created a list of the 0day they found, at the granularity one would expect from CVE numbers (CVE is a group within the Department of Homeland Security that assigns standard reference numbers to discovered vulnerabilities).

The first two, which can kill somebody, are the salient ones. The others are more normal cybersecurity issues, and may be of concern because they can leak HIPAA-protected info.
CVE-2016-xxxx: Pacemaker can be crashed, leading to death
Within a reasonable distance (under 50 feet) and over several hours, an attacker pounding the pacemaker with malformed packets (from either an SDR or a hacked version of the Merlin@Home monitor) can crash it. Sometimes such crashes will brick the device; other times they put it into a state that may kill the patient by zapping the heart too quickly.
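To make "pounding with malformed packets" concrete, this is the kind of dumb mutation-fuzzing loop that finds crashes like this. A minimal sketch: the seed frame and `send_rf()` are hypothetical stand-ins for a captured legitimate frame and an SDR transmit call, not anything from MedSec's actual research.

```python
import random

# Hypothetical seed: a captured, well-formed frame from the protocol.
SEED = bytes.fromhex("a5a50001020304050607")

def mutate(frame: bytes) -> bytes:
    """Flip a few random bytes to produce a malformed variant."""
    buf = bytearray(frame)
    for _ in range(random.randint(1, 4)):
        buf[random.randrange(len(buf))] = random.randrange(256)
    return bytes(buf)

def send_rf(frame: bytes) -> None:
    """Stand-in for an SDR transmit call (e.g., via GNU Radio)."""
    print(frame.hex())

# Fuzzers run loops like this for hours; most frames get ignored,
# until one hits a parsing bug and crashes the target.
for _ in range(1000):
    send_rf(mutate(SEED))
```

That's why the attack takes hours: it's a numbers game, not a single magic packet.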
CVE-2016-xxxx: Pacemaker power can be drained, leading to death
Within a reasonable distance (under 50 feet) and over several days, an attacker can slowly drain the pacemaker's battery at a rate of 3% per hour. While the user will receive a warning from their Merlin@Home monitoring device that the battery is getting low, it's possible the battery will be fully depleted before they can get to a doctor for a replacement. A non-functioning pacemaker may lead to death.
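A back-of-the-envelope check on that rate (my arithmetic, using the report's 3%/hour figure):

$$ t = \frac{100\%}{3\%/\text{hour}} \approx 33\ \text{hours,} $$

so continuous transmission could deplete a full battery in under a day and a half; intermittent proximity stretches that out to the "several days" described.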
CVE-2016-xxxx: Pacemaker uses unauthenticated/unencrypted RF protocol
The above two attacks are possible because there is neither encryption nor authentication in the wireless protocol, allowing any evildoer to access the pacemaker or the monitoring device.
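For contrast, here is a minimal Python sketch of what per-frame authentication could look like. The key handling, counter, and frame layout are my invention for illustration -- nothing here is St Jude's actual protocol:

```python
import hashlib
import hmac
import os
import struct
from typing import Optional

# Hypothetical per-device secret, provisioned at manufacture.
KEY = os.urandom(16)

def make_frame(counter: int, payload: bytes) -> bytes:
    """Attach a monotonic counter (anti-replay) and an HMAC tag."""
    header = struct.pack(">I", counter)
    tag = hmac.new(KEY, header + payload, hashlib.sha256).digest()[:8]
    return header + payload + tag

def verify_frame(frame: bytes, last_counter: int) -> Optional[bytes]:
    """Return the payload only if the frame is fresh and authentic."""
    header, payload, tag = frame[:4], frame[4:-8], frame[-8:]
    counter = struct.unpack(">I", header)[0]
    if counter <= last_counter:
        return None  # replayed or stale frame
    expected = hmac.new(KEY, header + payload, hashlib.sha256).digest()[:8]
    if not hmac.compare_digest(tag, expected):
        return None  # forged or corrupted frame
    return payload

# A device doing this cheaply drops unauthenticated frames, which
# blunts both the crash attack and casual packet injection.
frame = make_frame(1, b"telemetry")
assert verify_frame(frame, last_counter=0) == b"telemetry"
```

None of this is exotic cryptography; it's a few dozen lines on top of a standard hash function.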
CVE-2016-xxxx: Merlin@Home contained hard-coded credentials and SSH keys
The password to connect to the St Jude network is the same for all devices, and thus easily reverse engineered.
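Why "easily reverse engineered"? A shared credential only has to be extracted once, from any one unit, and the usual first step is simply scanning the firmware image for printable strings. A sketch of that step (the filename and search hints are mine, purely illustrative):

```python
import re

def strings(path: str, min_len: int = 6):
    """Yield printable-ASCII runs from a binary, like the Unix `strings` tool."""
    with open(path, "rb") as f:
        data = f.read()
    for match in re.finditer(rb"[ -~]{%d,}" % min_len, data):
        yield match.group().decode("ascii")

# "merlin_fw.bin" is a hypothetical firmware dump from any one monitor.
for s in strings("merlin_fw.bin"):
    if any(hint in s.lower() for hint in ("pass", "pwd", "ssh", "key")):
        print(s)  # candidate hard-coded credentials, identical on every device
```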
CVE-2016-xxxx: local proximity wand not required
It's unclear in the report, but it seems that most other products require a wand in local proximity (inches) in order to enable communication with the pacemaker. This seems like a requirement -- otherwise, even with authentication, remote RF traffic could still drain the battery of the device in the person's chest.
So these are, as far as I can tell, the explicit bugs they outline. Unfortunately, none are described in detail. I don't see enough detail for any of these to actually be assigned a CVE number. I'm being generous here, describing them as such and giving them the benefit of the doubt; there's enough weasel language in the report to make me doubt all of them. Though, if the first two prove not to be reproducible, there will be a great defamation case, so I presume those two are true.
The movie/TV plot scenarios
So if you wanted to use this as a realistic TV/movie plot, here are two of them.
#1 You (an executive of the acquiring company) are meeting with the CEO and executives of a smaller company you want to buy. It's a family concern, and the CEO really doesn't want to sell, but you know his/her children do. Therefore, during the meeting, you pull out your notebook and an SDR device and put them on the conference room table. You start running the exploit to crash the CEO's pacemaker. It crashes; the CEO grabs his/her chest and gets carted off to the hospital. The children continue negotiations, selling off the company.
#2 You are a hacker in Russia going after a target. After many phishing attempts, you finally break into the target's home desktop computer. From there, you branch out and connect to the Merlin@Home device using the hard-coded password. You then run an exploit from the device, using its own radio, to slowly drain the battery from the pacemaker, day after day, while the target sleeps. You patch the software so it no longer warns the user that the battery is getting low. The battery dies, and a few days later, while digging a ditch, the victim falls over dead from heart failure.
The Muddy Waters document is crap
There are many ethical issues, but the first is the dishonesty and spin of the Muddy Waters research report.

The report is clearly designed to scare other investors, dropping St Jude's stock price in the short term so that Muddy Waters can profit. It's not designed to withstand long-term scrutiny. It's full of misleading details and outright lies.
For example, it keeps stressing how shockingly bad the security vulnerabilities are, such as saying:
We find STJ Cardiac Devices’ vulnerabilities orders of magnitude more worrying than the medical device hacks that have been publicly discussed in the past.

This is factually untrue. St Jude's problems are no worse than the 2013 issue, where doctors disabled the RF capabilities of Dick Cheney's pacemaker in response to disclosures. They are no worse than that insulin pump hack. Bad cybersecurity is the norm for medical devices. St Jude may be among the worst, but not by an order of magnitude.
The term "orders of magnitude" is math, by the way, and means "at least 100 times worse". As an expert, I claim these problems are not even one order of magnitude (10 times worse). I challenge MedSec's experts to stand behind the claim that these vulnerabilities are at least 100 times worse than other public medical device hacks.
In many places, the language is wishy-washy. Consider this quote:
Despite having no background in cybersecurity, Muddy Waters has been able to replicate in-house key exploits that help to enable these attacks

The semantic content of this is nil. It says they weren't able to replicate the attacks themselves. They don't have sufficient background in cybersecurity to understand what they replicated.
Such language is pervasive throughout the document, things that aren't technically lies, but which aren't true, either.
Also pervasive throughout the document, interjected for no reason in the middle of the text, are statements like this, stressing why you should sell the stock:
Regardless, we have little doubt that STJ is about to enter a period of protracted litigation over these products. Should these trials reach verdicts, we expect the courts will hold that STJ has been grossly negligent in its product design. (We estimate awards could total $6.4 billion.)

I point this out because Muddy Waters obviously doesn't feel the content of the document stands on its own, letting you reach this conclusion yourself. It instead feels the need to repeat the message over and over on every page.
Muddy Waters' violation of Kerckhoffs' Principle
One of the most important principles of cybersecurity is Kerckhoffs' Principle: more openness is better. Or, phrased another way, trying to achieve security through obscurity is bad.

The Muddy Waters document attempts to violate this principle. Besides the individual vulnerabilities, it claims that St Jude's cybersecurity is inherently bad because it's open: it uses off-the-shelf chips, standard software (like Linux), and standard protocols. St Jude does nothing to hide or obfuscate these things.
Everyone in cybersecurity would agree this is good. Muddy Waters claims this is bad.
For example, some of their quotes:
One competitor went as far as developing a highly proprietary embedded OS, which is quite costly and rarely seen
In contrast, the other manufacturers have proprietary RF chips developed specifically for their protocols

Again, I challenge MedSec, as the cybersecurity experts in this case, to publicly defend these Muddy Waters claims.
Medical device manufacturers should do the opposite of what Muddy Waters claims. I'll explain why.
Either your system is secure or it isn't. If it's secure, then making the details public won't hurt you. If it's insecure, then making the details obscure won't help you: hackers are far more adept at reverse engineering than you can possibly understand. Making things obscure, though, does stop helpful hackers (i.e. cybersecurity consultants you hire) from making your system secure, since it's hard figuring out the details.
Said another way: your adversaries (such as me) hate seeing open systems that are obviously secure. We love seeing obscure systems, because we know you couldn't possibly have validated their security.
The point is this: Muddy Waters is trying to profit from the public's misconception about cybersecurity, namely that obscurity is good. The actual principle is that obscurity is bad.
St Jude's response was no better
In response to the Muddy Waters document, St Jude published this document [2]. It's equally full of lies -- the sort that may deserve a shareholder lawsuit. (I see lawsuits galore over this.) It says the following:

We have examined the allegations made by Capital and MedSec on August 25, 2016 regarding the safety and security of our pacemakers and defibrillators, and while we would have preferred the opportunity to review a detailed account of the information, based on available information, we conclude that the report is false and misleading.

If that's true, if they can prove this in court, then that will mean they could win millions in a defamation lawsuit against Muddy Waters, and millions more for stock manipulation.
But it's almost certainly not true. Without authentication/encryption, the fact that hackers can crash/drain a pacemaker is pretty obvious, especially since (as claimed by Muddy Waters) they've successfully done it. Specifically, the picture on page 17 of the 34-page Muddy Waters document is a smoking gun of a pacemaker misbehaving.
The rest of their document contains weasel-word denials that may be technically true, but which have no meaning.
St. Jude Medical stands behind the security and safety of our devices as confirmed by independent third parties and supported through our regulatory submissions.
Our software has been evaluated and assessed by several independent organizations and researchers including Deloitte and Optiv.
In 2015, we successfully completed an upgrade to the ISO 27001:2013 certification.

These are all myths of the cybersecurity industry. Conformance with security standards, such as ISO 27001:2013, has absolutely zero bearing on whether you are secure. Having some consultants/white-hats claim your product is secure doesn't mean other white-hat hackers won't find an insecurity.
Indeed, having been assessed by Deloitte is a good indicator that something is wrong. It's not that they are incompetent (they've got some smart people working for them); it's how the security market works: you demand that such auditors find reasons to believe your product is secure, not that they keep hunting until they find something insecure. That's why outsiders, like MedSec, are better: they strive to find out why your product is insecure. The bigger the enemy, the more resources they'll put into finding a problem.
It's like after you get a haircut: your enemies and your friends will have different opinions on your new look. Enemies are more honest.
The most obvious lie from the St Jude response is the following:
The report claimed that the battery could be depleted at a 50-foot range. This is not possible since once the device is implanted into a patient, wireless communication has an approximate 7-foot range. This brings into question the entire testing methodology that has been used as the basis for the Muddy Waters Capital and MedSec report.

That's not how wireless works. With directional antennas and amplifiers, 7 feet easily becomes 50 feet or more. Even without them, something designed for reliable operation at 7 feet often works, less reliably, at 50 feet. There's no cutoff at 7 feet, inside of which it works and outside of which it doesn't.
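The arithmetic is basic RF engineering (my numbers, not from either report). Received power falls with the square of distance, so stretching 7 feet to 50 feet costs

$$ \Delta L = 20\log_{10}\!\left(\frac{50}{7}\right) \approx 17\ \text{dB,} $$

a gap easily closed by a modest directional antenna plus an amplifier on the attacker's side.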
That St Jude deliberately lies here brings into question their entire rebuttal. (see what I did there?)
ETHICS ETHICS ETHICS
First, let's discuss the ethics of lying, using weasel words, and being deliberately misleading. Both St Jude and Muddy Waters do this, and it's ethically wrong. I point this out for the readers who are uninterested in it and want to get to that other ethical issue. Clear violations of ethics that we all agree on interest nobody -- but they ought to. We should be lambasting Muddy Waters for their clear ethical violations, not the unclear one.

So let's get to the ethical issue everyone wants to discuss:
Is it ethical to profit from shorting stock while dropping 0day?

Let's discuss some of the issues.
There's no insider trading. Some people wonder if there are insider trading issues. There aren't. While it's true that Muddy Waters knew some secrets that nobody else knew, as long as they weren't insider secrets, it's not insider trading. In other words, only insiders know about a key customer contract recently won or lost. But vulnerabilities researched by outsiders are still outside the company.
Watching a CEO walk into the building of a competitor is still outsider knowledge -- you can trade on the likely merger, even though insider employees cannot.
Dropping 0day might kill/harm people. That may be true, but it's never, by itself, an ethical reason not to drop it, because this one event doesn't exist in isolation. If companies knew ethical researchers would never drop an 0day, then they'd never patch it. It's like the government's warrantless surveillance of American citizens: the courts won't let us challenge it because we can't prove it exists, and we can't prove it exists because the courts allow it to be kept secret, because revealing the surveillance would harm national intelligence. That some harm may happen shouldn't stop the right thing from happening.
In other words, in the long run, dropping this 0day doesn't necessarily harm people -- and thus profiting on it is not an ethical issue. We need incentives to find vulns. This moves the debate from an ethical one to more of a factual debate about the long-term/short-term risk from vuln disclosure.
As MedSec points out, St Jude has already proven itself an untrustworthy consumer of vulnerability disclosures. When that happens, dropping 0day becomes ethically permissible "responsible disclosure". Indeed, that St Jude then lied about it in their response justifies, ex post facto, the dropping of the 0day.
No 0day was actually dropped here. In this case, what was dropped was the claim that 0day exists. This may be good or bad, depending on your arguments. It's good that the vendor will have some extra time to fix the problems before hackers can start exploiting them. It's bad because we can't properly evaluate the true impact of the 0day without more detail -- allowing Muddy Waters to exaggerate and mislead people in order to move the stock more than is warranted.
In other words, the lack of actual 0day here is the problem -- actual 0day would've been better.
This 0day is not necessarily harmful. Okay, it is harmful, but it requires close proximity. It's not as if the hacker can reach out from across the world and kill everyone (barring my movie-plot section above). If you are within 50 feet of somebody, it's easier to shoot, stab, or poison them.
Shorting on bad news is common. Before we address whether this is unethical for cybersecurity researchers, we should first address the ethics of anybody doing this. Muddy Waters already does it: investigating companies for fraudulent accounting practices, then shorting the stock while revealing the fraud.
Yes, it's bad that Muddy Waters profits on the misfortunes of others, but it's the others who are committing fraud -- who deserve it. [Snide capitalism trigger warning] To claim this is unethical means you are a typical socialist who believes the State should defend companies, even those doing illegal things, in order to stop illegitimate/windfall profits. Supporting the ethics of this means you are a capitalist, who believes companies should succeed or fail on their own merits -- which means bad companies need to fail, and investors in those companies should lose money.
Yes, this is bad for cybersec research. There is constant tension between cybersecurity researchers doing "responsible" (sic) research and companies lobbying congress to pass laws against it. We saw this recently when Detroit lobbied for DMCA (copyright) rules to bar security research, and the DMCA regulators gave us an exemption anyway. MedSec's action means all medical device manufacturers will now lobby congress for rules to stop MedSec -- and the rest of us security researchers. The lack of public research means medical devices will continue to be flawed, which is worse for everyone.
Personally, I don't care about this argument. How others might respond badly to my actions is not an ethical constraint on my actions. It's like speech: that others may be triggered into lobbying for anti-speech laws is still not a constraint on what ethics allow me to say.
There were no lies or betrayal in the research. For me, "ethics" is usually a problem of lying, cheating, theft, and betrayal. As long as these things don't happen, then it's ethically okay. If MedSec had been hired by St Jude, had promised to keep things private, and then later disclosed them, then we'd have an ethical problem. Or consider this: frequently clients ask me to lie or omit things in pentest reports. It's an ethical quagmire. The quick answer, by the way, is "can you make that request in writing?". The long answer is "no". It's ethically permissible to omit minor things or do minor rewording, but not when it impinges on my credibility.
A life is worth about $10 million. Most people agree that "you can't put a value on a human life", and that those who do are evil. The opposite is true. Should we spend more on airplane safety, breast cancer research, or the military budget to fight ISIS? Each can be measured in the number of lives saved. Should we spend more on breast cancer research, which affects people in their 30s, or on solving heart disease, which affects people in their 70s? All these decisions mean putting a value on human life, and sometimes putting different values on different human lives. Whether or not you think it's ethical, it's the way the world works.
Thus, we can measure this disclosure of 0day in terms of potential value of life lost, vs. potential value of life saved.
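As a sketch of that trade-off (the framing is mine, not Muddy Waters'):

$$ \text{net value of disclosure} \approx V\cdot\big(\mathbb{E}[\text{lives saved}] - \mathbb{E}[\text{lives lost}]\big),\qquad V \approx \$10\ \text{million.} $$

If forcing the vendor to fix the bugs saves more expected lives than the disclosure window costs, disclosure comes out ahead.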
Is this market manipulation? This is more of a legal question than an ethical one, but people are discussing it. If the data is true, then it's not "manipulation" -- only if it's false. As documented in this post, there's good reason to doubt the complete truth of what Muddy Waters claims. I suspect it'll cost Muddy Waters more in legal fees in the long run than they could possibly hope to gain in the short run. I recommend investment companies stick to areas of their own expertise (accounting fraud) instead of branching out into things like cyber where they really don't grasp things.
This is again bad for security research. Frankly, we aren't a trusted community, because we claim the "sky is falling" too often, and are proven wrong. If this proves to be market manipulation -- as the stock recovers back to its former level and the scary stories of mass product recalls fail to emerge -- we'll be blamed yet again for being wrong. That hurts our credibility.
On the other hand, if any of the scary things Muddy Waters claims actually come to pass, then maybe people will start heeding our warnings.
Ethics conclusion: I'm a die-hard troll, so of course I'm going to vigorously defend the idea of shorting stock while dropping 0day. (Most of you appear to think it's unethical -- I therefore must disagree with you.) But I'm also a capitalist. This case creates an incentive to drop harmful 0days -- but it creates an even greater incentive for device manufacturers not to have 0days to begin with. Thus, despite being a dishonest troll, I do sincerely support the ethics of this.
Conclusion
The two 0days are about crashing the device (killing the patient sooner) or draining the battery (killing them later). Both attacks require hours (if not days) in close proximity to the target. If you can get into the local network (such as through phishing), you might be able to hack the Merlin@Home monitor, which is in close proximity to the target for hours every night.

Muddy Waters thinks the security problems are severe enough to destroy St Jude's $2.5 billion pacemaker business. The argument is flimsy. St Jude's retort is equally flimsy.
My prediction: a year from now we'll see little change in St Jude's pacemaker business earnings, though there may be some one-time costs cleaning things up. This will stop the shenanigans of future 0day+shorting, even when it's valid, because nobody will believe researchers.
3 comments:
Compromising the entire SJM network through their password fail - that's several orders of magnitude more severe than one device.
Seems like a stretch to target individuals that might or might not have an ICD. I don't see this as a high payoff. Better to lock up a hospital's system by finding a vulnerable device hooked up to their network. It just seems too random. I suppose you could target a high-value individual. But at some point, you have to give them a heads up so that you can get paid. Still, I'd need to know that they have a device and what kind it is. If it's murder, there have to be easier ways. I wonder what the trade-off is for plain text between the device and the 'router'.
Thanks for your article. You may be interested in this code of ethics for the private sector presented on Aug 26 at HITB GSEC Singapore: Vulnerabilities and the Surrounding Ethical Questions: A Code of Ethics for the Private Sector http://gsec.hitb.org/sg2016/sessions/biz-commsec-vulnerabilities-and-ethics-a-code-of-ethics-for-the-private-sector/