Backdoors are common, but rarely malicious
Backdoors are a common problem in software. About 20% of home routers have a backdoor in them, and 50% of industrial control computers have a backdoor. These backdoors usually aren't malicious, but a byproduct of software complexity. Systems need to be debugged before being shipped to customers, so the software contains debug features. Often, programmers forget to disable these debug backdoors before shipping. This problem is notoriously bad for embedded operating systems (VxWorks, QNX, WinCE, etc.).
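The pattern is easy to sketch. Here's a hypothetical example (the credential and function names are invented for illustration) of how a debug path added for factory testing turns into a backdoor if nobody removes it before shipping:

```python
# Hypothetical sketch of how a forgotten debug feature becomes a backdoor.
# The "debug" path was added for factory testing and never removed.

DEBUG_USER = "factory"          # hardcoded test credential (the mistake)
DEBUG_PASS = "xmit.7&test"

def authenticate(user: str, password: str, user_db: dict) -> bool:
    """Return True if login succeeds."""
    # Leftover debug path: bypasses the real user database entirely.
    if user == DEBUG_USER and password == DEBUG_PASS:
        return True
    return user_db.get(user) == password

# A real customer account works as intended...
assert authenticate("alice", "s3cret", {"alice": "s3cret"})
# ...but so does the forgotten factory credential, on every shipped unit.
assert authenticate("factory", "xmit.7&test", {})
```

Anyone who discovers the hardcoded credential, by reverse engineering the firmware or by fuzzing, gets the same access the factory had.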
Chips have reached the software level of complexity. It is rare for any designer to build a chip from scratch. Instead, designers construct a chip from building-blocks. One of the most common building-blocks is the debugger, known as JTAG. This is a standard way of soldering some wires to the chip and connecting them to a USB port, allowing common tools to debug your custom chip.
Whereas software companies can (and should) disable the debug feature in the version they ship to customers, that's not so easy with chips. Every change to a chip design costs millions of dollars. Therefore, chips always ship with the JTAG interface enabled. What chip designers attempt to do instead is simply not connect pins to it. Or, if they connect the pins on the chip, they don't route them to pins on the circuit board.
This has led to a popular hacking activity of taking a device, finding the JTAG pins, and hooking them up. A lot of devices have been hacked this way – although it requires that the hacker have physical control over the device.
One way to protect against this is by putting a key into the JTAG hardware that only the manufacturer knows, to disable some of the more dangerous JTAG commands. That's what appears to have happened here. Whether you call this a security feature to prevent others from hacking the chip through JTAG, or a secret backdoor available only to the manufacturer, is open to interpretation.
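That key-gating scheme can be modeled in a few lines. The following sketch (all opcodes and the key value are invented; real JTAG instruction sets differ per chip) shows how ordinary debug commands stay open while dangerous ones are refused unless the manufacturer's key has been presented:

```python
# Hypothetical model of a key-gated JTAG interface: ordinary debug
# instructions always work, but "dangerous" ones (like reading back the
# configuration) are refused unless the manufacturer's key was presented.

SAFE_OPS = {"IDCODE", "BYPASS"}
LOCKED_OPS = {"READBACK", "ERASE"}
FACTORY_KEY = 0xDEADBEEF  # invented value for illustration

class JtagPort:
    def __init__(self):
        self.unlocked = False

    def present_key(self, key: int) -> None:
        if key == FACTORY_KEY:
            self.unlocked = True

    def execute(self, op: str) -> str:
        if op in SAFE_OPS:
            return "ok"
        if op in LOCKED_OPS and self.unlocked:
            return "ok"
        return "refused"

port = JtagPort()
assert port.execute("IDCODE") == "ok"        # normal debugging works
assert port.execute("READBACK") == "refused" # gated command is hidden
port.present_key(0xDEADBEEF)                 # whoever knows the key...
assert port.execute("READBACK") == "ok"      # ...has a "backdoor"
```

Whether you call the locked commands a security feature or a backdoor depends entirely on who holds the key.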
Security of FPGAs
The chip in question (Microsemi/Actel ProASIC3) is a typical FPGA – a chip with a blank array of gates that can be programmed to emulate almost any other kind of chip. As real silicon chips become more expensive to manufacture, FPGAs are becoming a more popular alternative. Every change to a chip design requires millions of dollars in changes to the masks that print gates onto a chip. FPGAs, or field-programmable gate arrays, can be reprogrammed at no additional cost.
Most FPGAs are put in "smart" devices that also contain a processor (often ARM), memory, and drive (often flash). These smart devices run an embedded operating system, often Linux. The gate-array exists as a file on the drive. The file is read from the drive and written to the FPGA every time the power is turned on.
The obvious concern here is protecting intellectual property. Competitors can easily get their hands on that file, then upload it to their own FPGAs, thus cloning the product.
Therefore, to protect intellectual property, this file can be encrypted. The FPGA can be configured with an AES 128-bit encryption key, known only to the manufacturer of the device. That makes the file useless to anybody else. Nobody can decrypt the contents to find the secrets, and competitors can't load it onto their own FPGAs without the key.
While intended to protect intellectual-property, this technique will protect any other secrets. For example, you may use the FPGA as an SSL accelerator in your servers, where the FPGA executes the RSA encryption algorithm, with the private-key stored as part of the gate-array. This technique stops hackers from stealing the private-key should they be able to break into the server.
This encryption also serves as an integrity check, as it prevents hackers from changing the gate-array to do something malicious.
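The integrity property can be sketched with Python's standard library. Note this is an illustration only: the real chip uses AES-128, and HMAC-SHA256 here is a stdlib stand-in; the key and file contents are invented.

```python
import hashlib
import hmac

# Sketch of the integrity property: the device and the manufacturer share
# a secret key, so only someone holding the key can produce a bitstream
# file the device will accept. (The real chip uses AES-128; HMAC-SHA256
# is used here as a stdlib stand-in for illustration.)

DEVICE_KEY = b"secret-key-known-only-to-vendor"

def seal(bitstream: bytes, key: bytes) -> bytes:
    """Append an authentication tag to the bitstream file."""
    return bitstream + hmac.new(key, bitstream, hashlib.sha256).digest()

def load(sealed: bytes, key: bytes) -> bytes:
    """What the FPGA's loader would do at power-on: verify, then accept."""
    bitstream, tag = sealed[:-32], sealed[-32:]
    expected = hmac.new(key, bitstream, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("bitstream rejected: tampered or wrong key")
    return bitstream

design = b"\x01\x02gate-array-configuration\x03"
image = seal(design, DEVICE_KEY)
assert load(image, DEVICE_KEY) == design          # genuine file loads
tampered = image[:-1] + bytes([image[-1] ^ 1])
try:
    load(tampered, DEVICE_KEY)                    # modified file is refused
    raise AssertionError("tampered file should be rejected")
except ValueError:
    pass
```

Without the key, an attacker can neither read the design nor substitute a malicious one the device will accept.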
Obviously, a JTAG backdoor subverts all this. It not only allows the original manufacturer to steal intellectual-property, but any other secrets you tried to protect with the original AES key.
How this bug was found
This bug was found by fuzzing the JTAG port looking for undocumented functionality. While there are parts of this process unique to hardware (such as differential power analysis), the technique is ultimately little different than the fuzzing used to find software bugs.
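The core fuzzing loop is simple: walk the entire instruction space and flag anything that responds but isn't in the datasheet. The sketch below simulates a device (the opcodes, including the hidden one, are invented for illustration):

```python
# Sketch of the fuzzing idea: walk the whole JTAG instruction space and
# flag any opcode that responds but isn't in the datasheet. The simulated
# device below is invented for illustration.

DOCUMENTED = {0x01: "IDCODE", 0x1F: "BYPASS", 0x05: "SAMPLE"}

def simulated_device(opcode: int):
    """Stand-in for shifting an instruction into a real chip's IR."""
    if opcode in DOCUMENTED:
        return "ack"
    if opcode == 0x2A:          # the hidden, undocumented function
        return "ack"
    return None                 # no response

def fuzz_instruction_register(ir_bits: int = 6):
    """Try every possible opcode; return the undocumented responders."""
    undocumented = []
    for opcode in range(2 ** ir_bits):
        if simulated_device(opcode) is not None and opcode not in DOCUMENTED:
            undocumented.append(opcode)
    return undocumented

assert fuzz_instruction_register() == [0x2A]  # one suspicious opcode found
```

On real hardware, "responds" is subtler than a return value, which is where techniques like differential power analysis come in: the chip's power draw reveals that an instruction did something even when no data comes back.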
Fuzzing has found backdoors in software before, but nobody claimed it was the work of the evil Chinese. We should keep this perspective.
This is not a "military" chip
Much has been made about this being a "military" chip, but that's not true -- at least, it's not what you think.
The military uses a lot of commercial, off-the-shelf products. That doesn't mean there is anything special about them. A million soldiers use laptops to browse Facebook and exchange emails with their loved ones. That doesn't mean these laptops are anything special or different from any other laptops. They are the same Dell, Apple, and HP laptops that everyone else uses.
Sometimes the laptops are different, but that's because they are built to endure harsh environments (heat, radiation, humidity, vibration, and dust). Actel makes a "military" version of this chip, but pretty much the only difference is that it's rated to operate at higher temperatures. None of their chips, including the "military" ones, are certified by the government to hold secrets. Most of their sales are for their non-military versions, and even most of their military versions aren't for military use, but for customers (like oil rigs or airplanes) that have the same environmental concerns.
That's not to say there isn't a problem here. Consider something like the drones shot down by Iran. By their very nature, drones are designed from many non-secret, off-the-shelf components (you might find an iPhone buried somewhere inside). The reason is that they are designed to be cheap, to be frequently lost while flying over the enemy. Thus, it's likely that one of these FPGAs was inside the drone shot down by Iran. While it's unlikely the FPGA had any secrets worthwhile, issues like this make it easier for Iran to reverse engineer the drone and manufacture their own.
So what does this mean?
It's hard to say. We'll know more when the vendor (Microsemi/Actel) issues a response.
It could just be part of the original JTAG building-block. Actel didn't design their own, but instead purchased the JTAG design and placed it on their chips. They may not be aware of precisely all the functionality in that JTAG block, or how it might interact with the rest of the system.
But I'm betting that Microsemi/Actel know about the functionality, but thought of it as a debug feature, rather than a backdoor.
It's remotely possible that the Chinese manufacturer added the functionality, but highly improbable. It's prohibitively difficult to change a chip design to add functionality of this complexity. On the other hand, it's easy for a manufacturer to flip bits. Consider that the functionality is part of the design, but that Actel intended to disable it by flipping a bit that turns it off. A manufacturer could easily flip that bit and turn it back on again. In other words, it's extraordinarily difficult to add complex new functionality, but they may get lucky and be able to make small tweaks to accomplish their goals.
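The "flip one bit" scenario can be made concrete. In the sketch below (bit positions and names are invented), the risky functionality already exists in the silicon, gated by a single configuration bit; re-enabling it at manufacturing time means changing one bit, not redesigning the chip:

```python
# Sketch of the "flip one bit" scenario: the risky functionality is
# already in the silicon, gated by a single configuration bit. Adding
# the feature from scratch would mean redesigning the chip; re-enabling
# it means changing one bit. (Bit positions here are invented.)

DEBUG_ENABLE_BIT = 1 << 3   # hypothetical position of the debug gate

def feature_active(config_word: int) -> bool:
    return bool(config_word & DEBUG_ENABLE_BIT)

factory_intended = 0b0000_0000          # designer ships with debug off
assert not feature_active(factory_intended)

tampered = factory_intended | DEBUG_ENABLE_BIT   # one flipped bit
assert feature_active(tampered)
```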
In the software world, security flaws that hackers use generally result from researchers doing the unexpected. In this case, researchers found a new way of analyzing chips, and therefore, found new unexpected results. This is to be expected. We shouldn't be surprised by this backdoor, but we should insist on fixing it. And researchers will now probably hunt for similar JTAG backdoors in other chips.
We'll know more when Microsemi/Actel responds. In the meantime, it's important to note that while the researchers did indeed discover a backdoor, they offer only speculation, but no evidence, as to the source of the backdoor. As somebody with a lot of experience with this sort of thing in software cybersecurity, I doubt there is anything malicious behind it. Also note that the issue is "intellectual property protection" in FPGAs; the "military security" angle is really distant. The Chinese might subvert FPGAs so that they could later steal intellectual-property written to the chips, but the idea they went through all this to attack the US military is pretty fanciful.
Update: the researchers respond
In this article, the researchers respond to this post. It's a bit humorous, because they simultaneously say that the issues their research exposes are "[Trustworthiness] of chip developers who are subcontracted by military but mainly outsource their designs and chip fabrication to China and India" and "we have no idea why people have linked the Chinese to this as it did not come from us". The link to the Chinese came directly from them. Likewise, they deliberately play on people's misconceptions about the military. The truth is that the military cares about operating at high temperatures, and that in most applications, couldn't care less if the intellectual property was stolen, or if the chip was backdoored.
Update: By the way, I've been accused of putting a backdoor in products the military uses in one high-profile incident (the accusation being nonsense, of course). I guess that makes me an expert in "backdooring the military" of some sort.

Update: Over at YCombinator, somebody points out that changes aren't quite as expensive as I thought, because instead of changing the entire mask set, you can change only a single metal layer in order to enable/disable things.

Update: In the comments below, Olin Sebert makes a strong argument that while the backdoor may be accidental, Actel's explicit marketing of the device as having no readback capability is evil.

Update: Many have pointed out that the current paper does not explicitly make the claim that the Chinese were involved. True, but the authors do their best to hype that danger. Their first reference is to a taxonomy of trojans a Chinese manufacturer might insert into chips, and the page at Cambridge's website announcing the paper draws that conclusion. Moreover, the paper describes the chip as "military grade", but it is in fact only "consumer grade". All the press generated by the paper took the Chinese angle, and it's the paper's authors who are responsible for that.
Rob: Excellent stuff, as always. The paper itself is very good, but from my perspective it isn't terribly newsworthy beyond the hardware community, so the geopolitical stuff is just BS to leverage the story to a bigger audience.
where did you find a copy of their paper that details their work finding backdoors?
The particular paper is here:
The original scam / dissertation is available at
Spot-on: of course it wasn't the Chinese who did this originally, and they certainly didn't do it in the manufacturing stage. While it is not unreasonable to consider the possibility that an agent in place arranged for this debugging capability to remain even when it should have been removed, there's no evidence.
The real disappointment is that Actel made strong and explicit claims about the impossibility of reading out the FPGA configuration, and we now know those claims to have been completely false. What does that tell us about Actel's credibility? Should that make us worry similarly about other manufacturers? What should we do to assess such claims in the future?
Many false claims of security are to some extent accidental--we designed a mechanism (e.g., WEP) and thought it was secure, but we were too ignorant to do a good job. Here, they implemented a mechanism that did clearly and precisely the opposite of what they claimed--and, if the paper is to be believed, did it for years and in many different products. Maybe just miscommunication between marketing and manufacturing, but it should never have happened, and its apparently pervasive nature is deeply troubling.
JTAGs notwithstanding, the really scary scenario is someone finding a way to add a backdoor into a chip's microcode. This might require an inside man on the chip design team to inject the code, but if they do, all bets are off.
I like this article because, first, you obviously know hardware and most people don't.
People don't understand that hardware is becoming software. And software has to be easy to commoditize, customize, and, yes, debug.
This is progress. The more flexible it becomes, the better it is, the cheaper it is.
When you consider the ease with which this hardware can now be modified versus hardcoded chips you have to cheer.
The people that will take advantage of it are really not important. If people want to, they will fly planes into buildings. I'm not worried about Iranian geeks with a USB debugger. I'm just not.
I find it rather disconcerting how easily you dismiss the possibilities here. Granted, JTAG exists by design and, if not properly disabled in final products, does provide exposure, and such may very well be the case here. But to dismiss the possibility that hardware or firmware could be or has been modified for malicious intent, and may remain yet undiscovered, is irresponsible. Personally, I believe it is in the interest of US national security that the silicon foundries return home.
"Many false claims of security are to some extent accidental--we designed a mechanism (e.g., WEP) and thought it was secure, but we were too ignorant to do a good job." -- Olin: Oh, the good old humility vs. hubris. If you loudly proclaim that something can't be done, and you're so good hurr durr, you usually end up getting your ass handed over to you on a fancy plate. With a "get well soon" card to boot.
"In the meantime, it's important to note that while the researchers did indeed discover a backdoor, they offer only speculation, but no evidence, as to the source of the backdoor."
Actually, the paper is quite clear that the backdoor was added by the manufacturer (Section 4):
"We discovered that in fact Actel did implement such an access, with a special key used for activation."
At no point in the paper does it claim that China was involved in inserting this backdoor (see the author's page).
The author of the original paper has determined that this backdoor was put there on purpose. The only questions are who put it there and why. Your entire post is based around your bias that this was an accident which you state right up front. But that is clearly not the case here. It might be a debug feature or something else, but it wasn't an accident.
David makes very good points that the blogger is clearly glossing over. It appears there are different camps when it comes to security. Those such as this person and Bruce Schneier downplay everything. You have to wonder what the motivation could be. While I certainly agree that it would be an expensive and daunting endeavor, what a coup for a nation state to "pwn" an adversary at the most basic level. This is clearly not beyond the realm of possibility, and anyone who thinks otherwise is living in a fantasy world. If the Chinese are number one at hacking and stealing from the US, why not take it to an indefensible level? Wake up and face reality, Robert Graham.
You are all naive. Of course a chip designed for special cases will be used in equipment targeting the military. Don't be naive enough to think this was a "debugger" left on. What track record does this adversary have to give them the benefit of the doubt?
Robert, you are so naive. You are not living together with Russians and Chinese in one country and cannot imagine how they think about every outsider. You are like our government: nothing happens, it's all right, be quiet.
Actually, I think being able "steal the intellectual property" is a meaningful threat to defense systems. When your drone crashes in hostile territory, you'd like to be confident that it will be difficult for the adversary to extract your classified algorithms for target recognition, jamming resistance, etc. (not to mention crypto keys).
Of course, there are lots of ways to protect against that (ranging from fail-safe thermite ignition to just hoping it won't happen), and you need to make cost-effectiveness trade-offs in choosing what to protect and how strongly. However, if you relied on a manufacturer's representation of a security mechanism's capability and it turns out to be false, your security calculus changes, and you might be legitimately irritated about that.
All that said, I still see no reason to believe that this particular mechanism was deliberately installed as an act of espionage. Why not? Because it was so unsophisticated. It's hidden, sure, but once it's out, it's out, and it's obvious that it's a backdoor. I would expect any adversary with the resources to do this for espionage purposes would do a better job of concealing it and making it look like an accident. I absolutely believe those adversaries are out there, and working diligently to compromise U.S. systems--I just believe they'd do a better job.
I believe you did a disservice to your reputation with this post. You wrote as if you were an authority, based your claims on no knowledge at all (because you hadn't read the leaked paper), and now your claims are slowly breaking down.
You have already admitted that your authoritative sounding claims about the complexity of modifying the chip by the vendor are overblown and it is in fact possible.
Now you have halfheartedly admitted as well that there is an "evil" feature in the chip. You weasel around the full extent of this: the fact is that this backdoor allows the reading of values which should not be possible to read back, AND the vendor claimed that such readback is not even physically implemented.
Please stop downplaying this finding. Here is a backdoor (and apparently a stupid one, with a default key which is claimed to be the same in all chips) to circumvent elementary security features.
Also, I disagree with your downplaying of the backdoor. Of course it is malicious. Whether it was intended to be malicious is irrelevant. It is, as most backdoors are, because they are usually security holes.
On the good side, apparently the backdoor also allows changing the backdoor key, so there is hopefully a way to mitigate this hole.
However, the achievement of this work and paper is still high. They devised a new kind of side-channel attack, apparently much more reliable and rewarding than current technology. Using this technology, an attacker would still be able to read out a changed backdoor key if he can get his hands on a chip. And most often this would be a security catastrophe on its own. Imagine a crashed fighter jet falling into the hands of Iran. They might have only the one chip, but if they can determine the backdoor key, they apparently can get access to all content in the chip. They could likely achieve this already using existing technology, but this attack is so much easier.
While the researchers might have made some inappropriate remarks and might have let these be blown up into some China conspiracy, this is still no reason to discount the research.
I work a lot with said FPGA.
I have indeed integrated 15 of these A3P250 and they are working smoothly in an airport (fear :-P)
This blog gets something wrong: the FPGA configuration is not "a file" in the system; the A3P holds the configuration directly in Flash cells at each gate. It's all well explained in ALL the datasheets.
So the system builder "flashes" the part before or after soldering the part, and the FPGA boots instantly. The A3P are not like Xilinx or Altera's chips.
Furthermore, there is a 2-key system that was designed to provide flexible IP delivery and protection. The 2nd key is not well publicised but if you dig in certain documents, it is explained. It's not a backdoor.
Oh and leave the Chinese alone... Let them build our stuff cheap, and let companies make stupid errors.
Mmmmm, I should look at what they say on comp.arch.fpga; there are specialists there.
Thanks for your comments, Anonymous person above this.
I wanted to discuss FPGAs in general rather than deal with the one-time-programming features of this particular model. I suppose since the readback capability is the important part here, I should've covered that as well.
But what you say is interesting, that the second key is an integral, well-documented part of the design, if you can just find the right documentation. I'll try to hunt down that document.
I read the authors' webpage and it doesn't say anything about the Chinese having inserted the backdoor. Nor does it suggest that anywhere on the university site. There is a paper about hardware assurance on the same page where the papers are available for download, but it responds to concerns raised by the military themselves about how they are worried about chips being compromised. The BBC carried the same story, as did Wired magazine and DARPA themselves. So why have you linked that paper with the published one when they are not connected?
There is a reference on the webpage to the US military being worried about China; that's common knowledge, it's all over the internet. I don't see how making a comment on something the US military have said means the authors are hyping the danger. The danger is hyped by the military and press.
I've read the paper, and in it it's quite plainly said that the authors believe it was inserted by the manufacturer. No one in their right mind thinks this is just a "debug" issue as you keep claiming, or that it's for some engineering purpose. It's totally obvious it's designed to get around the security, and that's not needed for a debug port or during fab.
It is Military Gradeeeeeeee
Why do people have difficulty understanding this thing? I have seen the same comment elsewhere. Even though the architecture is mostly the same (or should be, in our case), it is designed for wider temperature ranges, probably offers hermetic packaging, and definitely has a lot of strict quality control to pass through. It also probably uses radiation-hardening-by-design techniques at the library level...
What most people don't realize is that many, many ICs come with reconfigurable options known as eFuses. Many SoCs, such as those in cell phones, often have a wide range of debug features enabled in the actual hardware, but before going into production the eFuses for these specific features are "blown" so that they no longer operate. Some examples of these are JTAG, the debug UART, alternate booting options such as via USB, and hardware observability signals. Back in 2010, Motorola came into the headlines for having specifically "blown" some eFuses on the Droid X phone to prevent a wide range of upgrades. Although we don't have all the details of the device in question, most likely this was a simple production error where the eFuses for these features were simply not "blown" before shipping.
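The eFuse pattern this comment describes can be sketched in a few lines (feature names are invented; real SoCs expose fuses through vendor-specific registers):

```python
# Sketch of the eFuse pattern: debug features exist in the silicon, and
# a production step "blows" one-time fuses to disable them. Fuses can
# only go from 0 to 1, never back. (Feature names are invented.)

FUSES = {"jtag": 0, "debug_uart": 0, "usb_boot": 0}  # 1 = blown/disabled

def blow(fuses: dict, name: str) -> None:
    """Irreversible in real hardware: the fuse physically burns out."""
    fuses[name] = 1

def feature_enabled(fuses: dict, name: str) -> bool:
    return fuses[name] == 0

blow(FUSES, "jtag")            # the production step that gets forgotten
blow(FUSES, "usb_boot")
assert not feature_enabled(FUSES, "jtag")
assert feature_enabled(FUSES, "debug_uart")  # unblown fuse = open backdoor
```

If the production line skips a `blow()` step, every shipped unit keeps that debug feature live, which matches the "simple production error" theory.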
I think it's fair to publish the story that way. When Microsemi says that the chip can be used by the military, then it should be safe. A chip that holds a key and has a backdoor isn't safe.
From the military I expect that they use no special foreign chips at all. Either they create and know the chip, or they should use a common chip, as they would use a common and known crypto algorithm.
A no-readout promise from a foreign chip manufacturer is security through obscurity.
I think it was placed there purposely and not for debugging.
There was a book around 2000 titled "SPYDER WEB" that dealt with this - a fictional book about a spy device attached to Chinese-made Cray supercomputers.
A MUST read - fiction or not, it shows the potential of what "could" happen.