Backdoors are common, but rarely malicious
Backdoors are a common problem in software. Roughly 20% of home routers have a backdoor in them, as do perhaps 50% of industrial control computers. The cause of these backdoors usually isn't malice but a byproduct of software complexity: systems need to be debugged before being shipped to customers, so the software contains debug features, and programmers often forget to disable those debug backdoors before shipping. This problem is notoriously bad across embedded operating systems (VxWorks, QNX, WinCE, etc.).
Chips have reached the software level of complexity. It is rare that any designer builds a chip from scratch; instead, designers assemble a chip from building blocks. One of the most common building blocks is the debugger interface, known as JTAG. It is a standard way of soldering a few wires to the chip and connecting them to a USB adapter, allowing off-the-shelf tools to debug your custom chip.
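The mechanics can be sketched in a few lines. Below is a toy model of how a debug tool shifts bits through a chip's JTAG data register to read its ID code. Everything here is invented for illustration: the device is simulated in software, and the `0x4BA00477` IDCODE is just a plausible-looking 32-bit value; real tools drive the TCK/TMS/TDI pins through a USB adapter, but the shift idea is the same.

```python
IDCODE = 0x4BA00477  # hypothetical 32-bit ID a chip might report

class SimulatedTAP:
    """Stands in for the chip's Test Access Port: shifting the data
    register out returns the IDCODE one bit at a time, LSB first."""
    def __init__(self, idcode):
        self.dr = idcode

    def shift_dr_bit(self, tdi):
        tdo = self.dr & 1                      # bit presented on the TDO pin
        self.dr = (self.dr >> 1) | (tdi << 31) # shift in the TDI bit at the top
        return tdo

def read_idcode(tap, nbits=32):
    """Clock out nbits from the data register and reassemble them."""
    value = 0
    for i in range(nbits):
        value |= tap.shift_dr_bit(0) << i      # collect TDO bits, LSB first
    return value

print(hex(read_idcode(SimulatedTAP(IDCODE))))  # -> 0x4ba00477
```

The point is only that JTAG is a dumb serial shift protocol: anyone with physical access to the pins can clock commands in and data out.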
Whereas software companies can (and should) disable the debug feature in the version they ship to customers, that's not so easy with chips: every change to a chip design costs millions of dollars. Therefore, chips ship with the JTAG interface enabled. What chip designers do instead is simply not connect the pins to it, or, if they do connect the pins, not route them out to the circuit board.
This has led to a popular hacking activity of taking a device, finding the JTAG pins, and hooking them up. A lot of devices have been hacked this way – although it requires that the hacker have physical control over the device.
One way to protect against this is by putting a key into the JTAG hardware that only the manufacturer knows, to disable some of the more dangerous JTAG commands. That's what appears to have happened here. Whether you call this a security feature to prevent others from hacking the chip through JTAG, or a secret backdoor available only to the manufacturer, is open to interpretation.
Security of FPGAs
The chip in question (Microsemi/Actel ProASIC3) is a typical FPGA, or field-programmable gate array – a chip with a blank array of gates that can be programmed to emulate almost any other kind of chip. As real silicon chips become more expensive to manufacture, FPGAs are becoming a more popular alternative: every change to a chip design requires millions of dollars in changes to the masks that print gates onto silicon, while an FPGA can be reprogrammed at no additional cost.
Most FPGAs are put in "smart" devices that also contain a processor (often ARM), memory, and storage (often flash). These smart devices run an embedded operating system, often Linux. The gate-array configuration exists as a file in that storage; the file is read and written to the FPGA every time the power is turned on.
The obvious concern here is protecting intellectual-property. Competitors can easily get their hands on that file, then upload to their own FPGAs, thus cloning the product.
Therefore, to protect intellectual property, this file can be encrypted. The FPGA can be configured with a 128-bit AES key, known only to the manufacturer of the device. That makes the file useless to anybody else: nobody can decrypt the contents to find the secrets, and competitors can't load it onto their own FPGAs without the key.
While intended to protect intellectual-property, this technique will protect any other secrets. For example, you may use the FPGA as an SSL accelerator in your servers, where the FPGA executes the RSA encryption algorithm, with the private-key stored as part of the gate-array. This technique stops hackers from stealing the private-key should they be able to break into the server.
This encryption also serves as an integrity check, as it prevents hackers from changing the gate-array to do something malicious.
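To make the confidentiality and integrity properties concrete, here is a toy sketch of the load-time check a keyed FPGA effectively performs. This is not the actual ProASIC3 scheme: a SHA-256 counter keystream plus an HMAC tag stands in for the hardware AES engine, and the key and bitstream values are invented. Without the key, the file is unreadable; with a tampered file, loading fails.

```python
import hashlib
import hmac
import os

def keystream(key, nonce, length):
    """Toy keystream: hash(key || nonce || counter) blocks, truncated."""
    out = b""
    ctr = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:length]

def protect(key, bitstream):
    """Encrypt the bitstream and append an integrity tag over it."""
    nonce = os.urandom(16)
    ct = bytes(a ^ b for a, b in zip(bitstream,
                                     keystream(key, nonce, len(bitstream))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def load(key, blob):
    """What the chip does at power-on: verify, then decrypt."""
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("bitstream rejected: wrong key or tampered")
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

key = b"device-secret-16"          # hypothetical per-device key
bs = b"gate-array configuration"   # stand-in for the real bitstream
assert load(key, protect(key, bs)) == bs
```

A JTAG command that reads the decrypted configuration back out of the chip bypasses this entire construction, which is why the backdoor matters.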
Obviously, a JTAG backdoor subverts all this. It allows whoever holds the backdoor key – presumably the chip maker – to extract not only the intellectual property, but also any other secrets you tried to protect with the AES key.
How this bug was found
This bug was found by fuzzing the JTAG port looking for undocumented functionality. While there are parts of this process unique to hardware (such as differential power analysis), the technique is ultimately little different than the fuzzing used to find software bugs.
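In outline, the software side of that fuzzing is the familiar loop: try every possible input and record which undocumented ones respond. A minimal sketch follows; the device, its opcode set, and the 4-bit instruction register are all hypothetical, standing in for sweeping a real chip's JTAG instruction space.

```python
DOCUMENTED = {0x0, 0x1, 0xF}  # opcodes listed in the (hypothetical) datasheet

def simulated_device(opcode):
    """Stands in for the chip: responds to documented opcodes,
    plus one hidden debug opcode the datasheet never mentions."""
    if opcode in DOCUMENTED:
        return b"ok"
    if opcode == 0xA:              # the undocumented "backdoor" command
        return b"debug-unlock"
    return None                    # chip stays silent

def fuzz_instruction_register(bits=4):
    """Sweep every opcode and flag undocumented ones that respond."""
    hits = []
    for opcode in range(2 ** bits):
        resp = simulated_device(opcode)
        if resp is not None and opcode not in DOCUMENTED:
            hits.append((opcode, resp))
    return hits

print(fuzz_instruction_register())  # -> [(10, b'debug-unlock')]
```

On real silicon the "did it respond?" signal is subtler – hence techniques like differential power analysis to detect when an opcode does something – but the search strategy is the same.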
Fuzzing has found backdoors in software before, but nobody claimed it was the work of the evil Chinese. We should keep this perspective.
This is not a "military" chip
Much has been made about this being a "military" chip, but that's not true -- at least, it's not what you think.
The military uses a lot of commercial, off-the-shelf products. That doesn't mean there is anything special about those products. A million soldiers use laptops to browse Facebook and exchange emails with their loved ones; it doesn't mean those laptops are anything special or different from any other laptops. They are the same Dell, Apple, and HP laptops that everyone else uses.
Sometimes the laptops are different, but that's because they are built to endure harsh environments (heat, radiation, humidity, vibration, and dust). Actel makes a "military" version of this chip, but pretty much the only difference is that it's rated to operate at higher temperatures. None of their chips, including the "military" ones, are certified by the government to hold secrets. Most of their sales are for the non-military versions, and even most of the military versions aren't bought for military use, but by customers (like oil-rig and aircraft builders) that have the same environmental concerns.
That's not to say there isn't a problem here. Consider something like the drone shot down by Iran. By their very nature, drones are designed from many non-secret, off-the-shelf components (you might find an iPhone buried somewhere inside), because they are built to be cheap and to be frequently lost while flying over the enemy. Thus, it's likely that one of these FPGAs was inside that drone. While it's unlikely the FPGA held any worthwhile secrets, issues like this make it easier for Iran to reverse engineer the drone and manufacture their own.
So what does this mean?
It's hard to say. We'll know more when the vendor (Microsemi/Actel) issues a response.
It could just be part of the original JTAG building block. Actel didn't design their own; instead, they purchased the JTAG design and placed it on their chips. They may not be aware of precisely all the functionality in that JTAG block, or how it might interact with the rest of the system.
But my bet is that Microsemi/Actel knew about the functionality and thought of it as a debug feature, rather than a backdoor.
It's remotely possible that the Chinese manufacturer added the functionality, but highly improbable: it's prohibitively difficult to change a chip design to add functionality of this complexity. On the other hand, it's easy for a manufacturer to flip bits. Suppose the functionality is part of the design, and Actel intended to disable it by flipping a bit that turns it off. A manufacturer could easily flip that bit back on. In other words, it's extraordinarily difficult to add complex new functionality, but an attacker might get lucky and be able to make small tweaks to accomplish their goals.
In the software world, security flaws that hackers use generally result from researchers doing the unexpected. In this case, researchers found a new way of analyzing chips and, therefore, found new, unexpected results. This is to be expected. We shouldn't be surprised by this backdoor, but we should insist on fixing it. And researchers will now probably hunt for similar JTAG backdoors in other chips.
We'll know more when Microsemi/Actel responds. In the meantime, it's important to note that while the researchers did indeed discover a backdoor, they offer only speculation, not evidence, as to its source. As somebody with a lot of experience with this sort of thing in software cybersecurity, I doubt there is anything malicious behind it. Also note that the real issue is intellectual-property protection in FPGAs; the "military security" angle is quite distant. The Chinese might subvert FPGAs so that they could later steal intellectual property written to the chips, but the idea that they went through all this to attack the US military is pretty fanciful.
Update: the researchers respond
In this article, the researchers respond to this post. It's a bit humorous, because they simultaneously say that the issues their research exposes are "[Trustworthiness] of chip developers who are subcontracted by military but mainly outsource their designs and chip fabrication to China and India" and "we have no idea why people have linked the Chinese to this as it did not come from us". The link to the Chinese came directly from them. Likewise, they play into people's misconceptions about the military. The truth is that the military cares about operating at high temperatures, and that in most applications, it couldn't care less whether the intellectual property was stolen or the chip was backdoored.
Update: By the way, I've been accused of putting a backdoor in products the military uses in one high-profile incident (the accusation being nonsense, of course). I guess that makes me an expert in "backdooring the military" of some sort.

Update: Over at YCombinator, somebody points out that changes aren't quite as expensive as I thought, because instead of changing the entire mask set, you can change only a single metal layer in order to enable/disable things.

Update: In the comments below, Olin Sebert makes a strong argument that while the backdoor may be accidental, Actel's explicit marketing of the device as having no readback capability is evil.

Update: Many have pointed out that the current paper does not explicitly claim that the Chinese were involved. True, but the authors do their best to hype that danger. Their first reference is to a taxonomy of trojans a Chinese manufacturer might insert into chips, and the page at Cambridge's website announcing the paper draws that conclusion. Moreover, the paper describes the chip as "military grade" when it is in fact only "consumer grade". All the press generated by the paper took the Chinese angle, and it's the paper's authors who are responsible for that.