Friday, March 11, 2016

Can the Apple code be misused? (Partly Retracted)

Dan Guido (@DGuido), who knows more about iOS than I do, wants me to retract this post. I'm willing to retract it based solely on his word, but he won't give me any details as to what specifically he objects to. I'm an expert in reverse-engineering and software development, but I admit there may be something specific to iOS (such as how it encrypts firmware) that I may not know.

This post will respond to the tweet by Orin Kerr:

The government is right that the software must be signed by Apple and made to only work on Farook's phone, but the situation is more complicated than that.

The basic flaw in this picture is jailbreaks. A jailbreak is a hack that gets around Apple's "signing" security layer. Jailbreaks are popular in the user community, especially in China, where people want to run software not approved by Apple. When the government says "intact security", it means "non-jailbroken".

Each new version of iOS requires the discovery of some new hack to enable jailbreaking. Hacking teams compete to see who can ship a new jailbreak to users, and other companies sell jailbreaks to intelligence agencies. Once jailbroken, the signing is bypassed, as is the second technique of locking the software specifically to Farook's phone.
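To make the two security layers concrete, here is a minimal sketch of signing plus device-locking, and what a jailbreak does to both. This is a hypothetical illustration, not Apple's actual implementation: real iOS images are signed with RSA against Apple's root certificate and personalized with the device's ECID; an HMAC and made-up device IDs stand in here so the sketch is self-contained and runnable.

```python
import hashlib
import hmac

# Assumption: a symmetric key stands in for Apple's signing key.
APPLE_SIGNING_KEY = b"apple-private-key"

def sign_firmware(image: bytes, device_id: str) -> bytes:
    """Sign an image 'personalized' to one device, as the court order requires."""
    return hmac.new(APPLE_SIGNING_KEY, image + device_id.encode(),
                    hashlib.sha256).digest()

def boot_check(image: bytes, signature: bytes, my_id: str) -> bool:
    """What the boot chain verifies before it will run an image."""
    expected = hmac.new(APPLE_SIGNING_KEY, image + my_id.encode(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

image = b"fbi-unlock-tool"
sig = sign_firmware(image, device_id="FAROOK-PHONE")

print(boot_check(image, sig, "FAROOK-PHONE"))  # True: runs on the target phone
print(boot_check(image, sig, "OTHER-PHONE"))   # False: rejected everywhere else

# A jailbreak exploits a bug so that boot_check() is never consulted,
# which is equivalent to replacing it with:
def jailbroken_boot_check(image, signature, my_id) -> bool:
    return True   # both the signature and the device binding are bypassed
```

The point of the sketch is that the device binding lives inside the same check as the signature, so a jailbreak that skips one skips both.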

The details are more complicated than this. The issue isn't that jailbreaks will allow this software to run. Instead, the issue is that jailbreakers can reverse-engineer this software to grab its secrets, and then use those secrets on other phones.

A more important flaw in this reasoning is the creation of the source code itself. This is the human-readable form of the code written by the Apple engineers. It will later be compiled into "binary code" and then signed. It's at the source code stage that Apple is most in danger of losing secrets.

Let's assume that Apple is infiltrated by spies from the NSA and the Chinese. Some secrets can still be kept, such as the signing keys for the software. Other secrets cannot be kept, such as source code. It's likely the NSA and/or Chinese have stolen Apple's source code multiple times. Indeed, most of the source is public anyway (the Darwin operating system, Webkit, etc.). It's not something Apple is too concerned about -- as long as the source doesn't get published.

When Apple writes this specific tool for the FBI, it'll be very hard to keep that source out of the hands of such spies. It's possible to keep it secret, but only through burdensome, heroic efforts on Apple's part that certainly weren't part of its initial estimate.

More important than the source code, though, are the ideas. Code is expressive speech that communicates ideas. Even when engineers forget the details of source code, they can still retain these ideas. Years later, they can recall those ideas and use them. I give a real example of this in my previous post on expressiveness of code. Apple cannot contain these ideas. The engineers in question, after building the code, can immediately quit Apple and go to work for Chinese jailbreak companies or American defense contractors for twice the salary. And it's completely legal.

It's like a failed Hollywood movie project. In the end, the studio decides not to move forward with the project and shuts it down. The employees then go off to different companies, taking those ideas with them and using them in unrelated movie projects. That's the story told in the award-winning documentary Jodorowsky's Dune, which ties that production to other, unrelated movies like Alien, Star Wars, and Terminator.

Orin goes on to ask:
It will likely take more than a few days to write the code. The FBI misrepresents the task as consisting of only a few lines of code, but Apple estimates a much larger project. To be fair, some of that estimate covers testing, packaging, and documentation unrelated to the amount of code written.

The task will likely require different skills from multiple engineers, rather than being the output of a single engineer. That's because it's possible no single engineer has all the necessary skills. However, all the engineers involved will still walk away with the entire picture, able to recreate the work on their own when working for the Chinese or Booz-Allen.

In the end, it's not a huge secret that Apple will be losing. For the most part, the "backdoor" already exists, the only question is how best to exploit it. It's likely something the jailbreak community can figure out for themselves. But at the same time, Apple does have a point that there is the fundamental burden that producing this software will slightly (though not catastrophically) weaken the security of their existing phones.


Alex Trebek said...

Please clarify. So your argument is either:
-leveraging jailbreaks (effectively 0-day vulnerabilities) is a problem
-because Apple would have to sign the hypothetical firmware/OS update (to bypass the 10-try maximum, the 5-second artificial delay, etc.) with their private key, the FBI (and whomever they share the code with) can still push said firmware/OS update to anyone else's phone
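[The protections listed above can be sketched like this. The constants are illustrative, not iOS's actual values; disabling the wipe and zeroing the delays is precisely what the FBI's order asks the hypothetical firmware to do.]

```python
import time

# Illustrative sketch -- not actual iOS code. Values are rough stand-ins.
MAX_TRIES = 10                                  # iOS can wipe keys after this
DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]   # seconds per attempt

def try_passcode(guess: str, actual: str, attempt: int) -> str:
    """One unlock attempt, with the escalating-delay and wipe protections."""
    if attempt >= MAX_TRIES:
        return "wiped"                          # encryption keys destroyed
    time.sleep(DELAYS[attempt])                 # artificial delay
    return "unlocked" if guess == actual else "locked"

print(try_passcode("0000", "1234", attempt=0))   # locked
print(try_passcode("1234", "1234", attempt=2))   # unlocked
print(try_passcode("1234", "1234", attempt=10))  # wiped
```

Without the delays and the wipe, all 10,000 four-digit passcodes can be tried electronically in minutes, which is why the order targets exactly these two features.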

The first doesn't appear to be a problem: even though 0-days are getting harder to find, the supply hasn't dried up and probably never will until we have fully verified code (formal verification is already a reality in the form of a formally verified kernel, a la seL4).

This whole shenanigan probably came about because the NSA's capabilities, however traditional to the IC across the planet (spies gonna spy), have the public scared. Would the NSA have Apple's firmware private key? Probably. However, using it would seriously tank an American company; ergo, a publicly known way would have to be created (a legal precedent) and leveraged so the IC/LE community can continue parallel construction and such.

If I'm wrong, then at least setting a legal precedent would take the international bite out of the argument that the American IC has created a panopticon

Neither for nor against, just an objective observation