Tuesday, February 16, 2016

Some notes on Apple decrypting the San Bernardino phone

Today, a judge ordered Apple to help the FBI decrypt the San Bernardino shooter's iPhone 5C. Specifically:
  1. disable the auto-erase that happens after 10 bad guesses
  2. enable submitting passcodes at a high speed electronically rather than forcing a human to type them one-by-one
  3. likely accomplish this through a software update that runs from RAM, rather than permanently updating the operating system
The text of the court order almost exactly matches that of the "iOS Security Guide". In other words, while it may look fairly technical, everything technical they're asking for is described in that one short document.

The problem the FBI is trying to solve is that guessing passcodes is slow. The user has two options. One option is that every bad guess makes the wait between guesses longer and longer, slowing down guessing until eventually an hour is forced between attempts. The other option is to have the phone erase itself after 10 bad guesses. Either way, it makes guessing the passcode impractical. The FBI is demanding that Apple update the software on the phone to prevent either of these things from happening. This software would run from RAM, rather than changing the operating-system software already stored on the device.
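For context on the first option, the delay schedule escalates after the fifth bad guess. The values below are my reading of Apple's iOS Security Guide of that era; treat this as an illustrative sketch, not an authoritative implementation:

```python
# Escalating delays after bad passcode guesses, per the iOS Security
# Guide of the era (illustrative sketch; exact values may differ).

DELAY_MINUTES = {5: 1, 6: 5, 7: 15, 8: 15, 9: 60}  # nth bad guess -> wait

def delay_after(failed_attempts):
    """Minutes iOS forces you to wait after this many failed guesses."""
    if failed_attempts < 5:
        return 0                                   # first four guesses are free
    return DELAY_MINUTES.get(failed_attempts, 60)  # an hour from the 9th on

for n in (4, 5, 6, 7, 9, 12):
    print(f"after guess {n}: wait {delay_after(n)} minute(s)")
```

With an hour between guesses, a 4-digit PIN's 10,000 possibilities would take over a year to exhaust, which is why the FBI wants both the delays and the auto-erase disabled.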

The phone is an iPhone 5C, first released in September 2013, so it's quite old. This increases the chance that Apple may indeed be able to hack the phone as the court order suggests, depending upon the software version. Unlike the 5S, the 5C doesn't have the hardware enclave, but I seem to remember it has something related.

On newer phones like the iPhone 6, with Apple's "Enclave", such an update of the firmware would be impossible. Updating the firmware to do what the FBI wants would also erase the crypto keys, or at least first require unlocking the phone. If such a trick would work on the newer phones, then Apple has been lying about them. [UPDATE: There seems to be some disagreement here. I remember something to this effect when Apple announced the iPhone 5C, but I can't find any reference to back up my claims. It may be that the current 5C has a similar vulnerability to a firmware update.]

On older phones, such as the iPhone 5C, there is no enclave, so it's plausible the FBI's strategy of updating iOS might work. But the problem remains of how to get the iOS update onto the phone, which may require a passcode.

The first hurdle is to get the iPhone to trust the computer doing the update, which can only be done with an unlocked phone. That means the FBI won't be able to get the phone to trust their own computers. However, the iPhone has probably been connected to a laptop or desktop owned by the terrorists, so such an update can happen from those computers.

The second hurdle is that the phone asks for a passcode during an update. I updated my old iPhone 5 to verify this. Right between the update steps, it asked for the passcode. I'm not sure what asked for it. Was it the older iOS version, preventing an update? Or was it the new iOS version, asking to verify the new update? In the first case, it's not something Apple can change, but in the second case, it's something Apple can fix to comply with the FBI's request.

I was using iTunes. Apparently, there are other tools out there (used by repair shops and factories) that are more efficient, and which may be able to bypass a security check.

Depending on the iOS version already on the phone, there may be other opportunities for the FBI. Back in 2014, there was some controversy about a developer feature that could be used to 'backdoor' the iPhone, assuming it had already been set to trust a computer.

Lastly, the older iOS 8 defaulted to 4-digit passcodes, with merely a long delay (but not erasure) between attempts. There's a good chance this is how the phone was configured, which means that an intern with the phone will eventually be able to decrypt it.

The upshot is this. It's an older phone. If the iOS version is old, and especially if it's been configured to "trust" a laptop/desktop, then there is a good chance Apple or the FBI could decrypt it. If the software is reasonably up-to-date, then, given my understanding of how iPhones work, it's impossible at the moment for Apple to decrypt the device, especially as suggested by the court order.

In any case, I assume that Apple will challenge the "All Writs Act" that the FBI is using to compel Apple to comply.


Q: Isn't helping the fight against terrorism the right thing to do? These terrorists killed a lot of innocent people!
A: Certainly. But the question isn't whether Apple should help in this particular case, but whether Apple can be compelled to help in all cases, even when the government is abusing its power. And by and large, the government is abusing the powers it demanded in order to fight "terrorism".

Q: Is it possible for Apple to do this?
A: If the phone were a 5S or later, then the answer is probably "no". Apple claims this, and techies agree. But the phone is a 5C. For that model and older, it may be possible. There are still hurdles, such as getting the phone to trust a firmware update without having the passcode.

Q: Does the law allow the FBI to do this? What law?
A: The "All Writs Act of 1789", 28 U.S.C. § 1651. This is highly controversial, with many claiming that this law is nowhere near enough to compel Apple to write new code.

Q: I heard it's a trick to force Apple to create a backdoor.
A: No, that's an invalid assertion. For one thing, the court order explicitly wants Apple to limit the special software to only this phone, so it wouldn't be something the FBI could use on other phones. Nor is the FBI asking for this feature to be placed on any customer-owned phone, but only this one phone in their possession.

Q: Doesn't the "enclave" feature stop this?
A: Not for the older iPhone 5C.

Q: Once Apple supplies the backdoored software, how long will it take them to crack the decryption?
A: It's 80 milliseconds per guess, which is a hardware limit. We can therefore do the math:
4 digit PIN - 13.3 minutes
6 digit PIN - 22.2 hours
6 letter password - 300+ years
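That arithmetic can be checked directly. A sketch, assuming only the 80 ms figure above: the post doesn't say what character set underlies the "6 letter" estimate, so the last line assumes the full 94-character printable-ASCII set, which is what pushes it past 300 years (lowercase-only letters would finish in under a year):

```python
# Worst-case brute-force times at 80 ms per guess (the hardware
# key-derivation delay cited above). Character-set sizes are assumptions.

GUESS_SECONDS = 0.080  # 80 ms per attempt

def worst_case_seconds(charset_size, length):
    """Seconds to try every possible passcode of the given length."""
    return (charset_size ** length) * GUESS_SECONDS

print(f"4-digit PIN: {worst_case_seconds(10, 4) / 60:.1f} minutes")   # 13.3
print(f"6-digit PIN: {worst_case_seconds(10, 6) / 3600:.1f} hours")   # 22.2
years = worst_case_seconds(94, 6) / (365.25 * 86400)
print(f"6-char password (94 printable chars): {years:.0f} years")     # 1749
```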

Q: Do we know if the 'erase' feature (after 10 failed guesses) was enabled?
A: It was a phone issued by the employer, which claims erase was enabled by default.

Q: What iOS version is the phone?
A: According to the FBI, it's iOS 9.

Q: Is the FBI asking Apple to create a custom operating system?
A: No. They are asking Apple to write code that would run from RAM, without changing anything on the drive.

Q: How is this different from the 70 other times the FBI has asked Apple to unlock a phone?
A: Because those times took a few minutes of effort on Apple's part. This is asking Apple to spend 2000 hours on creating a new technology that could potentially be used to unlock all older phones. In other words, it's conscripting a man-year's worth of labor.


Specifically, the court suggests that this be done with a firmware update, but with a unique ID specific to this particular phone, so that the FBI can't just then load that firmware on any phone. The otherwise awesome Mike Masnick suggests the court is ordering Apple to "create a backdoor" instead of just "decrypt". I disagree with that logic; it really is just about decrypting this one phone.


Aca said...

"it really is just about decrypting this one phone"
This case is just that, but the precedent has larger implications.

Austin Mabry said...

"it really is just about decrypting this one phone"
No, it really isn't. It's about setting precedents and gaining tools that will allow the government to get access to ANY device.

ANY device has technical security flaws. Forcing the manufacturer to go back and find and exploit those flaws, and then hand over a tool that automates that exploitation process is a serious security issue.

Even if the FBI pinky swears not to use it for any other device (yeah right, not until they decide they "Need" to), and to keep it out of the wrong hands...

Yeah, the same way Hillary kept her email server secure.

The same way Lois Lerner just accidentally deleted and physically destroyed every drive that ever touched her emails, and then destroyed the backups that they were required by law to keep.

The same way the OPM spoon-fed millions of people's identities to hackers.

The same way the Veterans Affairs Department gave away tens of thousands of veterans' identities, making veterans the single greatest targets for identity theft.

The same way the NSA prevented Snowden from exposing their activities.

The same way the DOD prevented Manning and Assange from leaking their secrets.

Seems legit...

analytica said...

If this is a business phone, why can't the administrator of the device from the perpetrator's employer use admin tools to access the device? Did they not use central administration that would allow them access to the device after it was configured?

Philip Ngai said...

Based on the DOJ's motion, Apple is merely being asked to disable the delay between passcode guesses and the erasure of data if there are too many bad guesses.

How did you decide that would take 2000 hours?

dramklukkel said...

How does a 6 letter password take 300+ years?
Only letters, case sensitive, no spaces... (52 char.) would take 50 years.
Same stuff, but with digits (62 char.) would take 144 years.
All characters, no spaces (94 char.) would take 1750 years.
I'd like to know how you worked the math.

Josh said...

I've used a limera1n-based exploit to recover data from a locked iPhone 4 before, and that's a well-enough-known procedure that there's commercial software doing the same by this point. Doesn't work on the 4s or any of the 5-gen hardware, though. That software can run even without using the conventional iPhone OS upgrade system, just Recovery mode and a USB cable, even to an untrusted machine.

Indeed, this sort of thing is intentional -- if you ever do a "DFU mode" recovery from iTunes, you're using a more specific variation on the same concept. Running a full firmware update to disk from iTunes will frag data, but a signed ramdisk will run fine.

ih8sn0w claims to have an iBoot exploit that would likely allow access from Recovery Mode in the same manner in A5 (4s), expandable to A6 (5c) and A7 (5s) chips, but he's not released it and some of the tweets have implied that it's really hard. There's also some question whether the Secure Enclave in A7- or later devices would erase the internal keys if fiddled with enough, though chances are pretty good Apple could get past that: there have been updates to the SE in the past.

Cesium said...

Why can't the FBI pull the flash out of the phone and then dump it to one of their servers? Then have 10,000 computers guess at the password and decrypt the data as it sits in server memory.