Monday, June 08, 2015

What's the state of iPhone PIN guessing

I think even some experts have gotten this wrong, so I want to ask everyone: what's the current state-of-the-art for trying to crack Apple PIN codes?

This is how I think it works currently (in iOS 8).

To start with, there is a special "crypto-chip" inside the iPhone that holds your secrets (like a TPM or ARM's TrustZone/SecurCore). I think originally it was ARM's TrustZone, but now that Apple designs its own chips, they've customized it (the "Secure Enclave"). I think they needed to add stuff to make Touch ID work.

All the data (on the internal flash drive) is encrypted with a random AES key that nobody, not even the NSA, can crack. This random AES key is stored on the crypto-chip. Thus, if your phone is stolen, the robbers cannot steal the data from it -- as long as your phone is locked properly.
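
Just to put numbers on "nobody can crack": here's a back-of-the-envelope calculation (my own illustration, assuming a 256-bit key and an absurdly generous guessing rate):

    # Brute-forcing a random 256-bit AES key directly, assuming an
    # absurdly generous trillion (10**12) guesses per second.
    keyspace = 2**256                     # possible AES-256 keys
    guesses_per_second = 10**12
    seconds_per_year = 60 * 60 * 24 * 365

    years = keyspace / guesses_per_second / seconds_per_year
    print(f"{years:.2e} years to try every key")    # ~3.7e+57 years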

To unlock your phone, you type in a 4 digit passcode. This passcode gets sent to the crypto-chip, which verifies the code, then gives you the AES key needed to decrypt the flash drive. This is all invisible, of course, but that's what's going on behind the scenes. Since the NSA can't crack the AES key on the flash drive, they must instead get it from the crypto-chip.

Thus, unlocking the phone means guessing your 4 digit PIN.

This seems easy. After all, it's only 4 digits. However, offline cracking is impossible. The only way to unlock the phone is to send guesses to the crypto-chip (a form of online cracking). This can be done over the USB port, so they (the NSA) don't need to sit there trying to type every possible combination -- they can simply write a little script to send commands over USB.
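
To make this concrete, the "little script" conceptually looks like the sketch below. The send_pin_guess() function is a hypothetical placeholder -- a real tool would have to speak whatever protocol the USB interface actually exposes:

    # Conceptual sketch of online PIN guessing over USB -- not a working tool.
    # send_pin_guess() is a hypothetical stand-in for the real USB protocol.
    def send_pin_guess(pin: str) -> bool:
        """Submit one guess to the phone; return True if it unlocked."""
        raise NotImplementedError("hypothetical hardware/USB interface")

    def brute_force_pins():
        for n in range(10000):              # every 4-digit PIN: 0000-9999
            pin = f"{n:04d}"
            if send_pin_guess(pin):
                return pin                  # unlocked
        return None                         # exhausted (or the phone wiped itself)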

To make this more difficult, the crypto-chip will slow things down. After 6 failed guesses, the iPhone temporarily disables itself for 1 minute. Thus, trying all 10,000 combinations at one guess per minute, it'll take the NSA about a week (6.9 days).
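
The arithmetic behind that figure, assuming the lockout throttles guessing to roughly one attempt per minute:

    # Roughly one guess per minute, 10,000 possible 4-digit PINs.
    total_pins = 10_000
    minutes_per_guess = 1
    total_minutes = total_pins * minutes_per_guess
    print(total_minutes / 60 / 24)    # ~6.94 days to try them all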

Better yet, you can configure your phone to erase itself after 10 failed attempts (the [Erase Data] option in the Passcode settings). This isn't the default configuration, but it's how any paranoid person (like myself) configures their phone. This is a hard lock, preventing even the NSA from ever decrypting the phone. It's the "going dark" problem that the FBI complains about. If they get the iPhone from a terrorist, drug dealer, or pedophile, they won't be able to decrypt it (well, beyond the 0.1% chance that one of their 10 guesses is right). (Note: I don't think it actually erases the flash drive, but simply erases the secret AES key -- which is essentially the same thing).
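
For the record, the 0.1% figure is just 10 allowed guesses against 10,000 possible PINs:

    # Chance of stumbling onto a random 4-digit PIN within the 10
    # attempts allowed before the phone erases its keys.
    attempts_before_wipe = 10
    total_pins = 10_000
    print(attempts_before_wipe / total_pins)    # 0.001, i.e. 0.1%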

Instead of guessing PIN numbers, there may be a way to reverse-engineer such secrets from the crypto-chip, such as by using acid to remove the top of the chip, then using an electron microscope to read out the secrets. (Physical possession of the phone is required.) One of the Snowden docs implies that the NSA can sometimes do this, but that it takes a month and a hundred thousand dollars, and has a 50% chance of permanently destroying the chip without extracting any secrets. In any event, that may have been possible with the older chips, but the latest iPhones now include custom chips designed by Apple where this may no longer be possible.

There may be a physical exploit that gets around this. Somebody announced a device that would guess a PIN, then immediately power down the phone before the failed guess could be recorded. That allows an unlimited number of guesses, requiring a reboot of the phone in between. Since the reboot takes about a minute, it means hooking up the phone to the special device and waiting a week. This worked in phones up to iOS 8.1, but presumably it's something Apple has since patched (some think 8.1.1 patched this).
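
Conceptually, the attack loop looks something like the sketch below. All three hardware-control functions are hypothetical placeholders for whatever the announced device actually does:

    # Conceptual sketch of the "cut power before the failed guess is
    # recorded" attack. The hardware-control functions are hypothetical.
    def power_on_and_boot():
        raise NotImplementedError        # power the phone up, wait ~1 minute for boot

    def send_pin_guess(pin: str) -> bool:
        raise NotImplementedError        # returns True if the phone unlocks

    def cut_power_immediately():
        raise NotImplementedError        # kill power before the failure counter is written

    def brute_force_with_power_cut():
        for n in range(10000):
            pin = f"{n:04d}"
            power_on_and_boot()           # ~1 minute per reboot => about a week overall
            if send_pin_guess(pin):
                return pin
            cut_power_immediately()       # the failed attempt never gets recorded
        return None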

There may be other exploits in software. In various versions of iOS, hackers have found ways of bypassing the lock screen. Generally, these exploits require the phone to still be powered on since it was stolen. (Coming out of sleep mode is different than being powered up, even though it looks like the same unlocking process to you.) However, whenever hackers disclose such techniques, Apple quickly patches them, so it's not a practical long-term strategy. On the other hand, once they seize the phone, the FBI/NSA may simply keep it powered on in storage for several years, hoping an exploit is released. The FBI is patient; they really don't care if it takes a few years to complete a case. The last such exploit was in iOS 7.0, and Apple is about to announce iOS 9. Apple is paranoid about such exploits, so I doubt that a new one will be found.

If the iPhone owner synced their phone with iTunes, then it's probable that the FBI/NSA can confiscate both the phone and the desktop in order to grab the data. They can then unlock the phone from the desktop, or they can simply grab the backup files from the desktop. If your desktop computer also uses disk encryption, you can prevent this. Some desktops use TPMs to protect the disk (requiring slow online cracking similar to cracking the iPhone PIN). Others would allow offline cracking of your password, but if you chose a sufficiently long password (mine is 23 characters), even the NSA can't crack it -- even at the rate of billions of guesses per second that would be possible with offline cracking.
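
Again, rough numbers (my assumptions, for illustration: 23 random characters drawn from a 62-character alphabet, an attacker doing 10 billion guesses per second):

    # Why a long random password survives even offline cracking.
    # Assumptions (mine): 23 characters from a 62-character alphabet,
    # attacker doing 10 billion guesses per second.
    alphabet = 62
    length = 23
    guesses_per_second = 10_000_000_000
    seconds_per_year = 60 * 60 * 24 * 365

    keyspace = alphabet ** length
    years = keyspace / guesses_per_second / seconds_per_year
    print(f"{years:.2e} years to exhaust the keyspace")   # ~5.3e+23 years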

The upshot is this. If you are a paranoid person and do things correctly (set iPhone to erase after 10 attempts, either don't sync with desktop or sync with proper full disk encryption), then when the FBI or NSA comes for you, they won't be able to decrypt your phone. You are safe to carry out all your evil cyber-terrorist plans.


I'm writing this up in general terms because I think this is how it works. Many of us general experts glance over the docs and make assumptions about how we think things should work, based on our knowledge of crypto, but we haven't paid attention to the details, especially as the state-of-the-art changes over the years. Sadly, asking general questions gets general answers from well-meaning, helpful people who really only know as much as I do. I'm hoping those who are up on the latest details, experts like Jonathan Zdziarski, will point out where I'm wrong.



Response: So Jonathan has written a post describing this in more detail here:  http://pastebin.com/SqQst8CV

He is more confident the NSA has 0days to get around everything. I think the point worth remembering is that nothing can be decrypted without 0days, and that whenever 0days become public, Apple patches them. Hence, you can't steal somebody's phone and take it to the local repair shop to get it read -- unless it's an old phone that hasn't been updated. It also means the FBI is unlikely to get the data -- at least without revealing that they've got an 0day.




Specifics: Here, in more detail, is what I think happens.

Unique-id (UID): When the CPU is manufactured, it's assigned a unique identifier. This is done with hardware fuses, some of which are blown to create 1s and 0s. Apple promises the following:

  • that UIDs are secret and can never be read from the chip, by anybody, for any reason
  • that all IDs are truly random (nobody can predict the random number generation)
  • that they (or suppliers) keep no record of them

This is the root of all security. If it fails, then the NSA can decrypt the phone.

Crypto-accelerator: The CPU has a built-in AES accelerator that's mostly separate from the main CPU. One reason it exists is to quickly (with low power consumption) decrypt/encrypt everything on the flash-drive. It's the only part of the CPU that can read the UID. It can therefore use the UID, plus the PIN/passcode, to encrypt/decrypt something.
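
To illustrate the idea of "tangling" the passcode with the UID -- this is not Apple's actual algorithm (the real derivation runs inside the AES engine), just my stand-in for the concept using PBKDF2:

    # Illustration only: the derived key depends on a secret (the UID)
    # that never leaves this one chip, so guesses must run on the device.
    import hashlib

    def derive_unlock_key(passcode: str, hardware_uid: bytes) -> bytes:
        return hashlib.pbkdf2_hmac(
            "sha256",
            passcode.encode(),      # what the user types
            hardware_uid,           # per-device secret fused into the CPU
            200_000,                # deliberately slow, to throttle guessing
        )

    # Example with a fake UID (illustration only):
    key = derive_unlock_key("1234", bytes.fromhex("00" * 32))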

Special flash: Either a reserved area of the flash-drive, or a wholly separate flash chip, is used to store the rest of the secrets. These are encrypted using the UID/PIN combo. Apple calls this "effaceable" storage. When it "wipes" the phone, this area is erased, but the rest of the flash drive isn't. Information like your fingerprint (for Touch ID) is stored here.

So the steps are:

  1. iOS boots
  2. phone asks for PIN/passcode
  3. iOS sends the PIN/passcode to the crypto-accelerator to decrypt the flash-drive key (read from the "effaceable" storage area)
  4. uses flash-drive key to decrypt all your data

I'm skipping details. This is just enough to answer certain questions.
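
Put as code (every function name below is my own placeholder, not a real iOS internal), the flow looks roughly like this:

    # The four steps above as a sketch. The stub functions are placeholders.
    def read_effaceable_storage() -> bytes: ...          # reserved "effaceable" flash area
    def crypto_accelerator_unwrap(blob, passcode): ...   # the only place the UID is usable
    def decrypt_flash_drive(key) -> bytes: ...

    def unlock_phone(passcode: str) -> bytes:
        wrapped_key = read_effaceable_storage()           # step 3: encrypted flash-drive key
        fs_key = crypto_accelerator_unwrap(wrapped_key, passcode)
        if fs_key is None:
            raise ValueError("wrong passcode")
        return decrypt_flash_drive(fs_key)                # step 4: your data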

FAQ: Where is the unique hardware ID stored? On the flash memory? The answer is within the CPU itself. Flash memory will contain further keys, for example to unlock all your data, but they have to be decrypted using the unique-id plus PIN/passcode.



4 comments:

Christopher Anderson said...

Having been locked out of my (jailbroken) 4S running iOS 6, I can offer the following – since it was jailbroken, there were countless claims and instructions on how to remove the pin-code. I believe these only applied to the iPhone 4. At one stage, I managed to regain root access (via ssh) to the phone, and still couldn't remove the pin-code.

The flip side is that you can (or could with iOS 6 anyway) plug a jailbroken phone into any computer's USB port and run freely available software to copy all the data.

Newer versions of iOS ask you to "Trust this Computer", which I assume negates this attack vector; however, if your phone is jailbroken you are definitely more vulnerable, especially if you run open-ssh, which will allow attackers an unlimited number of attempts at brute forcing your ssh password (which quite often people leave as default anyway).

And the people who use the levels of encryption you describe are usually the same people who are most likely to jailbreak their iPhone.

michee said...

see this : http://blog.mdsec.co.uk/2015/03/bruteforcing-ios-screenlock.html

Radha Manju said...

This was a good suggestion that you put up here. Hope that it benefits all the ones who land up here.
iPhone Service Center in Chennai

Unknown said...

As one gets infinite attempts using jailbreaks, it seems like the actual Secure Enclave doesn’t have any security features against brute forcing – it is solely in software.

This would mean that it should be easy to transplant the chip onto a different board and brute force it externally.