Monday, April 30, 2018

Proposed "Universal Secure Backdoor" for iPhones isn't secure

Ray Ozzie is one of the technical giants of the computer era.  He worked at Software Arts - the company behind VisiCalc - back in the early 1980s.  He followed this by creating Lotus Notes (IIRC, this is still being sold by IBM two decades later, which has to be some kind of record).  He was one of Microsoft's CTOs and took over as Chief Software Architect from Bill Gates himself.  He is responsible for Microsoft's Azure cloud initiative - if you haven't heard of this, it's a bet-the-company gamble on cloud computing rather than shrink-wrapped software.  It looks like it's going to save Microsoft's bacon.

That's quite a resume.  Ozzie has a new proposal out for a secure universal backdoor to allow Law Enforcement to unlock iPhones.  It's an excellent example of how great software engineers can be spectacular failures as security engineers.

The Geek With Guns has background on how it's supposed to work:
Dubbed “Clear,” Ozzie’s idea was first detailed Wednesday in an article published in Wired and described in general terms last month.
[…]
  1. Apple and other manufacturers would generate a cryptographic keypair and would install the public key on every device and keep the private key in the same type of ultra-secure storage vault it uses to safeguard code-signing keys.
  2. The public key on the phone would be used to encrypt the PIN users set to unlock their devices. This encrypted PIN would then be stored on the device.
  3. In cases where “exceptional access” is justified, law enforcement officials would first obtain a search warrant that would allow them to place a device they have physical access to into some sort of recovery mode. This mode would (a) display the encrypted PIN and (b) effectively brick the phone in a way that would permanently prevent it from being used further or from data on it being erased.
  4. Law enforcement officials would send the encrypted PIN to the manufacturer. Once the manufacturer is certain the warrant is valid, it would use the private key stored in its secure vault to decrypt the PIN and provide it to the law enforcement officials.
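In code terms, Ozzie's scheme is just textbook public-key escrow of the PIN.  Here's a minimal sketch of the quoted steps in Python, using RSA-OAEP purely for illustration - the function names (make_vendor_keypair, escrow_pin, recover_pin) are mine, not anything Apple or Ozzie has specified, and step 3's recovery-mode/bricking behavior is procedural so it isn't shown:

```python
# Hypothetical sketch of the "Clear" escrow flow described above.
# Nothing here is a real Apple API; the names and the algorithm choice
# (RSA-OAEP) are assumptions made for illustration only.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def make_vendor_keypair():
    """Step 1: the manufacturer generates a keypair; the private half lives in the vault."""
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
    return private_key, private_key.public_key()

def escrow_pin(vendor_public_key, pin: str) -> bytes:
    """Step 2: the device encrypts the user's PIN with the vendor's public key
    and stores the resulting blob locally."""
    return vendor_public_key.encrypt(pin.encode(), OAEP)

def recover_pin(vendor_private_key, escrow_blob: bytes) -> str:
    """Step 4: after the warrant is validated, the vault decrypts the blob."""
    return vendor_private_key.decrypt(escrow_blob, OAEP).decode()

# The math is the easy part:
vendor_priv, vendor_pub = make_vendor_keypair()
blob = escrow_pin(vendor_pub, "483926")      # happens on the device (step 2)
print(recover_pin(vendor_priv, blob))        # happens inside the vault (step 4)
```

The cryptography here is twenty-odd lines and entirely unremarkable.  Everything interesting - and everything that fails - happens around it.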
Well, so what's the problem?  After all, the keys used to sign software are incredibly sensitive - if someone gains access to your keys they could sign all sorts of malware which people's computers would recognize as coming from you.  So keeping these keys on the same secure system solves the problem, right?  Well, no:
Yes, Apple has a vault where they've successfully protected important keys. No, it doesn't mean this vault scales. The more people and the more often you have to touch the vault, the less secure it becomes. We are talking thousands of requests per day from 100,000 different law enforcement agencies around the world. We are unlikely to protect this against incompetence and mistakes. We are definitely unable to secure this against deliberate attack.

Ozzie makes an assumption that makes sense only if you don't understand operational procedure.  Yes, we've secured the keys under Scenario A.  Ozzie spends absolutely no time at all showing how his Scenario B is similar to Scenario A.  Quite frankly, it's not even in the same ballpark.  A procedure that is performed a few dozen times a year can be handled quite well by a small group of highly security-savvy people; the same procedure performed thousands of times a year simply cannot be.  I myself have been part of these teams: there is a very high level of awareness of the implications of any screwup, so the procedure requires multiple team members to have joint, simultaneous access to the system before anything happens.  Think of it as sort of like ICBMs, where two launch keys have to be turned simultaneously, with the locks on opposite sides of the room.  The Launch Operators have to work together to make it happen.
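To make "joint simultaneous access" concrete, here's a toy sketch of the dual-control idea: two different operators must approve within the same short window before the system will do anything.  The class and method names are made up for illustration; no real HSM or key vault exposes an API like this:

```python
# Toy dual-control ("two launch keys") check, for illustration only.
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class UnlockRequest:
    device_id: str
    approvals: dict = field(default_factory=dict)   # operator name -> approval time

    def approve(self, operator: str) -> None:
        self.approvals[operator] = datetime.utcnow()

    def authorized(self, window: timedelta = timedelta(seconds=30)) -> bool:
        """Require at least two *different* operators acting within the same short window."""
        if len(self.approvals) < 2:
            return False
        times = sorted(self.approvals.values())
        return times[-1] - times[0] <= window

req = UnlockRequest("device-1234")
req.approve("operator_a")
req.approve("operator_b")
print(req.authorized())   # True only with two distinct, near-simultaneous approvals
```

Every unlock request means another trip through a ceremony like this, and every trip is another chance for it to be rushed, skipped, or gamed.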

That works for a few launches, but couldn't possibly work for a dozen a day or more, which is what Law Enforcement would want.  The system will collapse - or be intentionally subverted - in a thousand different ways:
We have a mathematically pure encryption algorithm called the "One Time Pad". It can't ever be broken, provably so with mathematics.

It's also perfectly useless, as it's not something humans can use. That's why we use AES, which is vastly less secure (anything you encrypt today can probably be decrypted in 100 years). AES can be used by humans whereas One Time Pads cannot be. (I learned the fallacy of One Time Pads on my grandfather's knee -- he was a WW II codebreaker who broke German messages trying to futz with One Time Pads).

The same is true with Ozzie's scheme. It focuses on the mathematical model but ignores the human element. We already know how to solve the mathematical problem in a hundred different ways. The part we don't know how to secure is the human element.

How do we know the law enforcement person is who they say they are? How do we know the "trusted Apple employee" can't be bribed? How can the law enforcement agent communicate securely with the Apple employee?

You think these things are theoretical, but they aren't.
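Graham's One Time Pad analogy is easy to see in code.  The cipher itself is a one-liner - XOR the message with a truly random key of the same length - and it is provably unbreakable.  Everything that makes it useless lives outside the math: generating, distributing, and never reusing those keys.  A minimal sketch, for illustration only:

```python
import secrets

def otp(message: bytes, key: bytes) -> bytes:
    """One Time Pad: XOR with a random key of equal length.  Encryption and
    decryption are the same operation, and that's the entire algorithm."""
    assert len(key) == len(message)
    return bytes(m ^ k for m, k in zip(message, key))

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))   # the hard part: getting this key to the other
ct = otp(msg, key)                    # side, once, in secret, and never reusing it -
assert otp(ct, key) == msg            # that's the human problem the math can't fix
```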
Nope, these things aren't theoretical at all.  This isn't about hardware or software, it's about wetware (sometimes called "peopleware").  People, as we all know, are prone to making mistakes and to being corrupted.  And this is where the Geek With Guns makes his point:
What’s noteworthy in regards to this post is the fact that nowhere does the Fourth Amendment state that measures have to be taken to make information easily accessible to the government once a warrant is issued. This omission is noteworthy because a lot of the political debates revolving around computer security are argued as if the Fourth Amendment contains or implies such language.
We are often asked "Why do you need an AR-15?"  The question implies that we need to justify ourselves to some government authority and get permission before we can do something.  You could easily rephrase the question as "Why do you need a phone that keeps your secrets from everyone?"*

The answer to both questions, of course, is identical.  Because fuck you is why.

* Assuming you can get one of these today.  Which you can't.  But while Law Enforcement could indeed get all your data from the companies that collect it, it would be a royal pain in the behind for them to go to Apple, Google, Facebook, Twitter, and all the other companies with warrants.  Which is why they want backdoors.

3 comments:

Old NFO said...

F them... Warrants, period. No damn back doors!

Archer said...

"Universal Secure Backdoor" is a contradiction in terms. If it's universal, then by definition it is not secure; and if it's secure, then by definition it is not universal.

Also, I noticed the acronym, "USB". Is that intentional or accidental? I'm asking because AFAIK, historically, those who seek to unlock an iPhone have needed to have physical access to the device's hard drive or data port, which almost guarantees some kind of USB (Universal Serial Bus) cable or adapter. Will they intend that both "USBs" be one-and-the-same?

Or was that just a coincidence?

Jonathan H said...

The proponent ignores that digital certificates for software HAVE gotten out at times in the past, and while they provide a higher level of security than unsigned software, they are far from foolproof.
In this proposal, the manufacturer wouldn't even have to have the device in hand to do it - that makes the system much more susceptible to problems.
If the NSA themselves can't prevent information from leaking out, how is anyone else expected to?