In the upcoming version of Apple’s iPhone operating system, iOS 12, the data connection on the phone’s Lightning port (used for both charging and data transfer) will be disabled an hour after the phone is locked. The device will still charge, but transferring data to or from it over the Lightning cable will first require entering the device’s passcode.
Connecting to the data port via Lightning cable is what third-party forensic devices called Cellebrite and GrayKey rely upon to extract data from locked, encrypted iPhones. These tools (made, respectively, by the eponymous Cellebrite and a company called Grayshift) are employed by U.S. law enforcement agencies at federal, state, and local levels. Unsurprisingly, just about everybody covering the story is framing Apple’s move as one that will thwart law enforcement.
While it may be attention-grabbing, this framing ignores the many other actors out there who would love to use Cellebrites or GrayKeys to get into the iPhones of persons of interest to them, if they aren’t doing so already. Apple doesn’t know who all of Cellebrite’s and Grayshift’s customers are. But it can guess that not all of them have the public’s best interest in mind. Some of those clients (whether the vendors know it or not) may be repressive governments, crooked cops, or organized crime.
A repressive government might use a Cellebrite or GrayKey tool to find out whether a phone’s owner has been engaging in dissident political activity. In a country where it’s illegal to be gay, the police might use those tools to crack into phones and look for evidence of the owner’s homosexual relationships. A police officer might misuse the tool to get access to what’s on his wife’s phone, alongside other tools for spying on one’s partner. (Police officers commit intimate partner violence at higher rates than the general population.) Foreign intelligence agencies are keenly interested in high-level U.S. government officials’ phones, as evidenced by the rogue Stingray problem in D.C. right now.
Maybe Cellebrite and Grayshift are choosy about whom they sell to. Maybe they put restrictions in their contracts, making customers promise not to misuse the tools. But Apple has no visibility into or control over the vendors’ client lists or whether they abide by those agreements, and it has no recourse against the vendors’ customers if they don’t.
And those are just the direct customers. Even if all of the initial purchasers of GrayKey and Cellebrite are what we’d consider legitimate, democratic law enforcement agencies, these tools will eventually turn up on secondary markets. Just search for “IP Box” on eBay. IP Box is a hacking tool for getting into smartphones that state and federal law enforcement have used in criminal cases in the past. But it’s also available to anyone with an eBay account and a few hundred dollars. There is no reason to think that that won’t eventually be the case with GrayKey or Cellebrite tools, too (at least the models that can be used in the field without the need for the vendor’s involvement). Organized crime rings can use these tools to extract sensitive personal or financial information from stolen phones and use that data for identity theft or similar criminal purposes.
These are not theoretical concerns, and they have nothing to do with thwarting legitimate law enforcement investigations. This is why the framing in so many news reports of “Apple will stymie law enforcement” is so wrongheaded. Once a company learns about a third-party tool that undermines its product’s security, the only responsible thing to do is to figure out what security flaw the tool exploits and fix it. Even if the initial users of the tool are law enforcement agents from a democratic country who have a warrant supported by probable cause, that’s not always going to be the case. Apple is doing the responsible thing by adding USB Restricted Mode to iOS 12.
But it won’t be the last time Apple has to take countermeasures against tools for cracking into its encrypted devices. This is a cat-and-mouse game. As I noted in an amicus brief I co-authored in the “Apple vs. FBI” case in 2016, there will always be security flaws in every model of iPhone and every version of iOS, despite Apple’s best efforts. Vendors like Cellebrite and Grayshift—as well as the FBI’s own internal staff, jailbreakers, bug bounty hunters, and so on—will hammer on every new version to find the bugs and then develop or update their tools to exploit those bugs.
iOS 12’s “USB Restricted Mode” is only a temporary setback for those developers. In fact, it’s one that Grayshift reportedly may already have overcome. If media outlets are going to headline their stories “iOS 12 May Hamper Law Enforcement,” they could at least tack on “For a Little While.”
Another reason it’s careless to frame USB Restricted Mode as something that will harm law enforcement is that we don’t even have an accurate picture of the current impact of the iPhone’s security features on law enforcement’s ability to do its job effectively. We don’t know how many iPhones law enforcement has been able to open using tools whose functionality will be impeded by USB Restricted Mode. We don’t know how many iPhones investigators couldn’t open despite the availability of those and other forensics tools (whether third-party or home-rolled). Indeed, until recently, even the DOJ and FBI themselves apparently didn’t know that figure. They admitted last month that they had drastically overstated the number of phones they can’t unlock due to encryption. They also admitted that the number is hard to pin down, because investigators are sometimes later able to access a phone they initially couldn’t open. And they’ve never divulged the number of cases they were able to resolve anyway despite an unopenable phone. Senator Wyden sent the head of the FBI a letter on May 23 demanding answers to these questions. He asked for a reply by yesterday. As far as I know, he hasn’t gotten an answer yet.
We don’t know how big the “going dark” problem actually is, or how big an impact USB Restricted Mode will have on Cellebrite and Grayshift. But we can be sure those companies won’t throw up their hands, give up, and close up shop. They’ll keep finding ways to make their tools work.
The bad news is that the iPhone will never be totally free of security flaws. But the good news is that this update will definitely improve user security, by closing off one potential avenue of malicious access to iPhones. And we can all use a little more good news these days.