The Dark Side of the “Apple vs. FBI” OIG Report

Before I built a wall I’d ask to know
What I was walling in or walling out,
And to whom I was like to give offense.
Something there is that doesn’t love a wall,
That wants it down.

- Robert Frost, “Mending Wall”

I’m sure I’m not the only one who read the recent DOJ Office of the Inspector General’s report on the FBI’s internal handling of the San Bernardino shooter’s iPhone and smugly thought, “I knew it!” But the report has me worried as well, because it suggests that the FBI’s criminal investigators may be about to get significant new powers, in a way that lacks accountability and transparency.

You’ll recall that in February and March 2016, the government picked a very public fight with Apple, popularly called “Apple vs. FBI,” over the locked, encrypted iPhone used by one of the San Bernardino shooters. Since 2014, federal law enforcement officials had been decrying Apple’s efforts to strengthen iPhones’ encryption, because those improvements meant that Apple could no longer extract data from encrypted iPhones for law enforcement. In the “Apple vs. FBI” case, the government tried to misuse a 1789 law called the All Writs Act to dragoon Apple into writing custom software code to help the FBI get into the phone. The government argued in court—and then-FBI Director James Comey testified to Congress—that Apple’s assistance was necessary because the FBI, despite really trying, had no other way to get into the phone. And then, the day before a March 2016 hearing that would have tested the government’s legal theory and factual claims, the DOJ suddenly announced that an outside vendor had found a way to access the phone. It dismissed the case shortly thereafter, without naming the vendor. (For a link round-up of blog posts by CIS members about the case, see here.)

This last-minute revelation appears to stem from a communication breakdown within the FBI. The March 2018 OIG report notes that, unbeknownst to Comey, around the same time as his testimony, one unit of the FBI had an outside vendor that was very close to a solution for getting into iPhones like the one used in San Bernardino. That unit, the Remote Operations Unit (ROU), is part of the FBI’s Operational Technology Division (OTD). According to the report, ROU “provides computer network exploitation capabilities,” principally in national security matters—meaning its tools and techniques are classified. Another sub-unit of OTD, the Cryptographic and Electronic Analysis Unit (CEAU), provides digital forensics assistance for devices such as smartphones, “primarily, but not exclusively, in support of criminal matters.” Those matters included the San Bernardino investigation.

According to the report, ROU’s chief was never asked to help CEAU find a solution for getting into the San Bernardino iPhone. One reason for this disconnect between CEAU and ROU was that the CEAU chief wanted to set a legal precedent in the Apple case. It appears from the report that the CEAU chief didn’t try very hard to find out which other units within the FBI might be able to assist. When ROU’s outside vendor cracked the phone, the CEAU chief was very upset, because it meant the case against Apple could no longer proceed.

The report’s account of the CEAU chief’s personal agenda confirmed what many in the privacy and security community had long suspected: that the Apple vs. FBI battle wasn’t about getting into this particular device. Rather, it was about creating a legal precedent for posterity that would permit the government to force companies (such as smartphone manufacturers) to assist law enforcement in accessing digital evidence, even if that meant undermining the security and integrity of their own products and services. It was never about “just this one time, for just this one phone,” as the FBI had insisted at the time.

But that’s only half the story the OIG report reveals. Another reason the ROU was not asked for help, according to the unit’s chief, was “a long-standing policy that … created a ‘line in the sand’ against using national security tools in criminal cases.” That is, “ROU’s classified techniques could not be used in criminal cases.” According to the report, a 2002 DOJ policy imposes significant “procedural requirements … before using classified investigative technologies in criminal cases.” The ROU chief was aware of only two times the FBI had invoked those procedures, “which demonstrated to him that using a classified technique in a criminal case was difficult.” This policy, the report concluded, also helped account for the CEAU chief’s focus “only on unclassified techniques that could readily be disclosed in court.”

The OIG report ends by noting that the OTD now requires CEAU and ROU to “de-conflict” “whenever addressing devices in need of a technical solution.” The report also recommends changes within the FBI “to ensure the full coordination that such incidents clearly demand.” It states that the OTD intends to reorganize and add a new section “to consolidate resources to address the ‘Going Dark’ problem” and “improve coordination between the units” that work on digital devices. The OIG believes these efforts “should help to avoid some of the disconnects” that occurred in the Apple vs. FBI case.

On the surface, this might seem like a good thing. It was exasperating to read the OIG report and find out that the OTD’s left hand didn’t know what the right hand was doing (and apparently didn’t want to know). If the FBI has to get its act together rather than putting pressure on tech companies to do its bidding, we might be tempted to laud that development. But we should be very wary of cheering any crumbling of the internal OTD “wall” between criminal and national security cases. Here’s why.

The Sixth Amendment guarantees criminal defendants the right to a fair trial, and per Brady v. Maryland, the Fourteenth Amendment’s Due Process Clause requires the prosecution in a criminal case to turn over all exculpatory evidence to the defendant. That means that when criminal investigators use technological techniques to surveil a suspect and gather digital evidence against him, they should have to disclose those techniques to him in the ensuing prosecution.

But that hasn’t always been the case. Several years ago, the government used malware that exploited a vulnerability in the Tor browser in order to catch visitors to a dark-web child-pornography site. (In its warrant application, the government euphemistically called the malware a “Network Investigative Technique,” or NIT, which raises the possibility that the judge who approved the warrant did not fully understand what she was authorizing.) The DOJ subsequently prosecuted dozens of people nationwide on the basis of this single “NIT warrant.” After those cases began, the government classified parts of the NIT, and then resisted disclosing the details of the NIT to defense counsel. In almost every case, the courts agreed that the government did not have to disclose. But in the one case where the court ruled that it did, the government dropped the case—against an accused child pornographer, supposedly one of the “worst of the worst” kinds of criminal—rather than reveal the NIT’s secrets. CIS explored these evidentiary issues last year in an event we co-hosted with Mozilla and in a resulting blog post.

As the OIG report notes, the reason for the wall between criminal and national security technological tools is that the latter are classified, whereas the former are not. National security tools are not designed or deployed with an eye towards having to reveal them to a judge and defense counsel in court later—nor towards having to authenticate the evidence they gather. If investigators use a technological tool to get evidence from a suspect’s device and then want to introduce that evidence in court, they will have to prove to the court that the tool worked as intended and that the evidence is sound—that is, that the tool did not create or alter the data being introduced. Authenticating evidence in this way is consistent with the high standard in criminal matters of proving guilt beyond a reasonable doubt. But those considerations are not so much in play in the national security or intelligence context.
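
To make the authentication point concrete: a standard digital-forensics practice is to record a cryptographic hash of extracted data at the moment of acquisition, so that anyone can later verify the data was not created or altered in the interim. The sketch below is purely illustrative and not drawn from the OIG report; the file name and recorded hash value are hypothetical.

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hash recorded when the device image was first acquired (hypothetical value).
ACQUISITION_HASH = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"

# Re-hash the same (hypothetical) image before it is introduced as evidence.
if sha256_of_file("device_image.dd") == ACQUISITION_HASH:
    print("Image matches the acquisition hash: contents unchanged.")
else:
    print("Hash mismatch: the image was altered or corrupted after acquisition.")
```

Note what a matching hash does and does not show: it shows only that the data is unchanged since acquisition. It says nothing about whether the tool that extracted the data worked as intended, which is exactly the question a court would need the tool’s details to answer.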

If the OIG report prompts the FBI to give the CEAU, which focuses on criminal matters, more access to tools developed or acquired by ROU, which focuses on national security matters, that could have a detrimental effect on federal criminal cases. When seeking search and seizure warrants, the FBI may not fully explain to judges that it is asking for authorization to use sophisticated, classified technological techniques to extract evidence from defendants’ devices. In the resulting prosecutions, the government may refuse to disclose information about the classified technique, or even its existence, to defense counsel or experts. That secrecy will impair the court’s truth-seeking function as well as the defendant’s ability to mount a defense.

What is more, removing the divide between criminal and national security tools could ultimately hurt the FBI, too. If courts do order disclosure of the FBI’s techniques in criminal cases, the FBI’s national security and intelligence units might decide they cannot risk using those techniques anymore. That is a significant reason why the wall was there in the first place: to protect those missions.

In short, it is not good news—for the courts, for criminal suspects, maybe not for the FBI itself—that the Bureau is apparently planning to lower the criminal/national security wall for high-tech investigative tools in the wake of the OIG’s Apple vs. FBI investigation. Until now, according to the OIG report, “using a classified technique in a criminal case was difficult.” It should be difficult. It should not be normalized. But that may be the result of the FBI’s planned changes.

It is ironic that the OIG report on the FBI’s behavior during Apple vs. FBI may lead to the FBI’s criminal investigators achieving that case’s objective: getting more capabilities to crack into digital devices. But if they get that increased power, it won’t be because a federal court approved the agency’s overbroad interpretation of the All Writs Act, as the CEAU chief apparently hoped. Rather, it will be because the OIG report prompted internal changes within the Bureau.

Despite the OIG’s internal watchdog role for DOJ, secretive policy changes within an executive-branch agency are not amenable to checks and balances or public oversight. Apple vs. FBI was an adversarial proceeding between DOJ and Apple, conducted in an independent, coequal branch of government, before a detached, neutral magistrate judge, open to public observation and input. (The case garnered about 20 amicus curiae briefs, including CIS’s, before its sudden end.)

By contrast, if the FBI starts pulling down the wall between the investigative techniques used in criminal cases and those used in national security matters, not only will the public and the judiciary have no say in that change, we may not even know it’s happening. The OIG asked the FBI to provide a status report later this year on the implementation of its reorganization plans. But the OIG report also redacts information about OTD, CEAU, and ROU and what they do. We can expect the same to be true of the status report, if and when the FBI provides one to the OIG. And that’s assuming the OIG will make even a redacted version public.

The CEAU chief wanted the FBI’s criminal investigations to have more power when attempting to access encrypted digital evidence. Due to the OIG report’s recommendations, it looks like he will get it—through the back door.
