Stanford CIS

Some Thoughts About Apple’s New Advanced Data Protection Feature

By Riana Pfefferkorn

Last week, Apple announced several new security features: the ability to authenticate contacts in iMessage (reminiscent of Signal’s verification feature), support for hardware security keys, and opt-in end-to-end encryption (E2EE) for iCloud, which Apple calls Advanced Data Protection (ADP for short). Previously, 14 categories of data (such as health data and payment information) were already E2EE in iCloud; for users who turn on ADP, that number goes up to 23. ADP is available for Apple beta users now, with plans to roll it out in the U.S. by year’s end and to expand to other countries next year.

In addition, Apple confirmed that it is conclusively abandoning the controversial plan it had announced last summer to scan photos client-side, as they’re being uploaded to iCloud, to look for child sexual abuse material (CSAM). That plan was put on pause after massive pushback from civil society (including yours truly). Observers commented at the time that this client-side scanning feature, where the scanning would happen before an image hit the cloud, only made sense if Apple was planning to E2EE iCloud (since scanning wouldn’t be possible in an E2EE cloud) – and here we are.
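The timing logic can be illustrated with a toy sketch. This is emphatically not Apple’s actual design (which used a perceptual hash called NeuralHash plus private set intersection, and is far more involved); exact-match SHA-256 hashing and a made-up hash list stand in here just to show why hash-matching must run on the device, against plaintext, once the cloud copy is E2EE:

```python
import hashlib

# Hypothetical database of digests of known abusive images (illustrative only).
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def client_side_check(photo_bytes: bytes) -> bool:
    """Runs on the device, against the plaintext photo, *before* upload.
    This is the only place matching can happen if the cloud copy is E2EE."""
    return hashlib.sha256(photo_bytes).hexdigest() in KNOWN_BAD_HASHES

def server_side_check(ciphertext: bytes) -> bool:
    """Runs in the cloud. With E2EE, the server holds only ciphertext, whose
    hash bears no relation to the hash of the underlying photo, so matching
    against a plaintext hash list is impossible."""
    return hashlib.sha256(ciphertext).hexdigest() in KNOWN_BAD_HASHES

# The client sees the match; the server, given an encrypted blob, cannot.
plaintext = b"known-bad-image-bytes"
ciphertext = bytes(b ^ 0x5A for b in plaintext)  # stand-in for real encryption
```

Run against these toy values, `client_side_check(plaintext)` matches while `server_side_check(ciphertext)` does not, which is the whole argument: pre-upload scanning and an E2EE cloud fit together precisely because post-upload scanning no longer can.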

These security features – and the death of the client-side scanning plan – are all great news, but it’s ADP that’s gotten the most attention. E2EE cloud backups are a big deal, as Matt Green explains here. Apple itself commissioned a whitepaper, released concurrently with this announcement, about the rising security threat to consumer data stored in the cloud. Given the privacy and security gains of making cloud storage E2EE, Apple had reportedly planned for iCloud E2EE in the past, but had backed away from the idea. Now, it’s back – but only for those who choose it. What’s more, not everyone gets to choose it: managed Apple IDs and child accounts aren’t eligible for ADP, according to Apple’s documentation.

Today, I’m on the Lawfare podcast discussing ADP: what does it mean for users who choose it and for law enforcement investigations, why make it opt-in and what are the ramifications of that choice, the FBI’s half-hearted reaction, and so forth. It was a great discussion where I had the chance to talk through a lot of what’s been on my mind since last week’s announcement. There are some points I made on the podcast that I’d like to reiterate here, plus a couple more I’d like to add that I didn’t have the opportunity (or time) to bring up during the conversation.

China

For Apple users in mainland China, their iCloud data is held by a government-owned Chinese company. Apple struck this deal with the surveillance-happy Chinese government so it could keep operating in the massive Chinese market. The concessions Apple has made to operate in China are in tension, to put it lightly, with its portrayal of itself as the champion of user privacy and security – China is the big unspoken asterisk on that branding. In announcing ADP, Apple claimed that E2EE iCloud will eventually be offered in China, too, as part of ADP’s gradual global rollout.

Sure, pal. Call me when that happens. Let’s move on.

Speaking of government regulation…

Children’s Online Safety Bills

The UK and California have enacted, and members of the Senate have proposed, regulatory requirements for online services to protect their child users’ safety. The UK’s Age-Appropriate Design Code came into force last fall. California’s law, also called the Age-Appropriate Design Code Act, passed earlier this year (and is the subject of a lawsuit just filed today). In Congress, Senators Blumenthal and Blackburn have introduced the Kids Online Safety Act (KOSA) bill.

All three require online services to set minors’ accounts to the most privacy-protective settings by default. Because it’s E2EE, ADP is inarguably a higher privacy setting than Apple’s standard data protection for iCloud. That seems to tee up a potential conflict with these requirements, insofar as (1) ADP is opt-in, not on by default, and (2) child accounts are not eligible for ADP at all.

(By the way, it’s important to note that there is other proposed online safety legislation in the UK that could spell trouble for ADP for different reasons. But I want to talk about the AADC.)

Will the ADP plan violate the UK and California AADCs that are already in effect? I doubt it, because they have carve-outs. Standard 7 of the UK AADC (whose coverage is broad enough to include iCloud) says, “Settings must be ‘high privacy’ by default (unless you can demonstrate a compelling reason for a different default setting, taking account of the best interests of the child).” Similarly, California’s AADC, Cal. Civ. Code § 1798.99.31(a)(6), requires “[a] business that provides an online service, product, or feature likely to be accessed by children” to “[c]onfigure all default privacy settings provided to children by the online service, product, or feature to settings that offer a high level of privacy, unless the business can demonstrate a compelling reason that a different setting is in the best interests of children.”

Apple could argue that it has compelling reasons not to make ADP available and on by default for child accounts. Those compelling reasons could include (1) allowing the parent’s iCloud account to continue to have certain controls over the child’s iCloud account (hey, regulators love parental controls online) and (2) avoiding data loss if the child loses access to their iCloud and also loses their recovery key (look, if you think adults are forgetful and clumsy…). I think Apple can thread the needle when it comes to these high-privacy-by-default obligations in the two AADCs.

But the corresponding provision in the KOSA bill does not include a “compelling reason” carve-out. The language of this bill has been in flux. In the bill text here, section 4(a)(2) says,

“DEFAULT SAFEGUARD SETTINGS FOR MINORS.—A covered platform shall provide that, in the case of a user that the platform knows or reasonably believes to be a minor, the default setting for any safeguard [to control the minor’s experience and personal data on the covered platform] shall be the strongest option available.”

In bill text that I believe just came out yesterday, available here, section 4(a)(3) says,

“DEFAULT SAFEGUARD SETTINGS FOR MINORS.—A covered platform shall provide that, in the case of a user that the platform knows or should know is a minor, the default setting for any safeguard … shall be the option available on the platform that provides the most protective level of control that is offered by the platform over privacy and safety for that user.”

(The bill defines “covered platform” to mean “a commercial software application or electronic service that connects to the internet and that is used, or is reasonably likely to be used, by a minor.”)

There’s no “compelling reason/best interest” carve-out here. It therefore seems to me that Senator Blumenthal’s bill would require Apple to not just make ADP available for child accounts, but opt them all into it, so that all children’s iClouds are E2EE by default.

Of course, as I’ve written about too many times to count, Senator Blumenthal hates strong encryption. He and other senators (including his KOSA co-sponsor Senator Blackburn) hauled Apple in front of the Senate Judiciary Committee three years ago to bawl Apple out over how its encrypted services are supposedly enabling the abuse of children, and called on Apple to do more for child safety. You can easily draw a direct line from that hearing to Apple’s failed client-side scanning proposal.

And yet, Blumenthal’s children’s online safety bill would seem to mandate default E2EE for children’s iCloud accounts. I’m not sure he’s realized that yet. Once he does (and oh, to be a fly on the wall at that moment), I anticipate we’ll see him add an AADC-like “compelling reason/best interest” exception with a quickness.

The story from regulators the world over always seems to be, “You must protect your users’ privacy, especially that of child users, but you shouldn’t use strong encryption,” without reckoning with the incompatibility of these two demands. I guess the real ask is “Protect privacy, just not too well.”

And speaking of children…

Is Apple Going to Scan Non-E2EE iCloud Accounts for CSAM?

One thing we learned during the hullabaloo last year over Apple’s client-side scanning plans was that Apple did not scan users’ cloud account contents for CSAM – hence the plan to scan at the point of upload. This made Apple an outlier in contrast to other major cloud storage providers’ longstanding practices. (That said, it’s long been possible to encrypt your files before uploading them to the cloud. To which Dropbox recently decided: if you can’t beat ‘em, acquire ‘em.) When Apple backed off its plans last summer, that raised the question: What was the company going to do instead – if anything?

The scotched plan was, as said, evidently a response to criticism that Apple ought to do more to detect CSAM on its services. (Note that Apple was already scanning Mail attachments for CSAM. Since iCloud Mail is excluded from ADP for interoperability reasons, along with Calendar and Contacts, ADP won’t impede Mail scanning.) The client-side scanning plan was essentially a middle ground between the prior status quo for Apple and that of its cloud storage competitors: OK, scan, but not in the cloud. With that plan gone, was Apple going to revert to its status quo, or would it quietly fall into line with its competitors?

Apple never announced what it would do instead, and now that it’s announced ADP, the question holds renewed relevance. As said, E2EE iCloud data can’t be scanned for abuse. But if users can now opt in to E2EE their iCloud Photos, you have to wonder: For users who don’t opt into ADP, is Apple scanning their iCloud Photos for CSAM (or other abusive content)? And if so, with the ADP rollout, will Apple start using any data or insights from non-E2EE iCloud scanning to try to detect abuse in E2EE iCloud accounts? (Or, for that matter, will it analyze non-E2EE metadata of ADP accounts to try to detect abuse?)

In the new world of ADP, scanning non-E2EE iCloud data would still be relevant in at least two contexts: child accounts and Shared Albums in Photos. Both, as said, are excluded from ADP. If Apple does start (or has already started) to scan non-E2EE iCloud data for CSAM, those could prove to be two important sources of abuse detection. I noted on the Lawfare podcast that a would-be CSAM trader who’s used to iMessage being E2EE by default, and who opts into ADP on the understanding that Photos in iCloud will now be E2EE, might mistakenly think that Shared Albums will behave the same way – and get caught by a scan. Likewise, scans might detect child abuse if child iCloud accounts hold CSAM that a child user sends or receives (i.e., in a grooming or sextortion context).

If Apple offers opt-in E2EE but starts scanning non-E2EE iCloud accounts, then these two carve-outs (three if you count Mail) could support Apple’s narrative about why offering E2EE isn’t a step backwards for its child safety efforts (which had already been criticized as inadequate). Offering E2EE iCloud while scanning non-E2EE iCloud would let Apple claim to be protecting user privacy and security and child safety at the same time – even if “we don’t scan iCloud” was previously an important component of its user privacy story.

To be clear, I have no idea whether Apple is, or has plans to start, scanning non-E2EE iCloud for CSAM. But it would be awfully nice to get a public answer to that question.

Conclusion

There are a lot of other ramifications of ADP that I haven’t yet thought through. (Such as: how will E2EE cloud storage affect legal doctrines such as reasonable expectation of privacy and the third-party doctrine?) And the obvious questions about ADP’s impact on law enforcement investigations haven’t been taken up by Apple to date. Apple can’t avoid those questions forever, given the foregrounding of law enforcement investigative equities in the ongoing encryption debate.

Of course, improving privacy and security through encryption is a way to prevent crime from happening in the first place – that’s what Apple’s emphasizing in introducing ADP. I agree with Apple that the benefits of making strong encryption ever more ubiquitous will continue to outweigh the drawbacks. I haven’t decided yet whether ADP will be the right choice for me. But I’m happy that I’ll now have it as an option.