Dark Patterns and the CCPA

This past spring, while the pandemic was worsening around us and the first shelter-in-place orders went into effect, I began an exploratory research project with then-graduate student researcher (now Stanford Ph.D. graduate) Andreas Katsanevas and a team of fantastic undergraduates: Claudia Bobadilla, Nivedha Kelley Soundappan, Emilia Porubcin (all from Stanford), and Morgan Livingston (U.C. Berkeley). We met virtually each week to examine how companies were implementing their California Consumer Privacy Act (CCPA) notices, focusing primarily on the Do Not Sell (DNS) requirement, but also reviewing their privacy policies and their processes for exercising CCPA rights. We examined a wide variety of companies (looking beyond the obvious tech giants) that we expected to be subject to the CCPA. In addition to examining privacy policies, DNS links, and forms for compliance with the statute, we also looked for evidence of dark patterns: “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice.” We were curious to see whether any of these CCPA compliance mechanisms erected barriers that discourage or prevent consumers from exercising their new rights.

Unfortunately, we found several forms of dark patterns that challenge or interfere with Californians’ ability to submit DNS requests. To be clear, we are not focused on proving any intent by these companies to deceive or interfere; ultimately, what matters here is not intent but the result, whether it stems from a desire to prevent consumers from exercising these requests or simply from sloppy design. Based on my own past experience in the corporate world, I’m personally inclined to blame the latter: infrastructure such as help pages, or, at the very bottom of the heap, privacy policies, is typically a very low priority at most companies, not the area where core resources are devoted. Regardless, it is a sign of how little time and effort goes into ensuring that these processes meet anything beyond a minimum bar of compliance, and of how little companies invest when it comes to treating these rights seriously. (Curiously, we also observed wide variability in the quality of CCPA forms created by OneTrust, one of the primary companies offering CCPA compliance solutions, especially for brick-and-mortar retailers lacking the digital resources to host their own. We wonder aloud why OneTrust doesn’t do a better job of ensuring that all of its clients use forms and processes for CCPA compliance that are consistent in their user experience and meet a minimum standard of usability.)

After reviewing these findings with Adriana Stephan, an M.A. student in cyber policy here at Stanford who helped translate them into specific recommendations, on Oct. 28 we submitted a summary of our findings to the California Attorney General’s office in response to their latest round of CCPA revisions, which included explicit language on dark patterns within DNS processes. We took this opportunity to voice our support for the revisions, to include examples from our own research, and to call out additional areas that we think would benefit from future clarification. Our primary findings and recommendations are reproduced below, and our letter (including examples of our findings) is attached to this blog post.

Summary of dark pattern findings:

  • Do Not Sell flows (the steps a consumer takes from initiating a Do Not Sell request through its completion) that included unnecessary steps, such as:

  1. Sending consumers from the DNS link on a company’s homepage to the company’s privacy policy page (or other indirect routes), rather than directly to a DNS form, thus requiring consumers to hunt through the policy to find the link to the DNS form;

  2. Requiring consumers to select a button or toggle embedded within a page to make a request, often without instructions or clear labels, such that it is unclear which option activates the DNS setting;

  • DNS forms that asked consumers to provide personal information that appeared extraneous to the DNS request;

  • Forms offered only in English by companies that likely have large non-English speaking customer bases;

  • DNS landing pages and/or forms that used confusing language (e.g., double negatives) or manipulative language (e.g., emotionally charged or guilt-inducing phrasing) that attempted to persuade consumers not to exercise their rights;

  • DNS landing pages that included copious text preceding the form that was not directly relevant to making a request. Forcing consumers to spend additional time or energy reading extraneous information may decrease the likelihood that they complete a DNS request;

  • Companies that honor DNS requests only via email often provided little or no instruction to consumers about how to complete the request (e.g., what information to include in the email), did not offer automated shortcuts for composing emails (e.g., mailto functionality that can pre-populate an email with the address and subject line when clicked; see the sketch after this list), and provided email addresses that appeared to be non-specific to DNS requests, which may increase the burden on the consumer to engage in continual back-and-forth with the company to complete the DNS request.
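
As an aside on the mailto point above: a pre-populated mailto link can fill in the recipient, subject, and body so the consumer only has to press send. The sketch below is purely illustrative; the address, subject, and body text are hypothetical placeholders we chose for the example, not taken from any company we reviewed.

```typescript
// Illustrative sketch only: build a pre-populated mailto: link for a DNS request.
// The address, subject, and body below are hypothetical placeholders.
function buildDnsMailtoLink(address: string): string {
  const subject = encodeURIComponent("CCPA Do Not Sell Request");
  const body = encodeURIComponent(
    "Please do not sell my personal information, as provided for under the CCPA."
  );
  return `mailto:${address}?subject=${subject}&body=${body}`;
}

// Attach the link to the DNS landing page so one click opens a ready-to-send email.
const link = document.createElement("a");
link.href = buildDnsMailtoLink("privacy@example.com");
link.textContent = "Email your Do Not Sell request";
document.body.appendChild(link);
```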


Specific Recommendations:

    1.    Provide forms, rather than email addresses, for consumers to make DNS requests 


    2.    Offer DNS forms in languages other than English, and use simple, easy-to-understand language


    3.    Avoid crowding DNS forms with extraneous information


    4.    Provide consumers a streamlined form that does not require them to take extraneous steps to complete a DNS request. For multiple-purpose forms (e.g., forms that also allow consumers to exercise their deletion and access rights), make the selection choices simple and clear.


    5.    Absent a mandate to respect Global Privacy Control signals, provide a standardized interface for consumers to exercise their DNS rights.
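
For context on that last recommendation: Global Privacy Control is a browser signal sent as a Sec-GPC request header and exposed to scripts as navigator.globalPrivacyControl. Below is a minimal, hedged sketch of how a site might detect the signal in the browser; how a company then maps the signal onto its DNS processing is up to its own systems, and is not something our research prescribes.

```typescript
// Minimal sketch: detect the Global Privacy Control signal in the browser.
// navigator.globalPrivacyControl is defined by the GPC proposal; browsers that
// do not implement it leave the property undefined, so treat it as optional.
const gpcEnabled: boolean =
  (navigator as Navigator & { globalPrivacyControl?: boolean })
    .globalPrivacyControl === true;

if (gpcEnabled) {
  // Treat the signal as a Do Not Sell request for this visitor; what that means
  // downstream (e.g., suppressing data sales) is left to the site's own systems.
  console.log("GPC signal detected: honoring Do Not Sell preference.");
}
```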