Public Access to Smart Data

Last month, the Supreme Court of California may have decided the future of the public’s access to “smart city” data without knowing it. In ACLU v. Los Angeles Police Department, the court accepted that raw data collected by the Los Angeles police and sheriff’s departments using automated license plate readers (ALPRs) constituted a public record subject to disclosure under California’s Public Records Act (CPRA) absent an exemption. The court held that the CPRA’s catch-all exemption applied. That exemption requires balancing the public interest in preventing disclosure, where specific harms can be identified, against the public interest served by disclosure, such as furthering the public’s understanding of the privacy risks of the ALPR program.

Finding a significant risk to privacy in the disclosure of the raw data -- data that would show the license plate number, date, time, and location of each license plate recorded -- the court had no difficulty applying the exemption on the record before it. Nonetheless, the court remanded for consideration of “the feasibility of, and interests implicated by, methods of anonymization [and] … to explore other methods of anonymization and redaction as well.” The agencies’ objection that the ALPR systems were not designed to facilitate CPRA disclosure as a native function was no excuse to avoid anonymized or redacted disclosure.
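To make the remand concrete, here is a minimal sketch of the kind of generalization and redaction a court might have in mind. The record layout, field names, and thresholds are my illustrative assumptions, not anything prescribed by the court or used by the agencies; note too that a salted hash of a plate is only a weak pseudonym, since the space of valid plates is small enough to brute-force.

```python
import hashlib

# Hypothetical ALPR record: (plate, ISO timestamp, latitude, longitude).
def anonymize(record, salt):
    plate, ts, lat, lon = record
    return (
        # Replace the plate with a truncated salted hash so repeat
        # sightings can still be linked without exposing the raw plate.
        hashlib.sha256((salt + plate).encode()).hexdigest()[:12],
        ts[:13] + ":00",   # coarsen the timestamp to the hour
        round(lat, 2),     # drop precision beyond two decimal places,
        round(lon, 2),     # roughly a 1 km grid at mid-latitudes
    )

record = ("7ABC123", "2017-09-05T14:37:22", 34.052235, -118.243683)
print(anonymize(record, salt="agency-secret"))
```

Even this simple pass illustrates the trade-off the court asked the agencies to explore: the coarser the output, the weaker the re-identification risk, but also the less useful the data is to the requester.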

So what does all that mean for smart cities in the U.S.? Every state has some form of public records disclosure law recognizing that data collected and maintained by government agencies is a public record. The National Freedom of Information Coalition maintains a list of the laws here. If cities (or any state governmental entity, for that matter) are going to collect data, the public generally is going to have a right of access to it. If there is a privacy risk associated with disclosure of that data, then the agency initially, and ultimately a court, will have to weigh whether the public’s interest in seeing the data is outweighed by the privacy risk.

The privacy risk in disclosing ALPR raw data is obvious. As the court said:

ALPR data showing where a person was at a certain time could potentially reveal where that person lives, works, or frequently visits. ALPR data could also be used to identify people whom the police frequently encounter, such as witnesses or suspects under investigation.... In short, as the trial court observed, “Members of the public would be justifiably concerned about LAPD or LASD releasing information regarding the specific locations of their vehicles on specific dates and times to anyone.”

The privacy risks in other sensor cases may be much less clear. For example, I previously wrote about the privacy revelations extracted from raw taxi cab pick-up and drop-off data in New York City. The analysis of the supposedly anonymized data produced chilling details about taxi customers’ private lives, including uniquely identifying riders’ homes through the latitude and longitude data disclosed.
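The taxi re-identification works because a coordinate that recurs in one rider’s trips is almost always a home or workplace. A minimal sketch of the idea, using made-up trip records (the pseudonym and coordinates are invented for illustration):

```python
from collections import Counter

# Hypothetical "anonymized" trips for one rider:
# (pseudonym, drop-off latitude, drop-off longitude).
trips = [
    ("rider_42", 40.7484, -73.9857),  # one-off destinations
    ("rider_42", 40.7128, -74.0060),
    ("rider_42", 40.7306, -73.9866),  # repeated drop-off point --
    ("rider_42", 40.7306, -73.9866),  # very likely the rider's home
    ("rider_42", 40.7306, -73.9866),
]

# The most frequent drop-off coordinate betrays the rider's home
# address, even though the plate/medallion was pseudonymized.
drop_offs = Counter((lat, lon) for _, lat, lon in trips)
likely_home, count = drop_offs.most_common(1)[0]
print(likely_home, count)
```

The lesson for agencies is that pseudonymizing the identifier alone does nothing when precise location and time fields remain in the release.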

Cities are deploying sensors to collect a myriad of data, including ambient sound recordings and images. Cities are also providing Wi-Fi and kiosk-based Internet access. What of all that data? And what about the mosaic of data collected individually by different agencies that may be combined by analysts and researchers? Will agencies and courts be well enough equipped to balance less obvious privacy interests against the public’s interest in disclosure?

California has mandated that any person or entity in California that employs an ALPR must publish a privacy policy explaining how the data is collected, used, and shared. The Electronic Frontier Foundation maintains a site that lists the various collecting agencies’ policies. One of the parties to the above case, the Los Angeles Sheriff’s Department, has published its privacy policy here. And it is a creditable effort, limiting dissemination of the data only “for legitimate law enforcement, criminal justice, or public safety purposes,” prohibiting commercial use, and requiring records to be maintained of access to the data.

There are no similar state or federal laws or policies for smart-city data collection. Instead, cities are rushing to become the “smartest city” without putting sound privacy practices in place before data is collected. The ACLU v. LAPD case should be a wake-up call to would-be smart cities that the public will want to know what data they collect and how they are using and disclosing it. And those cities will be responsible for designing anonymization tools for exporting data if they don’t build such capabilities in at the front end. Cities don’t need a law to be transparent about their data collection practices. They should do it as a matter of sound policy.

Lastly, the ACLU v. LAPD case is interesting for another reason. The ACLU did not challenge the lawfulness of the collection of license plate data by the police. No doubt, this was a practical litigation decision in light of the wide acceptance of the notion that anything that may be perceived in a public space may be collected, retained, distributed, and searched by anyone, including the government. Even Justice Brandeis historically accepted that privacy invasions involve intrusions into seclusion, which cannot occur in public. But with social media’s broad reach today, it may be time to question whether individuals really assume the risk of, and implicitly consent to, the collection of data about them when in public places.

While we may be a long way away from accepting a tort of public disclosure of facts obtained in public places, smart city data collection, use and disclosure ought to raise serious policy questions about an individual’s interest in maintaining some level of obscurity in public spaces and control over how information about them is used. This is especially true given that information, once collected, may be combined with other data to reveal private information about a person and his or her habits and activities -- information that will persist forever, be retrievable and searchable, and be distributed globally.

 
