Jonathan Mayer is a Ph.D. candidate in computer science at Stanford University, where he received his J.D. in 2013. He was named one of the Forbes 30 Under 30 in 2014 for his work on technology security and privacy. Jonathan's research and commentary frequently appear in national publications, and he has contributed to federal and state law enforcement actions.
Jonathan is a Cybersecurity Fellow at the Center for International Security and Cooperation, a Junior Affiliate Scholar at the Center for Internet and Society, and a Stanford Interdisciplinary Graduate Fellow. He earned his A.B. at Princeton University in 2009, concentrating in the Woodrow Wilson School of Public and International Affairs. Jonathan has consulted for both federal and state law enforcement agencies, and his research on consumer privacy has contributed to multiple regulatory interventions. A proud Chicago native, Jonathan is undaunted by freezing weather and enjoys celery salt on a hot dog.
Last week we reported some early results from the Stanford Security Lab's new web measurement platform on how advertising networks respond to opt outs and Do Not Track. This week we're back with a new discovery in the online advertising ecosystem: Epic Marketplace, a member of the self-regulatory Network Advertising Initiative (NAI), is history stealing.
Many thanks once again to research assistants Akshay Jagadeesh and Jovanni Hernandez.
Over the past several months researchers at the Stanford Security Lab have been developing a platform for measuring dynamic web content. One of our chief applications is a system for automated enforcement of Do Not Track by detecting the myriad forms of third-party tracking, including cookies, HTML5 storage, fingerprinting, and much more. While the software isn't quite polished enough for public release, we're eager to share some unexpected early results on the advertising ecosystem. Please bear in mind that these are preliminary findings from experimental software; our primary aims at this stage are developing the platform and validating the approach to third-party tracking detection. Many thanks to Jovanni Hernandez and Akshay Jagadeesh for their invaluable research assistance.
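For readers curious what "detecting third-party tracking" can look like in practice, here is a minimal sketch of one common heuristic such a platform might apply, not the Lab's actual code: flag cookies whose setting domain differs from the page's first-party domain. All domain names below are illustrative.

```python
def registrable_domain(host: str) -> str:
    # Crude approximation using the last two labels; a real detector would
    # consult the Public Suffix List to handle suffixes like .co.uk.
    return ".".join(host.split(".")[-2:])

def third_party_cookies(page_host, cookies):
    # Keep only cookies set from a different registrable domain than the page.
    first_party = registrable_domain(page_host)
    return [c for c in cookies if registrable_domain(c["domain"]) != first_party]

observed = [
    {"name": "session", "domain": "www.news-site.com"},   # first party
    {"name": "uid", "domain": "tracker.adnetwork.com"},   # third party
]
trackers = third_party_cookies("www.news-site.com", observed)
print([c["name"] for c in trackers])  # → ['uid']
```

A production system would add analogous checks for HTML5 storage, fingerprinting scripts, and the other channels mentioned above; the cookie case is simply the easiest to illustrate.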
Late last week FTC Commissioner Rosch penned a column in which he repeated a number of hackneyed criticisms of Do Not Track. Senators McCaskill and Pryor articulated similar concerns at a recent hearing. This piece sequentially deconstructs Rosch's column and replies to each of his substantive critiques.
Do Not Track is on its way to becoming an Internet standard. In collaboration with Sid Stamm at Mozilla we've submitted an Internet-Draft to the IETF, specifying both the HTTP header syntax and the requirements for compliance.
This is just the beginning of the IETF's process and the evolution of the draft. But it's a transformative moment for web privacy: Do Not Track is now a formal standards proposal. Every browser, advertising network, analytics service, and social plug-in provider has a clear instruction manual on how to implement Do Not Track.
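For concreteness, the header the Internet-Draft specifies is a single HTTP request header, `DNT: 1`, expressing the user's opt-out preference. A minimal sketch of attaching it with Python's standard library (the URL is a placeholder):

```python
from urllib.request import Request

# Attach the Do Not Track header; "1" expresses an opt-out preference.
req = Request("https://example.org/", headers={"DNT": "1"})

# urllib capitalizes stored header names, so "DNT" is kept as "Dnt" internally.
print(req.get_header("Dnt"))  # → 1
```

Compliance requirements on the server side, of course, occupy the bulk of the draft; the syntax itself is deliberately this simple.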
We owe a tremendous debt of gratitude to the colleagues and friends whose efforts have made Do Not Track a reality: Alissa Cooper, Peter Eckersley, Alex Fowler, John Mitchell, Ashkan Soltani, Lee Tien, and Harlan Yu. And we particularly thank Chris Soghoian, Do Not Track's unflagging champion for nearly two years.
Last Friday we submitted a comment to the FTC articulating our vision for Do Not Track. We expanded on a number of views already expressed on this blog: Do Not Track is about much more than behavioral advertising, an HTTP header is the right implementation, and Do Not Track is no threat to ad-supported businesses. Here are the new highlights. (For a fuller exposition of each, please see our comment.)
By Jonathan Mayer and Edward W. Felten
Special to The Bee
By Edward Felten and Jonathan Mayer
Snooping on the Internet is tricky. The network is diffuse, global, and packed with potential targets. There’s no central system for identifying or locating individuals, so it’s hard to keep track of who is online and what they’re up to. What’s a spy agency to do?
Privacy Substitutes by Jonathan Mayer & Arvind Narayanan
Jonathan Mayer, 28, will complete a dual J.D./computer science Ph.D. at Stanford in 2015 and believes the combination of academics and experience – which included helping Mozilla build a new privacy feature into its browser – has opened up opportunities across the technology industry. “There’s no doubt the demand is sky-high. I’ve had a number of unsolicited inquiries about potential jobs,” Mayer says.
"The State Department definitely has some work to do," said Jonathan Mayer, a computer scientist and lawyer affiliated with Stanford Law School's Center for Internet and Society.
Stanford computer researcher Jonathan Mayer pushed back at the notion that Clinton used a “homebrew” email system, and said it clearly had “nontrivial security protections in place.”
“This kind of behavioral and demographic targeting has been going on [in business] since the early 2000s,” said Jonathan Mayer, a lawyer and computer scientist who specializes in privacy and security issues as a graduate fellow at Stanford University.
Jonathan Mayer, a computer scientist at Stanford University, said historical records provided some evidence that the server could have been located in the Clinton home near Chappaqua, New York. Later, either the server was physically moved or the data was rerouted.
Mayer said it was impossible to tell from tests on the historical server whether it was well secured against hacker attack – a critical question given the sensitivity of Clinton’s role and the aggressiveness of the cyber threat from countries such as China.
Presented by: Catholic University Columbus School of Law’s Journal of Law & Technology
2016 Journal of Law & Technology Symposium
Cybersecurity and Privacy in the Internet Economy: Information Sharing, Data Security, and Intellectual Property
March 17, 2016
2:00 p.m. - 5:30 p.m.
Because of Edward Snowden’s remarkable public service, we know that the National Security Agency, with the cooperation of some large firms, has amassed an unprecedented database of personal information. The ostensible goal in collecting that information is to protect national security. The effect, according to Reed Hundt, is to undermine democracy.
This talk presents an empirical assessment of the NSA's legal restrictions, including research cited by President Obama's intelligence review group. We find that present limits on bulk surveillance programs fall far short; authorities to intercept international Internet traffic and domestic telephone metadata place ordinary Americans at risk.
Solutions to many pressing economic and societal challenges lie in better understanding data. New tools for analyzing disparate information sets, collectively called Big Data, have revolutionized our ability to find signals amongst the noise. Big Data techniques hold promise for breakthroughs ranging from better health care and a cleaner environment to safer cities and more effective marketing. Yet privacy advocates are concerned that the same advances will upend the power relationships between government, business, and individuals, leading to prosecutorial abuse, racial and other profiling, discrimination, redlining, overcriminalization, and other restrictions on freedom.
Have you ever borrowed a smartphone without asking? Modified a URL? Scraped a website? Called an undocumented API? Congratulations: you might have violated federal law! A 1986 statute, the Computer Fraud and Abuse Act (CFAA), provides both civil and criminal remedies for mere "unauthorized" access to a computer.
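To make the point concrete, here is a sketch of just how trivial a URL "modification" can be: bumping one numeric identifier in a query string. The address and parameter name are hypothetical, but conduct of this kind has featured in real CFAA cases under a broad reading of "unauthorized" access.

```python
from urllib.parse import parse_qs, urlencode, urlparse, urlunparse

def increment_account_id(url: str) -> str:
    # Increase a numeric query parameter by one: a one-line change to a URL.
    # The parameter name account_id is hypothetical.
    parts = urlparse(url)
    query = parse_qs(parts.query)
    query["account_id"] = [str(int(query["account_id"][0]) + 1)]
    return urlunparse(parts._replace(query=urlencode(query, doseq=True)))

print(increment_account_id("https://example.com/profile?account_id=41"))
# → https://example.com/profile?account_id=42
```

Whether typing that modified URL into a browser is "unauthorized" access is exactly the kind of question the statute leaves dangerously open.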
In this first episode, Mike and I explore how your simplest digital footprints – fragments of Google searches, Facebook likes, and innocuous tweets – can expose deeply intimate facts about you. Like whether your parents are divorced and whether you own a gun. In fact, these vanilla datasets that we all generate every time we use the Internet reveal surprising clues about our personalities and behavior. So how can that information be used, and who is collecting it? We talk to Michal Kosinski of Stanford’s Graduate School of Business, and Jonathan Mayer, a computer scientist and lawyer.
As consumers increasingly adopt encryption tools, government officials have warned of the “Going Dark” problem – the notion that widespread encryption will thwart legitimate government efforts to investigate crime and safeguard national security. To address this problem, law enforcement and intelligence community officials have suggested that companies include “backdoors” in their products to permit lawful government access to encrypted data. This proposal has been met with criticism from technologists and privacy advocates alike.
WELNA: It could indeed. Hackers, by definition, are trying to break into other people's computer accounts and steal their information, so monitoring their activity means watching them poach on other people's Internet usage and private data. I talked with Jonathan Mayer, a computer security fellow at Stanford who's reviewed these latest Snowden documents. He says because of the way the surveillance law is written, the NSA can actually hang on to that hacked information.
CIS Affiliate Scholar David Levine interviews Jonathan Mayer, Stanford Ph.D. candidate in computer science, author of Terms of Abuse: An Empirical Assessment of the Federal Hacking Law, and How to Fix It.
Listen to the full piece at Marketplace.org.
Now Neustar might lose the contract to Ericsson, which is based in Sweden. Neustar says this would be bad for national security, said Jonathan Mayer, a fellow at Stanford's Center for International Security and Cooperation.
“It certainly is a legitimate concern that the company that routes calls is in position to know a fair amount about law enforcement and intelligence investigations,” Mayer said.