Jonathan Mayer is a Ph.D. candidate in computer science at Stanford University, where he received his J.D. in 2013. He was named one of the Forbes 30 Under 30 in 2014 for his work on technology security and privacy. Jonathan's research and commentary frequently appear in national publications, and he has contributed to federal and state law enforcement actions.
Jonathan is a Cybersecurity Fellow at the Center for International Security and Cooperation, a Junior Affiliate Scholar at the Center for Internet and Society, and a Stanford Interdisciplinary Graduate Fellow. He earned his A.B. at Princeton University in 2009, concentrating in the Woodrow Wilson School of Public and International Affairs. Jonathan has consulted for both federal and state law enforcement agencies, and his research on consumer privacy has contributed to multiple regulatory interventions. A proud Chicago native, Jonathan is undaunted by freezing weather and enjoys celery salt on a hot dog.
Last week we reported some early results from the Stanford Security Lab's new web measurement platform on how advertising networks respond to opt outs and Do Not Track. This week we're back with a new discovery in the online advertising ecosystem: Epic Marketplace, a member of the self-regulatory Network Advertising Initiative (NAI), is engaged in history stealing.
Many thanks once again to research assistants Akshay Jagadeesh and Jovanni Hernandez.
Over the past several months researchers at the Stanford Security Lab have been developing a platform for measuring dynamic web content. One of our chief applications is a system for automated enforcement of Do Not Track by detecting the myriad forms of third-party tracking, including cookies, HTML5 storage, fingerprinting, and much more. While the software isn't quite polished enough for public release, we're eager to share some unexpected early results on the advertising ecosystem. Please bear in mind that these are preliminary findings from experimental software; our primary aims at this stage are developing the platform and validating the approach to third-party tracking detection. Many thanks to Jovanni Hernandez and Akshay Jagadeesh for their invaluable research assistance.
Late last week FTC Commissioner Rosch penned a column in which he repeated a number of hackneyed criticisms of Do Not Track. Senators McCaskill and Pryor articulated similar concerns at a recent hearing. This piece sequentially deconstructs Rosch's column and replies to each of his substantive critiques.
Do Not Track is on its way to becoming an Internet standard. In collaboration with Sid Stamm at Mozilla we've submitted an Internet-Draft to the IETF, specifying both the HTTP header syntax and the requirements for compliance.
This is just the beginning of the IETF's process and the evolution of the draft. But it's a transformative moment for web privacy: Do Not Track is now a formal standards proposal. Every browser, advertising network, analytics service, and social plug-in provider has a clear instruction manual on how to implement Do Not Track.
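The mechanism the Internet-Draft standardizes is deliberately simple: the browser attaches one extra HTTP header to each request. As a minimal sketch (the URL is a placeholder, and the example uses only the Python standard library, not any browser implementation), here is what sending that signal looks like:

```python
import urllib.request

# Minimal sketch of the Do Not Track signal: the client attaches a
# "DNT" header, where the value "1" indicates the user opts out of
# third-party tracking. The URL below is a placeholder; any HTTP
# request would carry the header the same way.
req = urllib.request.Request("http://example.com/")
req.add_header("DNT", "1")

# urllib capitalizes header names internally, so the stored key is "Dnt".
assert req.get_header("Dnt") == "1"
```

A compliant third party would read this header from incoming requests and disable tracking for that user; the draft's compliance requirements spell out what "disable tracking" must mean.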
We owe a tremendous debt of gratitude to the colleagues and friends whose efforts have made Do Not Track a reality: Alissa Cooper, Peter Eckersley, Alex Fowler, John Mitchell, Ashkan Soltani, Lee Tien, and Harlan Yu. And we particularly thank Chris Soghoian, Do Not Track's unflagging champion for nearly two years.
Last Friday we submitted a comment to the FTC articulating our vision for Do Not Track. We expanded on a number of views already expressed on this blog: Do Not Track is about much more than behavioral advertising, an HTTP header is the right implementation, and Do Not Track is no threat to ad-supported businesses. Here are the new highlights. (For a fuller exposition of each, please see our comment.)
By Jonathan Mayer and Edward W. Felten
Special to The Bee
By Edward Felten and Jonathan Mayer
Snooping on the Internet is tricky. The network is diffuse, global, and packed with potential targets. There’s no central system for identifying or locating individuals, so it’s hard to keep track of who is online and what they’re up to. What’s a spy agency to do?
Privacy Substitutes by Jonathan Mayer & Arvind Narayanan
Jonathan Mayer, a graduate fellow at Stanford University, launched a 2011 investigation that discovered how OK Cupid appeared to sell many categories of information about its users to two data management platforms — companies that aggregate, consolidate, and sell user data to target online ads. The information OK Cupid "leaked" (in Mayer's terminology) included age and income, along with drug use and drinking frequency, and preferences for cats or dogs.
When asked about Mayer's study, Rudder told VICE News that "this is public stuff anyone could glean."
Jonathan Mayer, a Ph.D. candidate in computer science and a law lecturer at Stanford University, said that Mrs Clinton may have told her Internet service provider that she was starting a small business to allow her to set up the server.
Right now, we have "limited technical evidence" about Clinton's email system, said Jonathan Mayer, a Ph.D. candidate in computer science at Stanford University and a cybersecurity fellow at the university's Center for International Security and Cooperation.
Mayer gained some information about the current state of the Clinton domain by doing a little bit of digging in the domain name system. (Though he didn't go too far, he said, for "obvious reasons.")
"It certainly wasn't a boneheaded setup," he told Mashable.
“The padlock is a means of telling you that who you are talking to is who you think you are talking to. Superfish made that mechanism ineffective,” said Jonathan Mayer, a lawyer and computer science graduate student at Stanford University who specializes in digital privacy.
Jonathan Mayer, a computer scientist and lawyer at Stanford who has studied the security practices of education technology startups, says he’s been horrified by what he’s found—including programs that didn’t use the secure https protocol or that don’t hide passwords as users enter them. “Very straightforward technical problems, stuff that should be licked by businesses that have even a modest degree of sophistication, those are the mistakes that are being made right and left,” he says. “In 2015, this is almost tech malpractice.”
Presented by: Catholic University Columbus School of Law’s Journal of Law & Technology
2016 Journal of Law & Technology Symposium
Cybersecurity and Privacy in the Internet Economy: Information Sharing, Data Security, and Intellectual Property
March 17, 2016
2:00 p.m. - 5:30 p.m.
Because of Edward Snowden’s remarkable public service, we know that the National Security Agency, with the cooperation of some large firms, has amassed an unprecedented database of personal information. The ostensible goal in collecting that information is to protect national security. The effect, according to Reed Hundt, is to undermine democracy.
This talk presents an empirical assessment of the NSA’s legal restrictions, including research cited by President Obama’s intelligence review group. We find that present limits on bulk surveillance programs fall far short; authorities to intercept international Internet traffic and domestic telephone metadata place ordinary Americans at risk.
Solutions to many pressing economic and societal challenges lie in better understanding data. New tools for analyzing disparate information sets, called Big Data, have revolutionized our ability to find signals amongst the noise. Big Data techniques hold promise for breakthroughs ranging from better health care and a cleaner environment to safer cities and more effective marketing. Yet privacy advocates are concerned that the same advances will upend the power relationships between government, business, and individuals, and lead to prosecutorial abuse, racial and other profiling, discrimination, redlining, overcriminalization, and other restrictions on freedom.
Have you ever borrowed a smartphone without asking? Modified a URL? Scraped a website? Called an undocumented API? Congratulations: you might have violated federal law! A 1986 statute, the Computer Fraud and Abuse Act (CFAA), provides both civil and criminal remedies for mere "unauthorized" access to a computer.
In this first episode, Mike and I explore how your simplest digital footprints – fragments of Google searches, Facebook likes, and innocuous tweets – can expose deeply intimate facts about you. Like whether your parents are divorced and whether you own a gun. In fact, these vanilla datasets that we all generate every time we use the Internet reveal surprising clues about our personalities and behavior. So how can that information be used, and who is collecting it? We talk to Michal Kosinski of Stanford’s Graduate School of Business, and Jonathan Mayer, a computer scientist and lawyer.
As consumers increasingly adopt encryption tools, government officials have warned of the “Going Dark” problem – the notion that widespread encryption will thwart legitimate government efforts to investigate crime and safeguard national security. To address this problem, law enforcement and intelligence community officials have suggested that companies include “backdoors” in their products to permit lawful government access to encrypted data. This proposal has been met with criticism from technologists and privacy advocates alike.
WELNA: It could indeed. Hackers, by definition, are trying to break into other people's computer accounts and steal their information, so monitoring their activity means watching them poach on other people's Internet usage and private data. I talked with Jonathan Mayer, a computer security fellow at Stanford who's reviewed these latest Snowden documents. He says because of the way the surveillance law is written, the NSA can actually hang on to that hacked information.
CIS Affiliate Scholar David Levine interviews Jonathan Mayer, Stanford Ph.D. candidate in computer science, author of Terms of Abuse: An Empirical Assessment of the Federal Hacking Law, and How to Fix It.
Listen to the full piece at Marketplace.org.
Now Neustar might lose the contract to Ericsson, which is based in Sweden. Neustar says this would be bad for national security, said Jonathan Mayer, a fellow at Stanford's Center for International Security and Cooperation.
“It certainly is a legitimate concern that the company that routes calls is in position to know a fair amount about law enforcement and intelligence investigations,” Mayer said.