Jonathan Mayer is a Ph.D. candidate in computer science at Stanford University, where he received his J.D. in 2013. He was named one of the Forbes 30 Under 30 in 2014 for his work on technology security and privacy. Jonathan's research and commentary frequently appear in national publications, and he has contributed to federal and state law enforcement actions.
Jonathan is a Cybersecurity Fellow at the Center for International Security and Cooperation, a Junior Affiliate Scholar at the Center for Internet and Society, and a Stanford Interdisciplinary Graduate Fellow. He earned his A.B. at Princeton University in 2009, concentrating in the Woodrow Wilson School of Public and International Affairs. Jonathan has consulted for both federal and state law enforcement agencies, and his research on consumer privacy has contributed to multiple regulatory interventions. A proud Chicago native, Jonathan is undaunted by freezing weather and enjoys celery salt on a hot dog.
We're pleased to announce we're beginning work on an IETF Internet-Draft for the Do Not Track header. We look forward to incorporating broad feedback.
In anticipation of the first version of the Internet-Draft, we're making a few minor updates to the header. The reference implementations at DoNotTrack.Us will be revised shortly.
"If you remove tracking, you remove advertisers." "Stop [data] sharing and you put a stop to the Internet as we know it." "Thousands of small websites may disappear." "Would you like to pay $20 a month for Facebook?" A spate of such recent commentaries has speculated that Do Not Track could hobble advertising-supported businesses. Here's why it won't.
Since our introduction of DoNotTrack.Us last week we've received a deluge of questions. This post answers some of the most common inquiries. If we haven't covered an issue you'd like a response on, shoot us an email and stay tuned: more Q&A posts are in the pipeline.
Q: Do Not Track does not block third-party tracking. Wouldn't blocking be a better solution?
Some privacy-conscious users block third-party tracking, most commonly through browser add-ons. This type of self-help is completely compatible with and complementary to Do Not Track; many Do Not Track users may elect to use blocking software. But blocking alone is not a complete solution to web tracking. Here are our chief concerns:
- Universal blocking is infeasible. Web security research (1, 2, 3) has uncovered dozens of means of tracking users; erecting technical barriers against all of these approaches is not practical. And a recent informal study of popular Firefox blocking add-ons suggests that blocking is, in practice, far from a universal opt-out. Users should not be left guessing as to whether they've actually opted out of tracking.
- Blocking software requires perpetual development and user vigilance. There is frequent turnover of tracking services and tracking technologies. If a developer stops maintaining a blocking tool, its effectiveness will diminish. Users must, consequently, periodically ensure their blocking software is still maintained and up to date.
- Blocking inhibits third-party tools. A number of popular website tools and plug-ins are hosted by a third party that also tracks users. Blocking would disable these tools, while Do Not Track accommodates them.
The web privacy debate is stuck. Privacy proponents decry the diffusion of behavioral advertising and tracking services (1, 2, 3); industry coalitions respond by expounding the merits of personalized content and advertising revenue (1, 2). But for the average user, the arguments are academic: there is no viable technology for opting out of web tracking. A registry of tracking services, as privacy advocates proposed years ago, is cumbersome and unmanageable. Fiddling with cookies, as many advertising networks and anti-regulation advocates recommend, is an incomplete and temporary fix; both Google and the NAI (an advertising industry association) have already moved away from opt-out cookies.
Do Not Track ends this standoff. It provides a web tracking opt-out that is user-friendly, effective, and completely interoperable with the existing web. The technology is simple: whenever your web browser makes a request, it includes an opt-out preference. It's then up to advertisers and tracking services to honor that preference, whether voluntarily, through industry self-regulation, or by law.
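The mechanism above can be sketched in a few lines. Below is a minimal, illustrative server-side handler that checks the opt-out header before setting a tracking cookie. The header name ("DNT", sent with the value "1" when the user opts out) reflects the later standardized form of the header; the cookie name and response shape are assumptions for illustration, not part of any specification.

```python
def handle_request(headers: dict) -> dict:
    """Build a response, skipping tracking when the user has opted out."""
    # Browsers with Do Not Track enabled include "DNT: 1" on every request.
    opted_out = headers.get("DNT") == "1"

    response = {"body": "<html>...</html>"}
    if not opted_out:
        # Set a tracking cookie only for users who have not opted out.
        # (Cookie name and value are hypothetical.)
        response["Set-Cookie"] = "tracking_id=abc123"
    return response
```

The key design point is that the burden shifts to the server: the browser expresses the preference once per request, and compliance is a one-line check rather than an arms race of blocking technologies.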
Arvind Narayanan and I have been researching Do Not Track for several months, and are pleased to now introduce DoNotTrack.Us, a compilation of what we've learned. The resource explains Do Not Track, provides prototype implementations, and answers some common questions. We'll be updating it in the coming months with new findings and responses to feedback.
Excited as we are about the Do Not Track technology, it is but a first step. Important substantive policy questions remain open: What tracking should be impermissible? When a user visits a site, what constitutes a third party? We look forward to collaborating with advertising networks, NGOs, regulators, lawmakers, and other stakeholders in answering these crucial questions.
Late last year the Obama administration reopened talks with Russia over the militarization of cyberspace and assented to cybersecurity discussion in the United Nations First Committee (Disarmament and International Security). My intention in this three-part series is to probe Russian and American foreign policy on cyberwarfare and advance the thesis that the Russians are negotiating for specific strategic or diplomatic gains, while the Americans are primarily procedurally invested owing to the "reset" in Russian relations and changing perceptions of cyberwarfare.
Cross-posted from Freedom to Tinker.
By Jonathan Mayer and Edward W. Felten
Special to The Bee
By Edward Felten and Jonathan Mayer
Snooping on the Internet is tricky. The network is diffuse, global, and packed with potential targets. There’s no central system for identifying or locating individuals, so it’s hard to keep track of who is online and what they’re up to. What’s a spy agency to do?
Privacy Substitutes by Jonathan Mayer & Arvind Narayanan
Arguing that a defendant's conviction for website hacking should be overturned because legitimate, highly valuable security and privacy research commonly employs techniques that are essentially identical to what the defendant did, and because such independent research is of great value to academics, government regulators, and the public, even when (often especially when) conducted without a website owner's permission.
Unfortunately, consumers don't have a lot of options for evaluating the security of dating services, according to Jonathan Mayer, a computer scientist and lawyer affiliated with Stanford's Center for Internet and Society. And the explosion of services in the market means that start-ups may not be putting users' privacy first.
"Young apps often don't prioritize security and privacy," he said. "Growth is everything in the start-up space -- and that can come at users' expense."
"With this acquisition, Verizon appears to be tearing down the wall between telecommunications and personalized advertising," said Jonathan Mayer, a computer researcher at Stanford University. "The FCC might have something to say about that."
Mayer argued that telecom companies like Verizon are "in a privileged and trusted position" because they have such sweeping access to sensitive information flowing over their networks, and it's often difficult for consumers to switch to competitors.
In an interview with The Daily, Jonathan Mayer J.D. ’13, a Cybersecurity Fellow at Stanford’s Center for International Security and Cooperation (CISAC) and lecturer at the Law School, talked about the impacts of the two NSA surveillance programs on the general public.
The Stanford Daily (TSD): From a legal perspective, what does the Second Circuit’s decision mean?
“This breach related to the lowest-level unclassified email,” said Jonathan Mayer, a graduate fellow in computer science at Stanford University. “But that said, that can still include very sensitive information, maybe not state secrets, but information that would be of great interest to a foreign nation. It’s unfortunate.”
Other experts pointed out that we don’t know for sure what happens to these exchanges when they are collected. Jonathan Mayer, a cybersecurity fellow at Stanford University’s Center for Internet and Society, said many important features of Upstream collection "haven’t leaked and remain classified," such as whether it involves bulk collection, massive-but-targeted collection, or only individually targeted collection.
Presented by: Catholic University Columbus School of Law’s Journal of Law & Technology
2016 Journal of Law & Technology Symposium
Cybersecurity and Privacy in the Internet Economy: Information Sharing, Data Security, and Intellectual Property
March 17, 2016
2:00 p.m. - 5:30 p.m.
Because of Edward Snowden’s remarkable public service, we know that the National Security Agency, with the cooperation of some large firms, has amassed an unprecedented database of personal information. The ostensible goal in collecting that information is to protect national security. The effect, according to Reed Hundt, is to undermine democracy.
This talk presents an empirical assessment of the NSA’s legal restrictions, including research cited by President Obama’s intelligence review group. We find that present limits on bulk surveillance programs fall far short; authorities to intercept international Internet traffic and domestic telephone metadata place ordinary Americans at risk.
Solutions to many pressing economic and societal challenges lie in better understanding data. New tools for analyzing disparate information sets, called Big Data, have revolutionized our ability to find signals amongst the noise. Big Data techniques hold promise for breakthroughs ranging from better health care and a cleaner environment to safer cities and more effective marketing. Yet privacy advocates are concerned that the same advances will upend the power relationships between government, business, and individuals, and lead to prosecutorial abuse, racial or other profiling, discrimination, redlining, overcriminalization, and other restrictions on freedom.
Have you ever borrowed a smartphone without asking? Modified a URL? Scraped a website? Called an undocumented API? Congratulations: you might have violated federal law! A 1986 statute, the Computer Fraud and Abuse Act (CFAA), provides both civil and criminal remedies for mere "unauthorized" access to a computer.
In this first episode, Mike and I explore how your simplest digital footprints – fragments of Google searches, Facebook likes, and innocuous tweets – can expose deeply intimate facts about you. Like whether your parents are divorced and whether you own a gun. In fact, these vanilla datasets that we all generate every time we use the Internet reveal surprising clues about our personalities and behavior. So how can that information be used, and who is collecting it? We talk to Michal Kosinski of Stanford’s Graduate School of Business, and Jonathan Mayer, a computer scientist and lawyer.
As consumers increasingly adopt encryption tools, government officials have warned of the “Going Dark” problem – the notion that widespread encryption will thwart legitimate government efforts to investigate crime and safeguard national security. To address this problem, law enforcement and intelligence community officials have suggested that companies include “backdoors” in their products to permit lawful government access to encrypted data. This proposal has been met with criticism from technologists and privacy advocates alike.
WELNA: It could indeed. Hackers, by definition, are trying to break into other people's computer accounts and steal their information, so monitoring their activity means watching them poach on other people's Internet usage and private data. I talked with Jonathan Mayer, a computer security fellow at Stanford who's reviewed these latest Snowden documents. He says because of the way the surveillance law is written, the NSA can actually hang on to that hacked information.
CIS Affiliate Scholar David Levine interviews Jonathan Mayer, Stanford Ph.D. candidate in computer science, author of Terms of Abuse: An Empirical Assessment of the Federal Hacking Law, and How to Fix It.
Listen to the full piece at Marketplace.org.
Now Neustar might lose the contract to Ericsson, which is based in Sweden. Neustar says this would be bad for national security, said Jonathan Mayer, a fellow at Stanford's Center for International Security and Cooperation.
“It certainly is a legitimate concern that the company that routes calls is in position to know a fair amount about law enforcement and intelligence investigations,” Mayer said.