Julia Angwin’s blog post today is incorrect. Stanford never promised not to use Google money for privacy research.
Encryption helps human rights workers, activists, journalists, financial institutions, innovative businesses, and governments protect the confidentiality, integrity, and economic value of their activities. However, strong encryption may mean that governments cannot make sense of data they would otherwise be able to lawfully access in a criminal or intelligence investigation.
Arguing that a defendant’s conviction for website hacking should be overturned because legitimate, highly valuable security and privacy research commonly employs techniques essentially identical to those the defendant used, and because such independent research is of great value to academics, government regulators, and the public, often especially when conducted without a website owner’s permission.
Arguing that the court should not compel Apple to create software to enable the unlocking and search of the San Bernardino shooter’s iPhone, because doing so would jeopardize digital and personal security more generally.
After the Estate of James Joyce refused to allow a scholar to quote Joyce in her book, we successfully defended her right under the fair use doctrine to use the quotes she needed to illustrate her scholarship. After we prevailed in the case, the Estate paid $240,000 of our client’s legal fees.
Last week’s big cybersecurity news was that the FBI obtained a court order to force Apple to develop new software that would bypass several iPhone security features so the FBI can attempt to unlock the work phone of one of the San Bernardino shooters. Apple plans to challenge that order. (Full disclosure: I am planning on writing a technologists’ amicus brief on Apple’s side in that challenge.)
On Friday, Congress will vote on a mutated version of security threat sharing legislation that had previously passed through the House and Senate. These earlier versions would have permitted private companies to share with the federal government categories of data related to computer security threat signatures. Companies that did so would also receive legal immunity from liability under the Electronic Communications Privacy Act (ECPA) and other privacy laws.
Here’s the latest in the encryption case we’ve been writing about in which the Justice Department is asking Magistrate Judge James Orenstein to order Apple to unlock a criminal defendant’s passcode-protected iPhone. The government seized and has authority to search the phone pursuant to a search warrant.
Pending before federal magistrate judge James Orenstein is the government’s request for an order obligating Apple, Inc. to unlock an iPhone and thereby assist prosecutors in decrypting data the government has seized and is authorized to search pursuant to a warrant.
Last week, we wrote about an order from a federal magistrate judge in New York that questioned the government’s ability, under an ancient federal law called the All Writs Act, to compel Apple to decrypt a locked device which the government had seized and is authorized to search pursuant to a warrant.
"“It seems like the government lied to Twitter about why it wanted the information,” says Jennifer Granick, Director of Civil Liberties at the Stanford Center for Internet and Society. “It’s not entitled to the information under the statutory authority it cites.”"
The Republican chair of the House Intelligence Committee, Devin Nunes, has just said that Donald Trump’s communications were likely picked up by US intelligence agencies through “incidental collection.” Before Nunes’ statement, I interviewed Jennifer Stisa Granick, the director of civil liberties at Stanford University’s Center for the Internet and Society, about her new book.
Some people writing on intelligence and surveillance note that close working relations such as this can allow intelligence agencies to evade domestic controls. Jennifer Granick, in her new Cambridge University Press book, American Spies: Modern Surveillance, Why You Should Care, and What To Do About It, notes that Five Eyes countries aren’t supposed to spy on one another’s citizens. However, she says that the NSA has prepared policies that would allow it to spy on Five Eyes citizens without permission.
Although it would be “unheard of” for the federal government to prosecute a company for using leaked classified information to improve its products, there “are some issues with the fact that the information is classified,” said Jennifer Granick, director of civil liberties at Stanford Law School’s Center for Internet and Society.
Given uncertainty about the views of the Justice Department, “I can see why legal counsel at big companies might hesitate to reach out to Julian Assange to negotiate access to classified information,” she said.
While WikiLeaks has often been criticized for releasing sensitive data without regard for the consequences, Mr. Assange is acting responsibly this time, said Jennifer Granick, the director of civil liberties at the Stanford Center for Internet and Society. WikiLeaks redacted the actual computer code for C.I.A. exploits from its initial release to avoid spreading such cyberweapons.
“He is trying to do the right thing,” Ms. Granick said.
Stanford CIS brings together scholars, academics, legislators, students, programmers, security researchers, and scientists to study the interaction of new technologies and the law and to examine how the synergy between the two can either promote or harm public goods like free speech, innovation, privacy, public commons, diversity, and scientific inquiry.
Co-hosted and presented by The Tech Museum of Innovation and the San Jose Museum of Art.
For more information and to purchase tickets visit: https://www.eventbrite.com/e/death-of-the-open-internet-a-black-hat-qa-w...
Welcome to Startup Policy Lab’s The Policy Series, hosted by Runway! For our first October session, we go one-on-one with Jennifer Granick, Director of Civil Liberties at Stanford Center for Internet and Society (CIS).
The Lifecycle of a Revolution
Speaker: Jennifer Granick, Stanford University
NSA stands for National Security Agency, but the agency is at odds with itself in its security mission. Undermining global encryption standards, intercepting Internet companies’ data center transmissions, using auto-update to spread malware, and demanding law enforcement back doors in products and services are all business as usual. What legal basis do the NSA and FBI have for these demands, and do they make the country more or less safe?
Jennifer Granick talks about how notions of privacy have changed over the years and where she thinks things are headed in the future. She is a professor at Stanford Law School and Director of Civil Liberties at the Center for Internet and Society, where she specializes in the intersection of engineering, privacy, and the law.
What kind of surveillance assistance can the U.S. government force companies to provide? This issue has entered the public consciousness due to the FBI's demand in February that Apple write software to help it access the San Bernardino shooter's encrypted iPhone. Technical assistance orders can go beyond the usual government requests for user data, requiring a company to actively participate in the government's monitoring of the targeted user(s).
In this week's feature interview we're chatting with Stanford's very own Jennifer Granick about a recent ruling in a Virginia court that appears to give the FBI permission to hack into any computer it wants, sans warrant. Well, that's what the headlines are screaming, anyway. But as you'll hear, it's not quite that black and white.
“What was remarkable was that the public hadn’t seen the argument surfaced,” says Jennifer Granick at the Stanford Center for Internet and Society. She says Judge Orenstein was trying to stoke a public debate. “Judge Orenstein had concerns about whether the government’s legal argument was a valid legal argument.”