Press

CIS in the news.

  • Killer robots are coming: Scientists warn UN needs treaty to maintain human control over all weapons

    Date published: November 15, 2017

    "Scientists who specialize in Artificial Intelligence are warning that technology has advanced to the point where we will soon see lethal weapons that can decide to kill completely free of human control. Killer robots raise the prospect of a new kind of war; one in which it’s possible to selectively target and slaughter entire populations at little cost.

  • The Making of an American Nazi

    Date published: November 14, 2017

    "But this activity is difficult to prosecute when trolls know how to conceal their identity. A lone troll might leave his victim only one voicemail telling her to burn in an oven, which would fail to meet the criteria for cyberstalking. When hundreds of trolls do the same, though, the effect can be terrifying. “It’s like a bee swarm,” says Danielle Citron, a professor at the University of Maryland’s School of Law and a leading expert on cyberharassment. “You have a thousand bee stings. Each sting is painful. But it’s perceived as one awful, throbbing, giant mass.”"

  • How One Woman's Digital Life Was Weaponized Against Her

    Date published: November 14, 2017

    "People are starting to understand “that the web watches them back,” says Aleecia McDonald, a privacy researcher at Stanford’s Center for Internet and Society. But we still don’t appreciate the extent to which it’s happening or what risks we might face in the future. McDonald suggests thinking of the internet as a backward-facing time machine that we are constantly loading with ammunition: “Everything that’s on file about you for the last 15 years and the next 40 years” may someday be used against you with technology that, at this time, we can’t understand or predict. 

  • Texas killings may aid Rosenstein’s crusade on encryption

    Date published: November 11, 2017

    "“It’s basically the gloves coming off,” said Riana Pfefferkorn, a cryptography fellow at Stanford University’s Center for Internet and Society.

    Indeed, Pfefferkorn said she was “somewhat worried about the [government’s] ability to capitalize on that public sentiment.”"

  • As mass data collection becomes the norm, concerns about surveillance are growing

    Date published: November 10, 2017

    "This is why conversations regarding smart city data collection sometimes miss the point. Albert Gidari, Director of Privacy at the Stanford Centre for Internet and Society, believes focusing on personally identifiable information (PII) is myopic – particularly when there is so much valuable data that can be mined from citizens before you’ve asked for their identity directly.

  • Congress’s end run around a pillar of online free speech

    Date published: November 10, 2017

    "Daphne Keller of the Stanford Center for Internet and Society says that the new law could push some platforms and publishers to crack down on a wide variety of speech, to avoid the threat of lawsuits. It would give them “a reason to err on the side of removing internet users’ speech in response to any controversy,” she says, “and in response to false or mistaken allegations, which are often levied against online speech.”"

  • How to Make Cars Cooperate

    Date published: November 9, 2017

    "Without federal help, however, the upfront cost of connected infrastructure can be prohibitive for small towns and cities. Bryant Walker Smith, a law professor at the University of South Carolina and an affiliate scholar at Stanford’s Center for Internet and Society, recently challenged a group of students to come up with ways to secure public funding for vehicle-to-infrastructure technology that would enable more governments to afford it.

  • Another mass shooting, another locked phone

    Date published: November 9, 2017

    "Riana Pfefferkorn, a cryptography fellow at the Stanford Center for Internet and Society, said the phone could contain information not available elsewhere, like data stored and deleted locally, but in this case she doesn’t see the point.

    “I don’t think this is a case where that would be forensically significant,” Pfefferkorn said of accessing the phone."

  • With Amazon Key’s launch, customers and lawyers have lots of questions

    Date published: November 8, 2017

    ""Would it be possible for a person unknowingly to authorize a law enforcement agency or a criminal to access Amazon Key?" Elizabeth Joh, a law professor at the University of California, Davis, e-mailed Ars. "If a criminal gains access and some harm occurs, who is responsible? And what criminal law would apply? Also, does Amazon have in its disclaimers that law enforcement might ask for access through Amazon Key? Does Amazon plan on being transparent about this?""

  • It’s Getting Harder for Tech Companies To Deny Responsibility for Content

    Date published: November 7, 2017

    "The opposing view, held by advocates for victims of crime or harassment online, is that giving tech companies immunity removes any incentive they have to conduct due diligence. Danielle Citron, a professor at the University of Maryland Francis King Carey School of Law who also serves on Twitter’s Trust and Safety Council, co-authored a paper this summer entitled “The Internet Will Not Break," which called for making the law’s immunity less sweeping.

  • Congress Can Crack Down On Tech Companies, But It Can’t Do Much To Their Algorithms

    Date published: November 6, 2017

    "“When platforms don’t know what to do, the legally over-cautious response is to go way overboard on taking things down just in case they’re illegal,” Daphne Keller, Director of Intermediary Liability at Stanford University’s Center for Internet and Society, told BuzzFeed News. “My worst case scenario legislation would be some vague obligation for platforms to make sure that users don’t do bad things.”"

  • So What the Hell Is Doxxing?

    Date published: November 4, 2017

    "For some, doxxing is morally troubling. Law professor Danielle Citron is one. “It provides a permission structure to go outside the law and punish each other,” she says. “It’s like shaming in cyber-mobs.”"

  • Trump's Twitter lockout raises safeguard concerns

    Date published: November 4, 2017

    "Jennifer Granick, a lawyer with the ACLU’s technology division, said that abuses of power will become unavoidable if companies continue to face pressure to moderate their content.

    “It's not a surprise that Twitter employees have this capability,” Granick said. “The public and Congress have been demanding that the platform companies create the ability to ban people from the platform or delete particular messages.”"

  • Making smart machines ethical: Montreal forum seeks to lead conversation on responsible AI

    Date published: November 3, 2017

    "“We are just at the beginning,” said Dr. Peter Asaro, a philosopher of science, technology, and media at the New School and co-founder of the International Committee for Robot Arms Control. “These ethical issues in society are going to have to be worked out. Where do we want machines? How are we going to manage the consequences of automation in different sectors?”"

  • Trump Twitter takedown brings more grief for Silicon Valley

    Date published: November 3, 2017

    "“There’s always been employees who have misused the keys,” said ACLU surveillance and cybersecurity counsel Jennifer Granick. She pointed to the tension among some who would prefer that tech platforms censor users' content, whether that’s policing Russian-planted accounts and ads or kicking Trump off Twitter for what they perceive as hate speech. “They’re under extreme pressure from Congress,” she said."

  • Making smart machines ethical: Montreal forum seeks to lead conversation on responsible AI

    Date published: November 2, 2017

    ""It's hard to say who is responsible. As a casual user you have no idea how these things are built," said Peter Asaro, an assistant professor at The New School in New York and an AI philosopher.

    And as algorithms become more complex, their very creators may no longer understand how they work or what comes out.

    "The accountability will be what they do about it when something bad happens," Asaro said."

  • 15 Things We Learned From the Tech Giants at the Senate Hearings

    Date published: November 2, 2017

    "“Russia Today qualified, really because of algorithms, to participate in an advertising program. There are objective standards around popularity to be able to participate in that program. Platforms or publishers like RT drop in and out of the program as things change,” Salgado said. “The removal of RT from the program was actually the result of dropping viewership, not as a result of any action otherwise. There was nothing about RT or its content that meant that it stayed in or stayed out.”"

  • Russian ads, now publicly released, show sophistication of influence campaign

    Date published: November 1, 2017

    "Technology lawyer Albert Gidari, director of privacy at the Stanford University Center for Internet and Society, said that in turning over the ads, companies were entering complex legal territory. Ads have long been considered private data on par with email content and other records that the government must have a search warrant to obtain, he said.

  • Honk if you're not driving

    Date published: November 1, 2017

    "As an internationally recognized expert on the law of self-driving vehicles, Bryant Walker Smith is frequently asked to weigh in on legal issues related to automated driving. But the UofSC law professor’s expertise isn’t limited to cars and the people not driving them. His insights into tort law and product liability, and his broader interest in what he terms “the law of the newly possible,” are helping prepare USC law students for an evolving legal landscape.
