The Center for Internet and Society at Stanford Law School is a leader in the study of the law and policy around the Internet and other emerging technologies.
“We are truly fucked.” That was Motherboard’s spot-on reaction to deep fake sex videos (realistic-looking videos that swap a person’s face into sex scenes actually involving other people). And that sleazy application is just the tip of the iceberg.
This raises the question of whether Congress could draft a law narrow enough to help victims of deepfakes without such unintended consequences. As a cautionary tale, Annemarie Bridy, a law professor at the University of Idaho, points to the misuse of the copyright takedown system, in which companies and individuals have acted in bad faith to remove legitimate criticism and other legal content.
Still, given what’s at stake with pornographic deep fake videos, Bridy says, it could be worth drafting a new law.
Danielle Citron, a University of Maryland law professor who has studied ways to combat online abuse, says the country is in desperate need of a more comprehensive criminal statute that would cover what she calls “invasions of sexual privacy and assassinations of character.” “We need real deterrents,” she said. “Otherwise, it’s just a game of whack-a-mole.”
When Danielle Citron, a professor of law at the University of Maryland, first became aware of the fake porn movies, she was struck by how viscerally they violated these women’s right to privacy. But once she started thinking about deep fakes, she realized that if they spread beyond the trolls on Reddit, they could be even more dangerous. They could be weaponized in ways that weaken the fabric of democratic society itself.
The keynote speaker was Danielle Citron, Morton and Sophia Macht Professor of Law at the University of Maryland’s Francis King Carey School of Law. Citron addressed the rise of “deep fakes,” sophisticated fake audio and video that can be easily produced by people with access to the technology.
Citron warned the audience that the democratization of this technology could have devastating effects on the political process. She discussed the possibility of a fabricated video that incriminates or embarrasses a political candidate surfacing the night before an election.
Those types of efforts could increase if the intelligence community agrees to weigh in on deepfakes, said Danielle Citron, a privacy law expert at the University of Maryland and co-author of a new paper on the potential impact of deepfake technology.
“This would hardly be the first time that a new technology with a lot of potential was first adopted by pornography,” Calo said. “You see that with the internet, with VHS, and if you did a little digging, you’d see that … there being this economic motivation, that oftentimes technology does get to play early in pornography.”
New software tools use artificial intelligence to create realistic-looking but fake videos of real people seeming to say and do things they never did. These so-called "deepfakes" will soon cause a number of problems for the courts, particularly when it comes to authenticating evidence in litigation. They may even undermine the justice system by eroding juries' belief in the knowability of what is real. Come discuss the implications of deepfakes for trial practice with CIS Associate Director of Surveillance and Cybersecurity Riana Pfefferkorn.