“We are truly fucked.” That was Motherboard’s spot-on reaction to deep fake sex videos (realistic-looking videos that swap a person’s face into sex scenes actually involving other people). And that sleazy application is just the tip of the iceberg.
When Danielle Citron, a professor of law at the University of Maryland, first became aware of the fake porn movies, she was struck by how viscerally they violated these women’s right to privacy. But once she started thinking about deep fakes, she realized that if they spread beyond the trolls on Reddit, they could be even more dangerous. They could be weaponized in ways that weaken the fabric of democratic society itself.
The keynote speaker was Danielle Citron, Morton and Sophia Macht Professor of Law at the University of Maryland’s Francis King Carey School of Law. Citron addressed the rise of “deep fakes,” sophisticated fake audio and video that can be easily produced by people with access to the technology.
Citron warned the audience that the democratization of this technology could have devastating effects on the political process. She discussed the possibility of a fabricated video that incriminates or embarrasses a political candidate surfacing the night before an election.
Those types of efforts could increase if the intelligence community agrees to weigh in on deepfakes, said Danielle Citron, a privacy law expert at the University of Maryland and co-author of a new paper on the potential impact of deepfake technology.
“This would hardly be the first time that a new technology with a lot of potential was first adopted by pornography,” Calo said. “You see that with the internet, with VHS, and if you did a little digging, you’d see that … there being this economic motivation, oftentimes technologies do get their early play in pornography.”
New software tools use artificial intelligence to create realistic-looking but fake videos of real people who appear to say and do things they never did. These so-called “deepfakes” will soon pose problems for the courts, particularly when it comes to authenticating evidence in litigation. They may even undermine the justice system itself by eroding juries’ belief in the knowability of what is real. Come discuss the implications of deepfakes for trial practice with CIS Associate Director of Surveillance and Cybersecurity Riana Pfefferkorn.