The following was excerpted from an article that will appear in a future issue of NWLawyer. The author was also recently interviewed for the “What’s Next” newsletter on LAW.COM.
“We are truly fucked.” That was Motherboard’s spot-on reaction to deep fake sex videos (realistic-looking videos that swap a person’s face into sex scenes actually involving other people). And that sleazy application is just the tip of the iceberg.
Welcome back to What’s Next, where we report on the intersection of law and technology. Today, we talk with Stanford’s Riana Pfefferkorn about deepfakes and why lawyers need to care about this alarming issue. Also, autonomous vehicles could affect our zoning laws (think fewer parking garages). More on that random legal aspect, and more, below.
“Imagine the night before an IPO, a deep fake video of the CEO comes out of the CEO soliciting a child prostitute or doing drugs,” University of Maryland Francis King Carey School of Law professor and privacy expert Danielle Citron, JD, said to a full house in the school’s Ceremonial Moot Courtroom.
“There goes the IPO, and the faith of the marketplace for the CEO is wrecked,” she continued.
Lawmakers don’t get many chances to get this right, says Citron. “You gotta write this correctly.” One or two overbroad or ineffective bills, and the appetite for a deepfakes law might turn into backlash.
This raises the question of whether Congress could draft a law narrow enough to help victims of deepfakes without such unintended consequences. As a cautionary tale, Annemarie Bridy, a law professor at the University of Idaho, points to the misuse of the copyright takedown system, in which companies and individuals have acted in bad faith to remove legitimate criticism and other legal content.
Still, given what’s at stake with pornographic deep fake videos, Bridy says, it could be worth drafting a new law.
Danielle Citron, a University of Maryland law professor who has studied ways to combat online abuse, says the country is in desperate need of a more comprehensive criminal statute that would cover what she calls “invasions of sexual privacy and assassinations of character.” “We need real deterrents,” she said. “Otherwise, it’s just a game of whack-a-mole.”
When Danielle Citron, a professor of law at the University of Maryland, first became aware of the fake porn movies, she was struck by how viscerally they violated these women’s right to privacy. But once she started thinking about deep fakes, she realized that if they spread beyond the trolls on Reddit, they could be even more dangerous. They could be weaponized in ways that weaken the fabric of democratic society itself.
The keynote speaker was Danielle Citron, Morton and Sophia Macht Professor of Law at the University of Maryland’s Francis King Carey School of Law. Citron addressed the rise of “deep fakes,” sophisticated fake audio and video that can be easily produced by people with access to the technology.
Citron warned the audience that the democratization of this technology could have devastating effects on the political process. She discussed the possibility of a fabricated video that incriminates or embarrasses a political candidate surfacing the night before an election.
Those types of efforts could increase if the intelligence community agrees to weigh in on deepfakes, said Danielle Citron, a privacy law expert at the University of Maryland and co-author of a new paper on the potential impact of deepfake technology.
“This would hardly be the first time that a new technology with a lot of potential was first adopted by pornography,” Calo said. “You see that with the internet, with VHS, and if you did a little digging, you’d see that … there being this economic motivation, that oftentimes technology do get to play early in pornography.”
New software tools use artificial intelligence to create realistic-looking but fake videos of real people seeming to say and do things they never did. These so-called "deepfakes" will soon cause a number of problems for the courts, particularly when it comes to authenticating evidence in litigation. They may even undermine the justice system by eroding juries' belief in the knowability of what is real. Come discuss the implications of deepfakes for trial practice with CIS Associate Director of Surveillance and Cybersecurity Riana Pfefferkorn.