Campaigns Must Prepare for Deepfakes: This Is What Their Plan Should Look Like

Publication Type: Other Writing
Publication Date: September 5, 2019

It is only a matter of time before maliciously manipulated or fabricated content surfaces of a major presidential candidate in 2020. The video manipulation of House Speaker Nancy Pelosi in May demonstrates the speed with which even a “cheap fake” can spread. But the technology is quickly getting more sophisticated, and we must prepare for “deepfakes”—fully synthesized audio or video of someone saying or doing something they did not say or do. Soon (meaning months, not years), it may be impossible to tell real videos from fake ones. The truth will have a tough time emerging in a deepfake-ridden marketplace of ideas.

Doctored media, typically in the form of short videos or audio clips, could be used to embarrass, defame, or otherwise damage candidates for office. Recent advances in artificial intelligence have increased the realism of deepfakes and substantially cut the resources necessary to create them. On August 9, a deepfake of Democratic National Committee Chair Tom Perez was presented to a conference room of hackers, who largely failed to realize that anything was amiss.

The key is in the timing. Imagine that the night before an election, a deepfake is posted showing a candidate making controversial remarks. The deepfake could tip the election and undermine people’s faith in elections. This is not hypothetical. In the past six months, manipulated media has targeted a senior Malaysian minister, Donald Trump, and others.

It does not matter that digital fakery can, for the moment, be detected fairly easily. People have a visceral reaction to video and audio. They believe what their eyes and ears are telling them—even when all signs suggest that the content is fake. If the video or audio is provocative, it will surely go viral. Studies show that people are ten times more likely to spread fake news than accurate stories because fakery evokes a stronger emotional reaction. So no matter how unbelievable deepfakes are, the damage will still be real.

Read the full piece at the Carnegie Endowment for International Peace