Tackling the ‘Deep Fake,’ House Grasps for Solution to Doctored Videos

"Sharing Ayyub’s story with the committee, University of Maryland law professor Danielle Citron noted that the journalist awoke to find her  face had been superimposed on the body of a woman in porno in April 2018, less than 24 hours after an appearance on BBC and Al Jazeera in which she condemned Indian religious leaders who had advocated on behalf of defendants involved in an 8-year-old’s gang rape and murder.

As death and rape threats against Ayyub rolled in, Citron said, those antagonizing the journalist also circulated her home address in messages suggesting she was available for sex.

“She went offline for several months, she couldn’t work,” Citron said. “She lost her safety. The harm is profound and it will increasingly be felt by women and minorities.”

Citron has closely studied the intersection of cyber hate crimes, liability and deep-fake technology. Though she called the very notion of the technique “scary,” Citron emphasized that lawmakers could address the problem with a tweak to Section 230 of the Communications Decency Act.

“Section 230 says no online service should be treated as a speaker or publisher for user content,” Citron said. “We can change that to say no online service that engaged in reasonable moderation practices shall be treated as a speaker or publisher for another user’s content.”