Last month, I discussed deepfake technology and our efforts to prevent malicious pornographic deepfakes. But as our presidential election season nears, I want to highlight the serious, looming threat that deepfakes also pose to our democracy.
Again, deepfakes are forged videos created via artificial intelligence in which a person’s likeness, including face and voice, is realistically swapped with someone else’s. (Click here to view a sample.) Since their introduction, deepfakes have been used extensively to insert women’s likenesses into pornographic films without consent, a practice known as malicious deepfake pornography. But deepfakes can, and soon will, be used for a number of other pernicious purposes, and one of our biggest concerns is election tampering.
Deepfake technology and social media are a potent mix for individuals or organized actors looking to influence elections. Imagine that, days before a pivotal and seemingly close election, a video is released on social media showing a candidate having an affair, uttering hate speech behind closed doors, or taking a bribe from a lobbyist. Given the high level of public interest and the salacious nature of the footage, such content is well-suited to go viral, quickly garnering views from thousands of potential voters. With little time and no counter-evidence, the candidate will simply be unable to mount a credible or effective defense, and the fake video will ultimately cause a sufficient number of voters to alter their votes.
Deepfakes are especially dangerous in the election context because of their hyperrealism and the lack of source footage. This past month you may have seen the altered footage of Speaker Nancy Pelosi circulating on social media. The original footage was slowed down to make it seem that the Speaker was inebriated or suffering from a health issue. Press coverage of the video soon featured the source footage as a comparison, providing an irrefutable point of defense against the faked material. Political victims of deepfakes have no such material available in their defense and will only be able to offer the media a statement of denial, a form of response the public has generally grown accustomed to and mistrustful of.
Thus far, leaders of the social media platforms have generally made public statements expressing concern over deepfake technology and a commitment to developing policies that mitigate the threat. Even so, their options are severely limited. Platforms can certainly remove videos that are obvious fakes, yet obvious fakes are simply less dangerous than convincing ones. Last month, a deepfake of Facebook founder and CEO Mark Zuckerberg was released online. Facebook did not remove the video from the platform, stating that its current policy is to reduce the distribution of such videos rather than remove them. Frankly, though, this was a low-stakes decision for Facebook: the footage was obviously fake, given the absurdist language in the video and its low quality. Facebook and the other platforms need to consider now what they would do had this deepfake been believable and compromising, as the next one might well be.
Countering the threat of deepfakes, however, must extend well beyond the social media platforms and must include comprehensive, urgent efforts in education, legislation, and technology development. That is why we are educating people across the country to be on the lookout for deepfakes, so that the damage is minimized when they surface during the campaigns. We are advocating for funding to develop the technology needed to identify deepfakes and thus neutralize their danger. And we have sponsored state-level legislation to criminalize election interference via deepfake technology.
The potential for deepfake technology to distort and undermine our democratic elections is real, and the Organization for Social Media Safety is committed to protecting all of us from this imminent danger. We do not have much time – stay tuned.