Events That Never Happened Could Influence the 2024 Presidential Election – a Cybersecurity Researcher Explains Situation Deepfakes

By Christopher Schwartz

Published 18 July 2023

The basic idea and technology of a situation deepfake are the same as with any other deepfake, but with a bolder ambition: to manipulate a real event or invent one from thin air. Situation deepfakes have already appeared in recent weeks – first in a Republican National Committee ad against President Joe Biden, then in an anti-Trump ad from Ron DeSantis’s campaign.

Imagine an October surprise like no other: Only a week before Nov. 5, 2024, a video recording reveals a secret meeting between Joe Biden and Volodymyr Zelenskyy. The American and Ukrainian presidents agree to immediately admit Ukraine to NATO under “the special emergency membership protocol” and prepare for a nuclear weapons strike against Russia. Suddenly, the world is on the cusp of Armageddon.

While journalists could point out that no such protocol exists and social media users might notice odd, video game-like qualities in the video, others might feel that their worst fears have been confirmed. When Election Day comes, these concerned citizens may let the video sway their votes, unaware that they have just been manipulated by a situation deepfake – an event that never actually happened.

Situation deepfakes represent the next stage of technologies that have already shaken audiences’ perceptions of reality. In our research at the DeFake Project, my colleagues at the Rochester Institute of Technology, the University of Mississippi, Michigan State University and I study how deepfakes are made and what measures voters can take to defend themselves from them.

Imagining Events That Never Happened
A deepfake is created when someone uses an artificial intelligence tool, especially deep learning, to manipulate or generate a face, a voice or – with the rise of large language models like ChatGPT – conversational language. These can be combined to form “situation deepfakes.”
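To make the conversational-language piece of that definition concrete, here is a minimal sketch of how a large language model can script invented dialogue on demand. It assumes the OpenAI Python client and the model name “gpt-4o-mini” – illustrative assumptions, not tools named by the DeFake Project – and the prompt is deliberately benign.

```python
# Minimal sketch: scripting invented dialogue with a large language model.
# Assumes the OpenAI Python client (pip install openai) and an API key in
# the OPENAI_API_KEY environment variable; the model name and prompt are
# illustrative assumptions.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system",
         "content": "You write short, realistic-sounding dialogue "
                    "between two fictional speakers."},
        {"role": "user",
         "content": "Write four lines of dialogue in which two fictional "
                    "officials discuss an entirely invented trade summit."},
    ],
)

# In a situation deepfake, a script like this could then be fed to a
# voice-cloning model and a face-animation model to produce fake video.
print(response.choices[0].message.content)
```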

The basic idea and technology of a situation deepfake are the same as with any other deepfake, but with a bolder ambition: to manipulate a real event or invent one from thin air. Examples include depictions of Donald Trump’s perp walk and Trump hugging Anthony Fauci, neither of which happened. The hug shot was promoted by a Twitter account associated with the presidential campaign of Trump rival Ron DeSantis. An attack ad targeting Joe Biden’s 2024 campaign, published by the Republican National Committee, was made entirely with AI.

At the DeFake Project, our research has found that deepfakes, including situation deepfakes, are typically created with some mix of three techniques: compositing one piece of media with another; using a video to animate an image or alter another video, an approach dubbed puppeteering; and conjuring a piece of media into existence outright, typically with generative AI.
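As one illustration of the conjuring technique, the sketch below generates a photorealistic image from nothing but a text prompt using an off-the-shelf diffusion model. It assumes the Hugging Face diffusers library and the publicly available “runwayml/stable-diffusion-v1-5” checkpoint; the prompt is invented, and this is a generic text-to-image example rather than the DeFake Project’s own tooling.

```python
# Minimal sketch: conjuring an image into existence with generative AI.
# Assumes the Hugging Face diffusers library (pip install diffusers torch)
# and a CUDA-capable GPU; the checkpoint and prompt are illustrative
# assumptions.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# A benign prompt; the same machinery can fabricate "photos" of events
# that never happened, which is what makes situation deepfakes possible.
prompt = "a press conference podium in an empty briefing room, news photo"
image = pipe(prompt).images[0]
image.save("generated_scene.png")
```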