Truth Decay: Enhanced Deepfake Capabilities for Less-Skilled Threat Actors Mean More Misinformation

Published 2 January 2020

The ability to create manipulated content is not new. But what has changed with the advances in artificial intelligence is that you can now build a very convincing deepfake without being an expert in technology. This “democratization” of deepfakes will increase the quantity of misinformation and disinformation aiming to weaken and undermine evidence-based discourse.

The ability to create manipulated content is not new. Manipulated images were used as far back as the Second World War in campaigns designed to make people believe things that weren’t true.

Steve Grobman, a senior vice president and chief technology officer at McAfee, writes in a McAfee blog post that what has changed with the advances in artificial intelligence is that you can now build a very convincing deepfake without being an expert in technology. There are websites where you can upload a video and receive a deepfake video in return. “There are very compelling capabilities in the public domain that can deliver both deepfake audio and video abilities to hundreds of thousands of potential threat actors with the skills to create persuasive phony content,” he writes.

He notes that deepfake video or text can be weaponized to enhance information warfare.

Freely available video of public comments can be used to train a machine-learning model that can develop a deepfake video depicting one person’s words coming out of another’s mouth. Attackers can now create automated, targeted content to increase the probability that individuals or groups will fall for a campaign. In this way, AI and machine learning can be combined to create massive chaos.
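Grobman does not name specific tools, but the pipeline he describes matches the widely documented shared-encoder, per-identity-decoder approach used by common open-source face-swap projects. The sketch below is a minimal, illustrative PyTorch version of that core idea only (class names, sizes, and the toy 64x64 crops are assumptions, not any particular tool’s code): one encoder is trained on footage of both people, each person gets their own decoder, and decoding person A’s frames with person B’s decoder yields B’s face performing A’s expressions and mouth movements. Real tools add face detection and alignment, adversarial losses, and far more data and model capacity.

```python
# Minimal sketch of the shared-encoder / per-identity-decoder idea behind
# classic face-swap deepfakes. Illustrative only; not any specific tool.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps a 64x64 RGB face crop to a compact latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face crop from the latent code; one decoder per identity."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a = Decoder()  # trained only on aligned face crops of person A
decoder_b = Decoder()  # trained only on aligned face crops of person B
optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.L1Loss()

def train_step(faces_a, faces_b):
    """One training step: both identities share the encoder, each has its own decoder."""
    optimizer.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    optimizer.step()
    return loss.item()

# The "swap": encode a frame of person A, decode it with person B's decoder,
# producing B's face with A's expression and mouth movements.
with torch.no_grad():
    frame_of_a = torch.rand(1, 3, 64, 64)  # placeholder for a real aligned face crop
    swapped = decoder_b(encoder(frame_of_a))
```

The point of the sketch is the low barrier Grobman highlights: the training data is publicly available footage, and the heavy lifting is done by packaged models rather than by the attacker’s own expertise.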

We should assume that adversaries are going to use the best technology to accomplish their goals, so if we think about nation-state actors attempting to manipulate an election, using deepfake video to manipulate an audience makes a lot of sense.

Adversaries will try to drive wedges and create divides in society. A cybercriminal could also produce a video in which a CEO appears to make a compelling statement that the company missed earnings or that a product has a fatal flaw requiring a massive recall. Such a video could be distributed to manipulate a stock price or enable other financial crimes.

Grobman’s – and McAfee’s – conclusion: “We predict the ability of an untrained class to create deepfakes will enhance an increase in quantity of misinformation.”