Truth decay
People Who Spread Deepfakes Think Their Lies Reveal a Deeper Truth

By Mark Andrejevic

Published 9 July 2019

Because the problem of deepfakes seems to be a technological one, it is tempting to cast about for technological, rather than social or political, solutions. The flaw of such solutions is that they assume the people and platforms circulating fake information will defer to the truth when confronted with it.

The recent viral “deepfake” video of Mark Zuckerberg declaring, “whoever controls the data controls the world” was not a particularly convincing imitation of the Facebook CEO, but it was spectacularly successful at focusing attention on the threat of digital media manipulation.

While photographic fakes have been around since the dawn of photography, the more recent use of deep learning artificial intelligence techniques (the “deep” in deepfakes) is leading to the creation of increasingly credible computer simulations.

The Zuckerberg video attracted online attention both because it featured the tech wunderkind who is partially responsible for flooding the world with fake news, and because it highlighted the technology that will surely make the problem worse.

‘False positives’ aren’t the only problem
We have seen the pain and tragedy that viral falsehoods can cause, from the harassment of parents who lost children in the Sandy Hook shooting, to mob murders in India and elsewhere.

Deepfakes, we worry, will only worsen the problem. What if they are used to falsely implicate someone in a murder? To provide fake orders to troops on the battlefield? Or to incite armed conflict?

We might describe such events as the “false positives” of deep fakery: events that seemed to happen, but didn’t. On the other hand, there are the “false negatives”: events that did happen, but risk being dismissed as just another fake.

Think of US President Donald Trump’s claim that the voice on the notorious Access Hollywood tape, in which he boasts about groping women, was not his own. Trump has made a political specialty out of asking people not to believe their eyes or ears. He misled people about the size of the audience at his inauguration, and said he didn’t call Meghan Markle “nasty” in an interview when he did.

This strategy works by calling into question any and all mediated evidence: anything we do not experience directly ourselves, and even much of what we do experience, to the extent that it is not shared by others.