Considered opinion: Truth decay
The era of fake video begins
“Deepfake” videos produced by Russian-linked trolls are the latest weapon in the ongoing fake news war. Kremlin-backed trolls are already experimenting with new video manipulation techniques that use artificial intelligence to create convincing doctored videos. Franklin Foer writes that the internet has always contained the seeds of postmodern hell, and that mass manipulation, from clickbait to Russian bots to the addictive trickery that governs Facebook’s News Feed, is the currency of the medium. In this respect, the rise of deepfakes is the culmination of the internet’s history to date—and probably only a low-grade version of what’s to come. Fake-but-realistic video clips are not the end point of the flight from reality that technologists would have us take. The apotheosis of this vision is virtual reality. “The ability to manipulate consumers will grow because VR definitionally creates confusion about what is real,” Foer writes. “Several decades ago, after giving the nascent technology a try, the psychedelic pamphleteer Timothy Leary reportedly called it ‘the new LSD.’”
The Telegraph reports that official monitors are warning that “deepfake” videos produced by Russian-linked trolls are the latest weapon in the ongoing fake news war.
Experts at the EU’s East StratCom Task Force, a counter-disinformation unit that monitors, analyzes, and debunks disinformation operations, say that Kremlin-backed trolls are already experimenting with new video manipulation techniques that use artificial intelligence to create convincing doctored videos.
Franklin Foer writes in The Atlantic:
In a dank corner of the internet, it is possible to find actresses from Game of Thrones or Harry Potter engaged in all manner of sex acts. Or at least, to the world, the carnal figures look like those actresses, and the faces in the videos are indeed their own. Everything south of the neck, however, belongs to different women. An artificial intelligence has almost seamlessly stitched the familiar visages into pornographic scenes, one face swapped for another. The genre is one of the cruelest, most invasive forms of identity theft invented in the internet era. At the core of the cruelty is the acuity of the technology: a casual observer can’t easily detect the hoax.
This development, which has been the subject of much hand-wringing in the tech press, is the work of a programmer who goes by the nom de hack “deepfakes.” And it is merely a beta version of a much more ambitious project. One of deepfakes’s compatriots told Vice’s Motherboard site in January that he intends to democratize this work. He wants to refine the process, further automating it, which would allow anyone to transpose the disembodied head of a crush or an ex or a co-worker into an extant pornographic clip with just a few simple steps. No technical knowledge would be required. And because academic and commercial labs are developing even more-sophisticated tools for non-pornographic purposes—algorithms that map facial expressions and mimic voices with precision—the sordid fakes will soon acquire even greater verisimilitude.