Detecting “Deepfake” Videos by Checking for the Pulse

the physiology as another signature to see if it is consistent with previous data is very helpful for detection.”

Deepfakes found “in the wild” are many steps below the quality that Yin’s lab generates, which means that manipulated videos can be much easier to spot.
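The core premise — that a real face carries a faint pulse signal recoverable from video — can be sketched in a few lines. The function name and the synthetic signal below are illustrative only, not FakeCatcher’s actual pipeline: the idea is to track average green-channel brightness in a face region over time and look for a dominant frequency in the human heart-rate band.

```python
import numpy as np

def estimate_pulse_hz(green_means, fps):
    """Estimate the dominant heart-rate frequency (Hz) from mean
    green-channel intensities of a face region over time —
    a toy remote-photoplethysmography (rPPG) signal."""
    signal = green_means - np.mean(green_means)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    # Keep only plausible human heart rates: 0.7-4 Hz (42-240 bpm).
    band = (freqs >= 0.7) & (freqs <= 4.0)
    return freqs[band][np.argmax(power[band])]

# Synthetic "face video" brightness: a 1.2 Hz pulse (72 bpm) plus noise.
fps = 30
t = np.arange(10 * fps) / fps
rng = np.random.default_rng(0)
brightness = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.05 * rng.standard_normal(t.size)
print(round(estimate_pulse_hz(brightness, fps), 1))  # dominant frequency near 1.2 Hz
```

A clean periodic peak in that band is what a living subject produces; a composite that pastes one person’s face onto another’s body tends to scramble or erase it, which is the inconsistency a detector can flag.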

“Considering that we work with 3D using our own capture setup, we generate some of our own composites, which are basically ‘fake’ videos,” Ciftci said. “The big difference is that we scan real people and use it, while deepfakes take data from other people and use it. It’s not that different if you think about it that way.

“It’s like the police knowing what all the criminals do and how they do it. You understand how these deepfakes are being done. We learn the tricks and even use some of them in our own data creation.”

Since the FakeCatcher findings were published, 27 researchers around the world have been using the algorithm and the dataset in their own analyses. Whenever these kinds of studies are made public, though, there are concerns about revealing to malicious deepfake makers how their videos were detected, allowing them to modify their work to evade detection in the future.

Ciftci is not too worried about that, however: “It’s not going to be easy for someone who doesn’t know much about the science behind it. They can’t just use what’s out there to make this happen without significant software changes.”

Intel’s involvement in the FakeCatcher research is connected to its interests in volumetric capture and augmented/virtual reality experiences. Intel Studios operates what Demir calls “the world’s largest volumetric capture stage”: 100 cameras in a 10,000-square-foot geodesic dome that can handle about 30 people simultaneously — even a few horses once.

Future plans call for volumetric-capture technology in mainstream television shows, sports and augmented-reality applications, where audiences can immerse themselves in any scene. Films in 3D and VR also are in the works, with two VR projects recently premiering at the Venice Film Festival.

By compiling the FakeCatcher data and reverse-engineering it, Intel Studios hopes to make more realistic renderings that incorporate the kind of biological markers that humans with real heartbeats have.

“Intel’s vision is changing from a chip-first company to putting AI, edge computing and data first,” Demir said. “We are making a transformation to AI-specific approaches in any way we can.”

(Interesting to note: Intel’s CEO is Bob Swan, MBA ’85, who last year told the School of Management magazine Reaching Higher that “intellectual curiosity is a wonderful and powerful thing to help you grow and develop and evolve over time.”)

Future research will seek to improve and refine the FakeCatcher technology, drilling further down into the data to determine how the deepfakes are made. That capability has implications for areas including cybersecurity and telemedicine, and Yin also hopes for further collaborations with Intel.

“We’re still in the brainstorming stage,” he said. “We want to have an impact not only in academia but also to see if our research would have a role in industry.”