How to Spot Deepfakes: Tips for Identifying Fake AI-Generated Media

Table of Contents
Oddness Around the Mouth or Chin
Strange Elements of Speech
Consistency Between Face and Body
Discontinuity Across the Video Clip

In a critical election year with major votes in the UK, US, and France, disinformation is rampant on social media. A significant concern is deepfakes: AI-generated video, images, or audio of political figures designed to mislead voters. While not yet prominent in the UK election, examples are surfacing globally, especially in the US. Here are some visual and audio cues that can help you identify deepfakes:

Oddness Around the Mouth or Chin

Wrinkles and Detail: The skin around the mouth may show fewer wrinkles and less fine detail than the rest of the face.

Blurred Chin: The chin might appear blurry or smudged.

Voice Synchronization: Poor synchronization between voice and mouth movements can be a telltale sign.

For example, a deepfake video from June 17 showed a simulation of Nigel Farage destroying Rishi Sunak’s house in Minecraft. Another video featured Keir Starmer setting a trap in "Nigel’s pub." Dr. Mhairi Aitken from the Alan Turing Institute points out that besides the absurdity of the situations, the imperfect sync between voice and mouth is noticeable.

Strange Elements of Speech

Sentence Structure: AI-generated audio might have odd sentence structures. For instance, a deepfake video of Keir Starmer promoting an investment scheme awkwardly places "pounds" before numbers, like "pounds 35,000 a month."

Monotone Intonation: AI-generated speech often maintains a monotone rhythm and pattern, unlike natural speech.

Aitken advises comparing the voice, mannerisms, and expressions in suspect videos with genuine recordings of the person.
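For readers comfortable with a little code, here is a minimal sketch of how the "monotone intonation" cue could be quantified: it estimates the pitch contour of an audio clip and reports how much it varies. It assumes Python with the librosa and numpy packages installed; the filename suspect_clip.wav and the 10 Hz cut-off are purely illustrative, and a flat contour is a hint to investigate, not proof that the audio was synthesised.

```python
# Minimal sketch: flag unusually flat (monotone) pitch in an audio clip.
# Assumes: pip install librosa numpy; "suspect_clip.wav" is a placeholder filename.
import librosa
import numpy as np

def pitch_variation(path: str) -> float:
    """Return the standard deviation of the estimated pitch (F0) in Hz."""
    y, sr = librosa.load(path, sr=None, mono=True)
    # pYIN estimates a fundamental-frequency contour; unvoiced frames come back as NaN.
    f0, voiced_flag, voiced_prob = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )
    return float(np.nanstd(f0))

if __name__ == "__main__":
    variation = pitch_variation("suspect_clip.wav")
    print(f"Pitch standard deviation: {variation:.1f} Hz")
    # Natural speech usually varies noticeably; a very flat contour is only a weak hint,
    # not proof, that the audio was synthesised. The 10 Hz cut-off here is arbitrary.
    if variation < 10:
        print("Unusually flat intonation: worth comparing against genuine recordings.")
```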

Consistency Between Face and Body

Proportion and Movement: A disproportionate head size, or a mismatch in skin tone between the neck and face, can indicate a deepfake. An example is a video of Ukrainian President Volodymyr Zelenskiy asking civilians to surrender, in which the head is out of proportion with the body and the body remains immobile.

Hany Farid from UC Berkeley notes that in such "puppet-master" deepfakes, the body below the neck doesn’t move, which is a clear giveaway.
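As a rough illustration of Farid's "puppet-master" observation, the sketch below compares how much the face region changes from frame to frame with how much the region below it changes. It assumes Python with the opencv-python and numpy packages installed; suspect_clip.mp4 is a placeholder filename, the bundled Haar cascade is only a crude face detector, and a near-static body region is a hint worth checking against genuine footage rather than a verdict.

```python
# Minimal sketch of the "puppet-master" cue: the face moves but the body below it barely does.
# Assumes: pip install opencv-python numpy; "suspect_clip.mp4" is a placeholder filename.
import cv2
import numpy as np

def face_vs_body_motion(path: str, max_frames: int = 300):
    """Return (mean face-region motion, mean below-face motion) as frame-difference scores."""
    cap = cv2.VideoCapture(path)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    prev = None
    face_box = None
    face_motion, body_motion = [], []
    for _ in range(max_frames):
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if face_box is None:
            faces = detector.detectMultiScale(gray, 1.1, 5)
            if len(faces) == 0:
                continue
            face_box = faces[0]  # naive: take the first detected face
        x, y, w, h = face_box
        if prev is not None:
            diff = cv2.absdiff(gray, prev)
            face_motion.append(float(diff[y : y + h, x : x + w].mean()))
            # Everything below the face box, a crude stand-in for the body.
            body_motion.append(float(diff[y + h :, :].mean()))
        prev = gray
    cap.release()
    if not face_motion:
        return float("nan"), float("nan")
    return float(np.mean(face_motion)), float(np.mean(body_motion))

if __name__ == "__main__":
    face_m, body_m = face_vs_body_motion("suspect_clip.mp4")
    print(f"Mean face motion: {face_m:.2f}, mean body motion: {body_m:.2f}")
    # If the face region changes a lot while the region below it stays almost static,
    # that matches the "puppet-master" pattern Farid describes: a hint, not a verdict.
```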

Discontinuity Across the Video Clip

Inconsistent Quality: Look for variations in video quality or disruptions within the clip. For example, a video falsely showing US state department spokesperson Matthew Miller justifying Ukrainian strikes on Belgorod had such inconsistencies and was flagged as a deepfake.
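One crude way to look for such discontinuities programmatically is to track a per-frame sharpness score and flag abrupt jumps. The sketch below, which assumes Python with the opencv-python and numpy packages installed and uses the placeholder filename suspect_clip.mp4, uses the variance of the Laplacian as that score; legitimate cuts and camera shake also produce spikes, so treat flagged frames as a prompt to look more closely, not as proof.

```python
# Minimal sketch: look for abrupt jumps in per-frame sharpness, one rough proxy for the
# quality discontinuities described above.
# Assumes: pip install opencv-python numpy; "suspect_clip.mp4" is a placeholder filename.
import cv2
import numpy as np

def sharpness_profile(path: str, max_frames: int = 500) -> np.ndarray:
    """Variance of the Laplacian per frame: a common, rough sharpness/blur measure."""
    cap = cv2.VideoCapture(path)
    scores = []
    for _ in range(max_frames):
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        scores.append(cv2.Laplacian(gray, cv2.CV_64F).var())
    cap.release()
    return np.array(scores)

if __name__ == "__main__":
    scores = sharpness_profile("suspect_clip.mp4")
    jumps = np.abs(np.diff(scores))
    # Spikes well above the typical frame-to-frame change can mark spliced or regenerated
    # segments; ordinary cuts and camera shake can also produce them, so inspect manually.
    threshold = jumps.mean() + 3 * jumps.std()
    print(f"Frames with unusually large sharpness jumps: {np.where(jumps > threshold)[0]}")
```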

Being aware of these signs can help you discern real footage from deepfakes, reducing the risk of being misled by AI-generated disinformation.