The Vulnerabilities of KYC Authentication in the Face of Generative AI

Know Your Customer (KYC) procedures, designed to verify customers' identities, are standard practice for banks, financial institutions, and fintech startups. Key players in the industry, such as Wise, Revolut, Gemini, and LiteBit, use ID images, including selfies cross-checked against document photos, as part of the KYC authentication process.

Recently, concerns have been raised about the potential impact of generative AI on the effectiveness of KYC checks. Viral posts on social media platforms like X (formerly Twitter) and Reddit have demonstrated how an attacker, leveraging open-source and off-the-shelf software, could manipulate a selfie using generative AI tools to pass a KYC test. While there is no concrete evidence of such tools being used to deceive an actual KYC system, the ease with which convincing deepfaked ID images can be created raises alarms within the security community.

In a typical KYC ID image authentication, customers submit a photo of themselves holding an official document, such as a passport or driver’s license. The submitted image is then cross-referenced with existing documents and selfies to prevent impersonation. However, the process has never been foolproof, with fraudsters historically selling forged IDs and manipulated selfies.
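The cross-referencing step can be thought of as a face-matching comparison. The sketch below is a hypothetical illustration, not any vendor's actual pipeline: real KYC systems use proprietary face-recognition models, while here the "embeddings" are toy vectors standing in for model output, and the 0.85 threshold is illustrative.

```python
# Hypothetical sketch of the selfie-vs-document cross-check step.
# The embedding vectors and threshold are stand-ins for the output
# of a real (proprietary) face-recognition model.
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def faces_match(selfie_emb, document_emb, threshold=0.85):
    """Accept the submission only if the face in the selfie and the
    face on the ID document are similar enough."""
    return cosine_similarity(selfie_emb, document_emb) >= threshold

# Toy example: near-identical embeddings pass, dissimilar ones fail.
print(faces_match([0.9, 0.1, 0.4], [0.88, 0.12, 0.41]))  # True
print(faces_match([0.9, 0.1, 0.4], [0.1, 0.9, 0.2]))     # False
```

A comparison like this is exactly what a convincing deepfake attacks: if the generated selfie's embedding lands close enough to the one extracted from the stolen ID, the check passes.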

Generative AI introduces new challenges to KYC security. Tutorials online demonstrate how tools like Stable Diffusion, an open-source image generator, can create synthetic renderings of a person in various environments. Attackers can manipulate these renderings to make it appear as though the individual is holding an ID document, which can then be seamlessly inserted using image editing software.

While creating convincing deepfaked ID images once required advanced knowledge of photo editing software, the barrier to entry has dropped. Stable Diffusion, combined with additional tools and extensions, can produce images with realistic lighting, shadows, and environments. According to a Reddit user going by harsh, who shared a workflow for creating deepfaked ID selfies, generating a convincing image takes roughly one to two days.

Despite the potential threat posed by deepfaked KYC images, the security landscape may provide some safeguards. Some platforms implement "liveness" checks, requiring users to perform specific actions in a short video to prove they are real. However, even these checks can be bypassed using generative AI, as observed by Jimmy Su, Chief Security Officer for cryptocurrency exchange Binance.
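To make the liveness idea concrete, here is a minimal sketch of the challenge-response protocol such checks rely on. It is a hypothetical model, not Binance's or any vendor's implementation: real systems analyze video frames, while this only captures the protocol shape of random actions plus a deadline.

```python
# Hypothetical sketch of a "liveness" challenge-response flow.
# Real systems verify actions from video; this models only the
# protocol: random actions, an expiry, and strict verification.
import random
import time

ACTIONS = ["turn head left", "turn head right", "blink twice", "smile"]

def issue_challenge(n_actions: int = 3, ttl_seconds: float = 10.0):
    """Pick random actions the user must perform, with an expiry time."""
    return {
        "actions": random.sample(ACTIONS, n_actions),
        "expires_at": time.monotonic() + ttl_seconds,
    }

def verify_response(challenge, performed_actions) -> bool:
    """Accept only if every requested action was performed, in order,
    before the challenge expired."""
    if time.monotonic() > challenge["expires_at"]:
        return False
    return performed_actions == challenge["actions"]

challenge = issue_challenge()
print(verify_response(challenge, challenge["actions"]))  # True
print(verify_response(challenge, ["smile"]))             # False
```

The randomness and the short deadline are the whole defense: an attacker must generate matching video on demand. Su's point is that generative tooling is increasingly able to do exactly that in real time.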

In light of these developments, the reliability of KYC as a security measure is coming into question. While some experts, including Su, believe deepfaked images cannot yet deceive human reviewers, there is a growing consensus that continued advances in generative AI could render KYC authentication far less effective in the near future.