How Does Biometric Spoofing Work? The Mechanics of Biological Mimicry
What comes to mind when you think of spoofing?
Maybe you picture a phishing email that appears to come from an official address, or an incoming call from a scammer that looks like it’s from your bank. What you’re probably not picturing is someone molding a fake finger out of wood glue, or holding a high-resolution photo up to a camera.
Those are classic tricks, but current biometric spoofing tactics have grown even more sophisticated. They range from AI-enabled deepfakes that can swap faces in real time to voice-cloning technologies that can mimic anyone with just a few seconds of audio.
In this chapter, we pry apart the toolkit that fraudsters use to defeat your biometric defenses.
Biometric Spoofing
Your face is more unique than your password: that’s the basic idea behind biometric authentication. Biometrics are powerful, but they can still be spoofed. Today, we’re discussing how biometric spoofing works, why it’s a problem, and ways to guard against the danger.
How Does Biometric Spoofing Work?
As an example of how biometric spoofing works, let’s look at one of the most commonly used techniques: fingerprint recognition.
Modern systems are considerably harder to fool than older ones. That said, there are still many ways to copy or create workable imitation fingerprints using a range of easily accessed materials. An individual’s fingerprints can be captured using:
- Paper printouts of fingerprint photos
- Gelatin (animal collagen) or wax (organic or petroleum-based)
- Modeling clay or Play-Doh (Play-Doh can be particularly effective if the color reflects natural skin tones)
- Pliable silicones (such as those used for dental impressions)
- Latex (natural rubber)
- Regular school glue or wood glue
- Silly Putty (can be mixed with other elements for better conductivity)
Again, many of these techniques (and others) will only work with older scanners. However, the more widespread use of biometric technology is incentivizing fraudsters to do their research and discover new methods of defeating these systems.
New technology is also opening other biometric verification methods to attack. Fraudsters now have the means to defeat iris, vein, and even DNA-based systems.
Even scarier? With the right biometric hacking software, fraudsters may also be able to create AI-generated deepfakes that register as authentic human beings. Deepfakes use a form of artificial intelligence (or “deep learning”) to produce bogus images of fingerprints, retinal patterns, and so on. Advanced deepfake technology can even create convincing fictional photo profiles from scratch.
Common Biometric Spoofing Techniques
Hackers have developed tactics to target fingerprints, facial recognition, iris, voice, and behavioral biometric indicators.
As high-tech as they may seem, biometric systems can be spoofed even by low-tech approaches, ranging from primitive physical tricks to complex cyberattacks.
AI-Powered Biometric Spoofing: The Next Frontier
Artificial intelligence has changed the biometric spoofing game. A sophisticated attack used to require specialized equipment and technical expertise; now, it can be executed with consumer-grade tools.
The barrier to entry has plummeted. Ten years ago, facial spoofing required sophisticated silicone masks or high-resolution 3D printing to fool liveness detection that asked users to blink, smile, or turn their heads. Now, AI models can generate photorealistic, 3D-modeled faces from just a few social media photos.
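To see why blink checks worked against photos, consider the eye aspect ratio (EAR), a common liveness signal: six landmarks per eye produce a ratio that stays roughly constant for a static photo but dips sharply during a blink. The following is a minimal illustrative sketch, not a production detector; the landmark coordinates and threshold are assumptions chosen for the example.

```python
import math

def eye_aspect_ratio(eye):
    """EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|) over six eye landmarks.
    An open eye scores roughly 0.3; a closed eye drops toward 0."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    p1, p2, p3, p4, p5, p6 = eye
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def shows_blink(ear_series, threshold=0.2):
    """A live subject's EAR dips below the threshold at least once
    during a session; a photo held up to the camera never does."""
    return any(ear < threshold for ear in ear_series)

# Illustrative landmark coordinates (x, y): an open eye vs. a nearly closed one.
open_eye   = [(0, 3), (2, 4), (4, 4), (6, 3), (4, 2), (2, 2)]
closed_eye = [(0, 3), (2, 3.1), (4, 3.1), (6, 3), (4, 2.9), (2, 2.9)]

photo_session = [eye_aspect_ratio(open_eye)] * 5          # constant EAR
live_session  = [eye_aspect_ratio(open_eye),
                 eye_aspect_ratio(closed_eye),            # the blink frame
                 eye_aspect_ratio(open_eye)]

print(shows_blink(photo_session))  # False: no blink observed, likely a photo
print(shows_blink(live_session))   # True: blink observed
```

Real-time face-swapping deepfakes defeat exactly this kind of check by synthesizing the blink on demand, which is why modern liveness detection layers in additional signals such as texture, depth, and challenge-response prompts.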
Voice cloning has gotten particularly dangerous; scammers only need a few seconds of audio to create convincing voice models capable of bypassing voice authentication systems used by payment apps. They can harvest samples from social media videos, voicemails, or even just when you say “hello” in response to a spam call.
AI's ability to enable real-time attacks is a big problem, too. Advanced machine learning models can perform live face-swapping during video authentication, for example. These systems analyze the victim's movements in real-time and seamlessly apply them to the fraudster's synthetic face, creating the illusion of a genuine authentication attempt.
The trajectory is clear: biometric spoofing will be more and more of a problem as AI models get more sophisticated and accessible. Generative adversarial networks (GANs) are already being trained specifically to fool biometric systems, creating an arms race between security providers and fraudsters.
What Makes Biometric Spoofing Different From Other Threats?
When functioning correctly, biometric authentication offers many advantages.
Unlike static forms of authentication, a biometric identifier is intrinsic to an individual. It cannot be lost or transferred, it is person-specific, and it is easy to use. But all of these benefits depend on systems being implemented and performing correctly.
If things go awry, biometric theft can cause more trouble for its victim than conventional identity theft. With the latter, fraudsters steal an online profile; if biometrics are stolen, they take the entire victim, for all intents and purposes.
Think about it: if personal information gets out, passwords and account numbers can be changed. That’s not the case with biometric data. You can’t switch out your face or fingerprint; that information is permanent, and as such, is permanently compromised.
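This contrast can be made concrete. Password systems store a hash and demand an exact match, so a breach is cured by choosing a new password and generating a new hash. Biometric matching, by contrast, must tolerate sensor noise, so systems compare templates by similarity, and the underlying trait can never be rotated. The toy bit-string templates and threshold below are illustrative assumptions, not a real matcher:

```python
import hashlib

# Passwords: exact-match verification against a salted hash.
# A breach is fixed by picking a new password (and thus a new hash).
def verify_password(candidate, salt, stored_hash):
    digest = hashlib.sha256((salt + candidate).encode()).hexdigest()
    return digest == stored_hash

# Biometrics: every scan of the same finger differs slightly, so systems
# compare templates by similarity rather than exact equality. Here a
# template is a toy bit string matched by Hamming distance.
def verify_biometric(candidate_bits, enrolled_bits, max_distance=3):
    distance = sum(a != b for a, b in zip(candidate_bits, enrolled_bits))
    return distance <= max_distance

salt, pw = "s4lt", "correct horse"
stored = hashlib.sha256((salt + pw).encode()).hexdigest()
print(verify_password("correct horse", salt, stored))   # True: exact match
print(verify_password("Correct horse", salt, stored))   # False: one char off

enrolled   = "1011001110100110"
noisy_scan = "1011001010100111"  # same finger, slightly noisy capture
print(verify_biometric(noisy_scan, enrolled))           # True despite noise
```

The tolerance that makes biometrics usable is the same property that makes a leaked template so damaging: an attacker's forgery only has to be close enough, and the victim cannot enroll a "new" fingerprint the way they would set a new password.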