Criminal Used AI-Generated Voice of Child to Scam Parent


The advent of AI-generated voices has raised many concerns among the public, since the technology can be used to scam people or frame them for crimes. Today’s story is a particularly heinous case of scamming that embodies everything AI detractors fear.

Earlier this year, Arizona resident Jennifer DeStefano received a call that altered her life. The voice on the other end sounded just like her daughter Brie’s: “Mom! I messed up,” it screamed.

Why Was Someone Using an AI-Generated Version of Brie’s Voice?

The cries of what DeStefano thought to be her daughter were then interrupted by the voice of a man.

“Listen here, I have your daughter,” it said. “You call the police, you call anybody, I’m gonna pop her so full of drugs. I’m gonna have my way with her then drop her off in Mexico, and you’re never going to see her again.”

The man demanded $1 million from DeStefano in exchange for Brie’s release. Thankfully, the incident was soon revealed to be a scam: Jennifer quickly called Brie, who was safe and confused by her mother’s hysteria.

Fake or not, however, DeStefano will forever be traumatized by those four minutes she spent believing her daughter had been kidnapped.

A Classic Scam Becoming More Sophisticated

Imposter scams have been widespread for decades. In an older version of the scam, for example, the scammer would call someone’s grandparent and claim that their grandchild had been hurt and needed money.

With AI-generated voices, the scam becomes not only easier but more convincing. With under a minute of audio of someone’s voice, scammers can use cheap AI tools to create a clone.

Where do scammers get audio clips of their victims’ voices? Usually from social media.

“A scammer could use AI to clone the voice of your loved one,” the Federal Trade Commission said in a statement. “All he needs is a short audio clip of your family member’s voice — which he could get from content posted online — and a voice-cloning program. When the scammer calls you… [the voice will] sound just like your loved one.”

Can You Protect Yourself From AI Voice Scams?

Photo Credit: Teguhjatipras via Pixabay

There is no surefire way to protect yourself from these scams, but if you find yourself in a situation like DeStefano’s, common sense will be your best friend.

If the voice on the other end is demanding an exorbitant amount of money or the purchase of gift cards, you are likely being scammed. Like DeStefano, you should immediately call the person whose voice is being used and verify that what you’ve been told is actually true. If all else fails, contact law enforcement.
