Is Your Child Actually in Danger? Be Wary of Family Emergency Voice-Cloning Scams

The FTC warns that AI-powered voice-cloning software can make the impersonation scam seem even more authentic.


If you receive an unexpected phone call from a family member in trouble, be cautious: the person on the line could be a scammer using AI voice technology to impersonate your relative. The Federal Trade Commission has issued a warning about fraudsters using commercially available voice-cloning software to run family emergency scams.

These scams have been around for a long time, and they involve the perpetrator impersonating a family member, usually a child or grandchild. The fraudster will then call the victim and claim that they are in desperate need of money to deal with an emergency. According to the FTC, artificial intelligence-powered voice-cloning software can make the impersonation scam appear even more authentic, duping victims into handing over their money.

“All he (the scammer) needs is a short audio clip of your family member's voice—which he could get from content posted online—and a voice-cloning program. When the scammer calls you, he’ll sound just like your loved one,” the FTC says in its Monday warning.

The FTC did not immediately respond to a request for comment, leaving it unclear whether the US regulator has noticed an increase in voice-cloning scams. However, the warning comes just a few weeks after The Washington Post detailed how scammers are using voice-cloning software to prey on unsuspecting families.

In one case, a scammer used the technology to impersonate a Canadian couple's grandson, who claimed to be in jail. In another, fraudsters used voice cloning to steal $15,449 from a couple who were likewise duped into believing their son had been arrested.

The fact that voice-cloning services are becoming widely available on the internet isn't helping matters. As a result, these scams may become more prevalent over time, though at least a few AI-powered voice-generation providers are developing safeguards against potential abuse. The FTC says there is a simple way to detect a family emergency scam and keep yourself safe.

“Don’t trust the voice. Call the person who supposedly contacted you and verify the story. Use a phone number you know is theirs,” the FTC stated. “If you can’t reach your loved one, try to get in touch with them through another family member or their friends.”

Targeted victims should also consider asking the supposed family member in trouble a personal question the scammer would not know the answer to.

Labels: Cyber Fraud, data security, Fraud, Safety, Scam, Voice Cloning