AI 'Kidnapping' Scams: A Growing Threat

In a worrying trend, cybercriminals have begun using artificial intelligence (AI) to carry out virtual kidnapping schemes. These scams, which combine AI voice cloning with chatbots, have grown markedly more prevalent and pose a serious threat to the public.

The emergence of AI-powered voice cloning tools has given cybercriminals a potent means of executing virtual kidnapping scams. Using these tools, a perpetrator can mimic the voice of a target's family member or close acquaintance, creating a sense of urgency and fear. This psychological manipulation is designed to coerce the victim into meeting the scammer's demands, typically a ransom payment.

Moreover, advances in natural language processing and AI chatbots have made it easier for cybercriminals to hold convincing, sophisticated conversations with victims. These AI-driven chatbots can simulate human-like responses and sustain prolonged interactions, leading victims to believe they really are communicating with a loved one in distress.

The impact of these AI 'kidnapping' scams can be devastating, causing immense emotional distress and financial losses. Victims who fall prey to these scams often endure intense fear and anxiety, genuinely believing that their loved ones are in danger. The scammers take advantage of this vulnerability to extort money or personal information from the victims.

To combat this growing threat, law enforcement agencies and cybersecurity experts are actively working to raise awareness and develop countermeasures. It is crucial for individuals to be vigilant and educate themselves about the tactics employed by these scammers. Recognizing the signs of a virtual kidnapping scam, such as sudden demands for money, unusual behavior from the caller, or inconsistencies in the story, can help potential victims avoid falling into the trap.

Technology companies and AI developers must also take a proactive approach. Strict safeguards are needed to prevent the abuse of voice cloning technology, and deploying detection algorithms that can identify and block malicious chatbots and synthetic audio may help deter attackers.
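To make the idea of synthetic-audio detection concrete, here is a minimal sketch of a classifier that separates recordings based on spectral (MFCC) features. It is illustrative only: the toy waveforms below stand in for real labeled recordings, and the librosa/scikit-learn pipeline is just one of many possible choices, not a production detector.

```python
# Toy synthetic-speech detector: MFCC features + logistic regression.
# Assumptions: librosa and scikit-learn are installed; the generated
# waveforms are placeholders for real labeled audio clips.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

SR = 16000  # sample rate in Hz

def mfcc_features(signal, sr=SR):
    """Summarize a waveform as the mean of its 13 MFCC coefficients."""
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)

# Placeholder "recordings": noisy tones play the role of genuine speech,
# pure tones the role of cloned speech.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, SR, endpoint=False)
genuine = [np.sin(2 * np.pi * 220 * t) + 0.1 * rng.standard_normal(SR) for _ in range(20)]
cloned = [np.sin(2 * np.pi * 220 * t) for _ in range(20)]

X = np.array([mfcc_features(s) for s in genuine + cloned])
y = np.array([0] * len(genuine) + [1] * len(cloned))  # 0 = genuine, 1 = cloned

clf = LogisticRegression(max_iter=1000).fit(X, y)
probe = np.sin(2 * np.pi * 220 * t)  # an unseen "cloned" sample
print("P(cloned):", clf.predict_proba([mfcc_features(probe)])[0, 1])
```

A real detector would be trained on large corpora of genuine and synthesized speech and would use far richer features, but the overall shape (extract acoustic features, then score them with a trained model) is the same.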

AI Voice Cloning Technology Is Evoking Fear Among People


A mother in America heard a voice on her phone that seemed chillingly real: her daughter, apparently sobbing, before a man's voice took over and demanded a ransom. But the girl on the phone was an AI clone, and the abduction was fake.

According to some cybersecurity experts, the biggest threat posed by AI is its ability to erase the line between reality and fiction, handing cybercriminals a simple and efficient tool for spreading misinformation.

AI Voice-cloning Technologies

“Help me, mom, please help me,” Jennifer DeStefano, an Arizona resident, heard from the other end of the line.

She says she was “100 percent” convinced it was her 15-year-old daughter, her voice seemingly distressed. Her daughter was, at the time, away on a ski trip.

"It was never a question of who is this? It was completely her voice... it was the way she would have cried," told DeStefano to a local television station in April.

It was only later in the call, which came from a private number, that the fraudster took over and demanded up to $1 million.

The AI-powered deception ended as soon as DeStefano reached her daughter. But the horrifying incident, now the subject of a police investigation, highlighted how fraudsters can abuse AI voice clones.

This is not, however, an isolated case. Fraudsters are employing remarkably convincing AI voice cloning tools, publicly accessible online, to steal from victims by impersonating their family members, a new generation of scheme that has alarmed US authorities.

Another such case comes from Chicago, where the grandfather of 19-year-old Eddie received a call from someone whose voice sounded just like his grandson's, claiming to need money after a 'car accident'.

The hoax, reported by McAfee Labs, was so persuasive that before the deceit was revealed, his grandfather scrambled to gather money and even considered remortgaging his home.

"Because it is now easy to generate highly realistic voice clones... nearly anyone with any online presence is vulnerable to an attack[…]These scams are gaining traction and spreading," Hany Farid, a professor at the UC Berkeley School of Information, told AFP.

Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, told AFP: "We're fast approaching the point where you can't trust the things that you see on the internet."

"We are going to need new technology to know if the person you think you're talking to is actually the person you're talking to," he said.  

Is Your Child Really in Danger? Beware of Family Emergency Voice-Cloning Frauds

If you receive an unusual phone call from a family member in trouble, be cautious: the other person on the line could be a scammer impersonating a family member using AI voice technologies. The Federal Trade Commission has issued a warning about fraudsters using commercially available voice-cloning software for family emergency scams. 

These scams have been around for a long time, and they involve the perpetrator impersonating a family member, usually a child or grandchild. The fraudster will then call the victim and claim that they are in desperate need of money to deal with an emergency. According to the FTC, artificial intelligence-powered voice-cloning software can make the impersonation scam appear even more authentic, duping victims into handing over their money.

“All he (the scammer) needs is a short audio clip of your family member's voice—which he could get from content posted online—and a voice-cloning program. When the scammer calls you, he’ll sound just like your loved one,” the FTC says in the Monday warning.

The FTC did not immediately respond to a request for comment, leaving it unclear whether the US regulator has noticed an increase in voice-cloning scams. However, the warning comes just a few weeks after The Washington Post detailed how scammers are using voice-cloning software to prey on unsuspecting families.

In one case, the scammer used the technology to impersonate a Canadian couple's grandson, who claimed to be in jail. In another, fraudsters used voice cloning to steal $15,449 from a couple duped into believing their son had been arrested.

The fact that voice-cloning services are becoming widely available on the internet isn't helping matters, and scams may well grow more prevalent over time, though at least a few AI-powered voice-generation providers are developing safeguards against abuse. The FTC says there is an easy way to detect a family emergency scam and keep consumers safe.

“Don’t trust the voice. Call the person who supposedly contacted you and verify the story. Use a phone number you know is theirs,” the FTC stated. “If you can’t reach your loved one, try to get in touch with them through another family member or their friends.”

Targeted victims should also consider asking the supposed family member in trouble a personal question that the scammer could not answer.