
AI Voice Cloning Technology Evokes Fear Among People


A mother in America heard a voice on her phone that seemed chillingly real: her daughter, apparently sobbing, before a man's voice took over and demanded a ransom. However, the girl on the phone was an AI clone, and her abduction was, well, fake.

According to some cybersecurity experts, the biggest threat of AI is its ability to blur the line that separates reality from fiction, handing cybercriminals a simple and efficient tool for spreading misinformation.

AI Voice-cloning Technologies

“Help me, mom, please help me,” Jennifer DeStefano, an Arizona resident, heard from the other end of the line.

She says she was “100 percent” convinced that the distressed voice belonged to her 15-year-old daughter, who was away on a skiing trip at the time.

"It was never a question of who is this? It was completely her voice... it was the way she would have cried," told DeStefano to a local television station in April.

The fraudster then took over the call, which came from a private number, and demanded up to $1 million.

The AI-powered deception ended as soon as DeStefano made contact with her daughter. The horrifying incident, which is currently the subject of a police investigation, nonetheless highlighted how fraudsters can abuse AI voice clones.

This is, however, not an isolated case. Fraudsters are employing remarkably convincing AI voice cloning tools, publicly accessible online, to steal from victims by impersonating their family members in a new generation of schemes that has alarmed US authorities.

Another such case comes from Chicago, where 19-year-old Eddie's grandfather received a call from someone whose voice sounded just like his grandson, claiming he needed money after a 'car accident'.

The hoax, reported by McAfee Labs, was so persuasive that the grandfather scrambled to gather money and even considered remortgaging his house before the deceit was uncovered.

"Because it is now easy to generate highly realistic voice clones... nearly anyone with any online presence is vulnerable to an attack[…]These scams are gaining traction and spreading," Hany Farid, a professor at the UC Berkeley School of Information, told AFP.

Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, further told AFP: "We're fast approaching the point where you can't trust the things that you see on the internet."

"We are going to need new technology to know if the person you think you're talking to is actually the person you're talking to," he said.