Study: Artificial Intelligence is Fueling a Rise in Online Voice Scams

A study warns that artificial intelligence is fueling a rise in online voice scams.

 

According to McAfee, AI technology is fueling an increase in online voice scams, with only three seconds of audio required to clone a person's voice. McAfee surveyed 7,054 people across seven countries and found that one-quarter of adults had already experienced some form of AI voice scam, with one in ten targeted directly and 15% reporting that it happened to someone they know. Of those who fell victim, 77% reported losing money as a result.

Furthermore, McAfee Labs security researchers have published their findings following an in-depth investigation of AI voice-cloning technology and its use by cybercriminals. Everyone's voice is distinct, like a biometric fingerprint, which is why hearing someone speak is widely treated as proof of identity - and why scammers replicating voices with AI is so dangerous.

However, with 53% of adults giving their speech data online at least once a week (through social media, voice notes, and other means) and 49% doing so up to 10 times a week, copying how someone sounds is now a potent tool in a cybercriminal's inventory.

With artificial intelligence tools growing in popularity and ease of use, it is now simpler than ever to manipulate photos, videos, and, perhaps most alarmingly, the voices of friends and family members. According to McAfee's research, scammers are using AI to clone a voice and then send a phony voicemail, or call the victim's contacts directly, claiming to be in crisis - and with 70% of adults unsure they could tell a cloned voice from the real thing, it is no surprise this technique is gaining momentum.

45% of respondents stated they would respond to a voicemail or voice message claiming to be from a friend or loved one in need of money, especially if they believed the request came from their partner or spouse (40%), parent (31%), or child (20%).
 
Parents aged 50 and up were the group most likely to respond to a child, at 41%. Messages claiming the sender had been in a car accident (48%), been robbed (47%), lost their phone or wallet (43%), or needed help while travelling abroad (41%) were the most likely to prompt a response.

The cost of falling for an AI voice scam can be substantial: more than a third of those who lost money said it cost them over $1,000, and 7% were duped out of between $5,000 and $15,000. The survey also found that the rise of deepfakes and disinformation has made people more skeptical of what they see online, with 32% of adults saying they now trust social media less than they used to.

“Artificial intelligence brings incredible opportunities, but with any technology, there is always the potential for it to be used maliciously in the wrong hands. This is what we’re seeing today with the access and ease of use of AI tools helping cybercriminals to scale their efforts in increasingly convincing ways,” said Steve Grobman, McAfee CTO.

As part of their analysis of this emerging threat, McAfee researchers spent three weeks studying the accessibility, ease of use, and effectiveness of AI voice-cloning tools, finding more than a dozen freely available on the internet.

Both free and paid tools are available, and many require only a basic level of skill to use. In one case, three seconds of audio was enough to produce an 85% voice match, and with additional time and effort the accuracy can be increased further.

By training the data models, McAfee researchers achieved a 95% voice match from only a small number of audio samples. The more realistic the clone, the better a cybercriminal's chances of duping someone into handing over money or taking some other desired action. Exploiting the emotional bonds inherent in close relationships, a fraudster could make thousands of dollars in only a few hours.

“Advanced artificial intelligence tools are changing the game for cybercriminals. Now, with very little effort, they can clone a person’s voice and deceive a close contact into sending money,” said Grobman.

“It’s important to remain vigilant and to take proactive steps to keep you and your loved ones safe. Should you receive a call from your spouse or a family member in distress and asking for money, verify the caller – use a previously agreed codeword, or ask a question only they would know. Identity and privacy protection services will also help limit the digital footprint of personal information that a criminal can use to develop a compelling narrative when creating a voice clone,” concluded Grobman.

McAfee's researchers found they had no difficulty replicating accents from around the world, whether American, British, Indian, or Australian, but more distinctive voices were harder to copy.

For example, the voice of someone who speaks at an unusual pace, rhythm, or style requires more effort to effectively clone and is thus less likely to be targeted. The research team's overarching conclusion was that artificial intelligence has already changed the game for cybercriminals. The barrier to entry has never been lower, making it easier to perpetrate cybercrime.

To protect against AI voice cloning, McAfee recommends the following:

- Establish a unique verbal "codeword" with trusted family members or friends, and always ask for it if they contact you for help, especially if they are elderly or vulnerable.
- When you receive a call, text, or email, question the source and consider whether the request seems legitimate. If in doubt, hang up and contact the person directly to verify the story before responding or sending money.
- Be cautious about sharing personal information online, and think carefully about who is in your social media network.
- Consider using an identity monitoring service to protect your personally identifiable information and prevent cybercriminals from posing as you.
