
AI Scams: When Your Child's Voice Isn't Their Own

 

A new breed of fraud has recently surfaced, preying on unwary victims with cutting-edge artificial intelligence. A particularly alarming development is the use of AI-generated voice calls, in which con artists imitate a child's voice to convince parents they are speaking with their own child, when in fact they are being taken in by an AI hoax.

These AI fraud calls are a growing problem for law enforcement agencies and families around the world. Using advanced AI speech technology, scammers imitate a child's voice to convince parents that their child is in distress and needs money right away.

Numerous high-profile incidents have been reported, drawing attention and leaving parents feeling exposed and uneasy. One mother described receiving a frightening call from what sounded like her own daughter, claiming to have been caught up in a kidnapping. In a panic, and in a desperate attempt to protect her child, she paid the con artists a sizeable sum of money, only to learn later that the voice was AI-generated and that her daughter had been safe the entire time.

The widespread reporting of these cases makes awareness campaigns and preventative measures urgently necessary. To understand how these frauds work, it is crucial to realize that AI-generated voices have reached a remarkable level of sophistication and are now almost indistinguishable from real human voices. Fraudsters use this technology to manipulate emotions, relying on parents' natural desire to protect their children at all costs.

In response to the growing concern, technology companies and law enforcement agencies are collaborating to fight these AI scams. One approach involves improving voice recognition software so it can more accurately identify AI-generated audio. Staying a step ahead of the schemes is difficult, however, because con artists are constantly changing their tactics.

Experts stress the importance of staying vigilant and taking proactive steps to protect yourself and your loved ones from falling for such fraud. If parents receive an unexpected call asking for money, especially under upsetting circumstances, it is essential to verify the caller's identity through other channels. The situation can be checked by contacting the child directly or by asking a trusted relative or friend to do so.

Children must be taught about AI scams in order to avoid accidentally disclosing personal information that scammers could use against them. Parents should talk to their children about the dangers of giving out personal information over the phone or online and highlight the need to always confirm a caller's identity, even if they seem familiar.

Technology is always evolving, creating both opportunities and risks. The rise of AI scams shows how fraudsters can exploit modern tools to prey on people's vulnerabilities. Technology companies, law enforcement, and individuals all need to work together to combat these scams, and people must stay informed, careful, and proactive to keep themselves and their loved ones from becoming victims of AI fraud.

Kidnapping Scam Implicates AI Cloning

 


Artificial intelligence (AI) has gained traction as ChatGPT and similar products bring the technology to everyday users. The major technology companies Google, Microsoft, and Meta all appear to be investing heavily in AI and concentrating their efforts on it.

A woman recently described her experience with AI-based fraud in a Facebook post. She urged people to protect themselves against similar incidents by creating a secret family word or question known only to family members, so they can confirm that a caller really is who they claim to be rather than an AI impersonation. She also encouraged people to share the story on social media to spread the word.

In the last few years, AI tools have made it possible for scammers to exploit human emotions and steal millions of dollars from their victims. In this case, an organized group of fraudsters used a cloned voice to convince a girl's mother that her daughter had been kidnapped.

According to reports from WKYT, a CBS-affiliated US news outlet, an Arizona woman named Jennifer DeStefano received a call from an unknown number a few days ago. In an interview with the outlet, DeStefano explained that her 15-year-old daughter was away on a skiing trip when the incident occurred.

When DeStefano picked up the phone, she heard what sounded like her daughter crying and sobbing, calling for her mother's help. "She said, 'Mom, these criminal men have me. Help me, help me.'"

Then a man came on the line. "He said, 'Listen, I've got your daughter,'" DeStefano recalled, describing exactly how the call unfolded.

The man demanded a ransom of USD 1 million to release the teenager. When DeStefano told the 'kidnapper' that she did not have that much money, he lowered the demand to USD 50,000.

The man went on to threaten, "I'm going to have my way with her and drop her off in Mexico," and at that moment, DeStefano said, "I just started shaking." In the background, she could hear her daughter yelling, "Help me, Mom! Please help me. Help me," and bawling.

DeStefano said she took the call at a dance studio, surrounded by other mothers.

A first call was made to 911 and a second to DeStefano's husband. Within minutes, she confirmed that her teenage daughter was safe on her skiing trip. She said, however, that when she answered the phone, the voice on the line had sounded exactly like her daughter's.

In an interview with NBC 15, she told the network that she had been completely convinced her daughter was on the line rather than an AI-generated voice.

According to Subbarao Kambhampati, a computer science professor and artificial intelligence expert at Arizona State University, voice cloning used to require a large number of voice samples. Now it can be done with just three seconds of someone's voice, and with those three seconds a clone can come close to exactly how that person sounds.

With a large enough sample, the professor added, AI can also mimic accents and emotions.

According to a post on the mother's Facebook page, DeStefano was particularly unnerved by the voice simulation because Brie has no public social media accounts where her voice could be heard and barely uses social media to communicate at all.

"In regards to Brie's voice, she has several interviews which she does for sports/school, in which a significant portion is her own." Brie's mother explained. "Children with public accounts should, however, be extra cautious. This should be taken very seriously."

FBI experts warn that fraudsters often find their targets on social media. The police are investigating the case, but the fraudsters have not yet been identified or apprehended.