
Sophos: Hackers Avoid Deepfakes as Phishing Attacks Are Effective

The fear of deepfake scams is overblown, according to a senior security adviser at the UK-based infosec firm Sophos.

John Shier, senior security adviser at Sophos, says hackers may never need to use deepfakes on a large scale because there are other, more effective ways to deceive people into giving up personal information and financial data.

According to Shier, phishing and other forms of social engineering are far more effective than deepfakes, which are AI-generated videos that mimic a real person's appearance and speech.

What are deepfakes?

Scammers frequently use the technology to carry out identity theft. To demonstrate the risks of deepfakes, researchers in 2018 used the technology to impersonate former US President Barack Obama and spread a hoax video online.

Shier believes that while deepfakes may be overkill for some kinds of fraud, they could be put to good use in romance scams, in which a fraudster builds a close online relationship with the victim to persuade them to send money. A convincing video would lend an online persona inherent legitimacy.

Since deepfake technology has become easier to access and use, Eric Horvitz, chief scientific officer at Microsoft, predicts that in the near future, "we won't be able to tell if the person we're chatting to on a video conversation is real or an impostor."

The expert also anticipates that deepfakes will become more common in several areas, including romance scams. Building a convincing fake persona already demands a significant commitment of time, effort, and dedication, and adding a deepfake requires little extra work. Shier is concerned that deepfaked romance scams could become a serious problem if AI enables con artists to operate at scale.

Shier was reluctant to put a date on industrialized deepfake bots, but he noted that the required technology is improving every year.

"AI experts make it sound like it is still a few years away from the huge effect," the researcher noted. In the meantime, we will see well-funded criminal groups carrying out the next level of compromise to trick victims into paying money into their accounts.

Historically, deepfakes have been used primarily to produce sexualized images and videos, almost always featuring women.

Nevertheless, a Binance PR executive recently disclosed that fraudsters had built a deepfaked clone of him that joined Zoom calls and attempted to carry out cryptocurrency scams.

Deepfakes may not necessarily be a scammer's primary tactic, but security researchers at Trend Micro said last month that they are frequently used to augment other techniques. The lifelike computer-generated likenesses have recently appeared in online advertisements, phony business meetings, and job-seeker frauds. The worry is that, because the internet is so pervasive, anybody could become a victim.

Binance Executive: Scammers Created a 'Deep Fake Hologram' of Him to Fool Victims


According to a Binance public relations executive, fraudsters created a deep-fake "AI hologram" of him to scam cryptocurrency projects via Zoom video calls.

Patrick Hillmann, chief communications officer at the cryptocurrency exchange, said that over the past month he had received messages from project teams thanking him for meeting with them virtually to discuss listing their digital assets on Binance. This raised suspicions because Hillmann is not involved in the exchange's listings and did not know the people messaging him.

"It turns out that a sophisticated hacking team used previous news interviews and TV appearances over the years to create a 'deep fake' of me," Hillmann said. "Other than the 15 pounds that I gained during COVID being noticeably absent, this deep fake was refined enough to fool several highly intelligent crypto community members."

In his write-up this week, Hillmann included a screenshot of a project manager asking him to confirm that he was, in fact, on a Zoom call. The hologram is the latest example of cybercriminals impersonating Binance employees and executives on Twitter, LinkedIn, and other social media platforms.

Scams abound in the cryptocurrency world
Despite highlighting the wealth of security experts and systems at Binance, Hillmann insisted that users must be the first line of defence against scammers. He wrote that they can do so by being vigilant, using the Binance Verify tool, and reporting anything suspicious to Binance support.

“I was not prepared for the onslaught of cyberattacks, phishing attacks, and scams that regularly target the crypto community. Now I understand why Binance goes to the lengths it does,” he added.

The only proof Hillmann provided was a screenshot of a chat with someone asking him to confirm a Zoom call they had previously had. Hillmann responds: “That was not me,” before the unidentified person posts a link to somebody’s LinkedIn profile, telling Hillmann: “This person sent me a Zoom link then your hologram was in the zoom, please report the scam”.

The fight against deepfakes
Deepfakes are becoming more common in the age of misinformation and artificial intelligence, as technological advancements make convincing digital impersonations of people online more viable.

These sometimes highly realistic fabrications have sparked global outrage, particularly when used in a political context. In March of this year, a deepfake video of Ukrainian President Volodymyr Zelenskyy was posted online, showing a digital impersonation of the leader telling citizens to surrender to Russia.

On Twitter, one version of the deepfake was viewed over 120,000 times. In its fight against disinformation, the European Union has targeted deepfakes, recently requiring tech companies such as Google, Facebook, and Twitter to take countermeasures or face heavy fines.