
BreachForums Founder Resentenced to Three Years After Appeal


In a significant legal outcome for the cybersecurity landscape, Conor Fitzpatrick, the founder of the notorious BreachForums underground hacking site, has been resentenced to three years in federal prison after appeals overturned his previous lenient sentence. 

Fitzpatrick, who operated under the alias Pompompurin, was originally arrested in March 2023 for running the forum and faced multiple charges: access device conspiracy, access device solicitation, and possession of child sexual abuse material (CSAM). He pleaded guilty to all counts in January 2024 and was initially handed 17 days in jail and 20 years of supervised release, a punishment prosecutors sharply criticized as dramatically insufficient given the gravity of his crimes. 

Appeals and resentencing 

The U.S. Court of Appeals for the Fourth Circuit agreed with prosecutors, declaring the original sentence “substantively unreasonable” for failing to serve proper sentencing purposes. This led to Fitzpatrick’s resentencing and a harsher three-year prison term.

BreachForums, which emerged in March 2022 as a successor to the dismantled RaidForums, became one of the most active online marketplaces for stolen data and compromised credentials. At its peak, it hosted more than 14 billion individual records and counted 330,000 members among its user base. U.S. authorities emphasized that Fitzpatrick “personally profited from the sale of vast quantities of stolen information,” ranging from private personal details to sensitive commercial data. 

Despite repeated law enforcement takedown attempts, BreachForums managed to resurface multiple times, illustrating the resilience of such underground communities. The arrest of Baphomet, the admin who took over after Fitzpatrick was detained, did little to slow the forum; it slipped into the hands of ShinyHunters, a cybercriminal group linked to several high-profile data breaches. 

As of mid-September 2025, BreachForums is offline, with its maintainers announcing a decision to “go dark”, a phrase that suggests a strategic retreat rather than a permanent closure. This mirrors the recent moves of other infamous cybercrime collectives such as Lapsus$ and Scattered Spider, which have also vanished from the digital underground, at least for now.

Context and implications 

The case of Conor Fitzpatrick and BreachForums highlights the challenges of prosecuting transnational cybercrime and the difficulties law enforcement faces in permanently dismantling underground hacking forums. Despite impressive numbers—14 billion records, hundreds of thousands of members—the legal outcome for operators is often uncertain, with initial sentences sometimes appearing disproportionately light compared to the scale of the harm caused.

The resentencing of Fitzpatrick marks a tightening stance by the U.S. Department of Justice, signaling that courts are now more willing to impose harsher penalties on those who profit from stolen data and operate platforms that enable large-scale cybercrime. Yet, even as high-profile forums like BreachForums disappear, the enduring cycle of takedown, migration, and reemergence of similar platforms suggests that the broader threat will persist as long as demand for stolen data remains high.

EU Proposes New Law to Allow Bulk Scanning of Chat Messages


The European elections have ended, and the European football tournament is in full flow; why not allow bulk searches of people's private communications, including encrypted ones? Activists around Europe are outraged by the proposed European Union legislation. 

EU governments were due to vote on Thursday at a key Permanent Representatives Committee meeting, though even approval there would not have been the final hurdle for the legislation, which aims to identify child sexual abuse material (CSAM). At the last minute, the contentious item was pulled from the agenda.

However, experts believe that if the EU Council approves the Chat Control regulation, whether sooner or later, it will eventually be enacted at the end of a difficult political process. Activists have therefore asked Europeans to take action and keep up the pressure.

EU Council deaf to criticism

A regulation requiring chat services such as Facebook Messenger and WhatsApp to sift through users' private chats in order to look for grooming and CSAM was first put forward in 2022.

Needless to say, privacy experts denounced it, with cryptography professor Matthew Green stating that the document described "the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR.” 

“Let me be clear what that means: to detect “grooming” is not simply searching for known CSAM. It isn’t using AI to detect new CSAM, which is also on the table. It’s running algorithms reading your actual text messages to figure out what you’re saying, at scale,” stated Green. 

However, the EU has not backed down, and the draft law is currently working its way through the system. More specifically, the proposed law would establish an "upload moderation" system to analyse all digital messages, including shared images, videos, and links.

The document is rather wild. Consider end-to-end encryption: on the one hand, the proposed legislation states that it is vital, but it also warns that encrypted messaging platforms may "inadvertently become secure zones where child sexual abuse material can be shared or disseminated." 

The method appears to involve scanning message content on the device before it is encrypted by apps such as WhatsApp, Messenger, or Signal, an approach commonly referred to as client-side scanning. That sounds unconvincing, and it most likely is.
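To make the idea concrete, the sketch below shows roughly how client-side scanning is usually described: the app fingerprints an attachment and checks it against a list of known-CSAM hashes before the message reaches the encryption layer. This is an illustrative assumption, not the EU's actual specification; real deployments would use perceptual hashes such as PhotoDNA rather than SHA-256, and the blocklist and the `encrypt_and_send` callback here are hypothetical.

```python
import hashlib

# Hypothetical blocklist of fingerprints of known illegal images.
# Real systems use perceptual hashes (e.g. PhotoDNA), not SHA-256.
KNOWN_BAD_FINGERPRINTS: set[str] = set()

def fingerprint(attachment: bytes) -> str:
    """Fingerprint the attachment while it is still plaintext."""
    return hashlib.sha256(attachment).hexdigest()

def send_attachment(attachment: bytes, encrypt_and_send) -> bool:
    """Client-side scanning sketch: scan first, encrypt afterwards.

    The point critics make is that the check runs on plaintext, on the
    user's own device, before end-to-end encryption applies at all.
    """
    if fingerprint(attachment) in KNOWN_BAD_FINGERPRINTS:
        # In the proposed scheme a match would trigger a report
        # instead of (or as well as) blocking the upload.
        return False
    encrypt_and_send(attachment)  # normal E2EE path, e.g. Signal protocol
    return True
```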

Even if the regulation is approved by EU countries, additional problems may arise once the general public becomes aware of what is at stake. According to a study conducted last year by the European Digital Rights group, 66% of young people in the EU oppose the idea of having their private messages scanned.

Grooming Cases Reach Unprecedented Heights Amidst Regulatory Delays

Campaigners are calling for no further delays to the Government's Online Safety Bill, after thousands of online grooming offences were recorded while updated online safety laws were awaited.

The long-awaited bill has gone through repeated changes and delays on its way to becoming law, which is now expected in the autumn. Ministers have also come under fire in recent days from tech companies over what they see as an attempt by the government to undermine encryption technology.

The NSPCC has called for the bill to be supported after announcing that, over the last six years, UK police forces have recorded 34,000 online grooming offences against children and young people. The charity began calling for more robust online safety regulation in 2017.

NSPCC statistics, based on data obtained from 42 UK police forces, show that 6,350 incidents of sexual communication with a child were recorded last year, an increase of 82 per cent since the offence was introduced in 2017/18.

Moreover, the charity noted that in 83 per cent of social media grooming cases over the last six years where the victim's gender could be determined, the victims were girls. According to the police data, more than 150 apps, games and websites were also used to target children. The NSPCC believes the Bill is indispensable if children are to be protected from abuse and neglect.

If the law passes, firms and big tech bosses will have to take on stricter responsibilities for protecting young users. Nevertheless, the NSPCC wants assurances that new technologies, including artificial intelligence, will also be covered by the legislation.

A study of the data shows that 73% of the reported crimes involved either Snapchat or an associated website, and 5,500 of the incidents involved children between the ages of 5 and 12. With the summer recess ending in a few weeks, parliament will resume sessions to wrap up debate on the bill, which is expected to be passed soon after.

A severe impasse in the UK is threatening the future of end-to-end encryption. Growing numbers of tech companies offer encrypted messaging services to meet users' demands for greater privacy: with end-to-end encryption, a message can be viewed only by the sender and the recipient, and in most cases not even the tech companies themselves can access it.
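As a rough illustration of why providers cannot simply hand over message content, the sketch below uses the PyNaCl library (a Python binding to libsodium) to encrypt a message to a recipient's public key, so that only the holder of the matching private key can read it. The key handling and names are deliberately simplified assumptions; real messengers such as Signal or WhatsApp use the more elaborate Signal protocol with key ratcheting rather than a single static key pair.

```python
from nacl.public import PrivateKey, Box

# Each party generates a key pair; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"Meet at 6?")

# The server only ever sees `ciphertext`; without Bob's private key
# it cannot recover the plaintext.
receiving_box = Box(bob_private, alice_private.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"Meet at 6?"
```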

Most of us would agree that privacy is something we cherish, but pursuing it carries an element of risk that cannot be ignored. Depending on the platform, these privacy features are available to everyone, and the platforms argue that they offer extra protection for people such as victims of domestic abuse, journalists and political activists. They also argue that adding a backdoor to their services would undermine security for everybody.

Although the tech industry and legislators agree that something needs to change, the trade-off between privacy and security has prevented any meaningful progress from being made.

The latest draft of the Online Safety Bill demands what amounts to a backdoor giving the authorities the ability to access content on social media services for surveillance purposes.

Nevertheless, tech companies are concerned that loosening these protections might give hackers and data thieves a window of opportunity to wreak havoc on our sensitive information by exploiting any loopholes. It is generally considered that social media platforms prefer developing their own safety precautions to prevent the spread of child sexual abuse material (CSAM), rather than being pushed into a mandated proactive approach. They also use updates to tighten their grip on other forms of harmful and age-restricted content, so that children do not encounter it.

Even with the efforts of each individual company, the statistics indicate that the epidemic of online child grooming continues to worsen – an epidemic exacerbated by social media's unintentional role as a smokescreen for online child maltreatment. 

The NSPCC's chief executive, Sir Peter Wanless, said the research demonstrates the sheer scale of child abuse taking place on social media and the human cost of fundamentally unsafe products. Given the number of offences committed against children online, he argued, it is imperative to remember how important the Online Safety Bill is and why children need the ground-breaking protection it will provide.

Amid the ongoing clash between Silicon Valley giants and government regulators, there has been speculation that the UK communications regulator, Ofcom, might step in and bring about changes affecting the entire industry. If the right balance can be struck, it will be interesting to see whether both sides can benefit. What is beyond doubt, though, is that getting the job done will unequivocally require all hands on deck.