Meta Begins Removing Under-16 Users Ahead of Australia’s New Social Media Ban

Under Australia's new age-restriction law, children under 16 will no longer be allowed to hold social media accounts.

Meta has started taking down accounts belonging to Australians under 16 on Instagram, Facebook and Threads, beginning a week before Australia’s new age-restriction law comes into force. The company recently alerted users it believes are between 13 and 15 that their profiles would soon be shut down, and the rollout has now begun.

Current estimates suggest that hundreds of thousands of accounts across Meta's platforms will be affected. Because Threads runs on Instagram credentials, any underage Instagram account will also lose access to Threads.

Australia’s new policy, which takes full effect on 10 December, prevents anyone under 16 from holding an account on major social media sites. The law is the first of its kind globally. Platforms that fail to take meaningful action face penalties of up to 49.5 million Australian dollars. The responsibility to monitor and enforce the age limit rests with the companies, not parents or children.

A Meta spokesperson explained that following the new rules will require ongoing adjustments, as compliance involves several layers of technology and review. The company has argued that the government should shift age verification to app stores, where users could verify their age once when downloading an app. Meta claims this would reduce the need for children to repeatedly confirm their age across multiple platforms and may better protect privacy.

Before their accounts are removed, underage users can download and store their photos, videos and messages. Those who believe Meta has made an incorrect assessment can request a review and prove their age by submitting government identification or a short video-based verification.

The new law covers a wide range of services, including Facebook, Instagram, Snapchat, TikTok, Threads, YouTube, X, Reddit, Twitch and Kick. Platforms designed for younger audiences, education tools such as YouTube Kids and Google Classroom, and messaging apps like WhatsApp are exempt. Authorities have also been examining whether children are shifting to lesser-known apps, and companies behind emerging platforms like Lemon8 and Yope have already begun evaluating whether they fall under the new rules.

Government officials have stated that the goal is to reduce children’s exposure to harmful online material, which includes violent content, misogynistic messages, eating disorder promotion, suicide-related material and grooming attempts. A national study reported that the vast majority of children aged 10 to 15 use social media, with many encountering unsafe or damaging content.

Critics, however, warn that age verification tools may misidentify users, create privacy risks or fail to stop determined teenagers from using alternative accounts. Others argue that removing teens from regulated platforms might push them toward unmonitored apps, reducing online safety rather than improving it.

Australian authorities expect challenges in the early weeks of implementation but maintain that the long-term goal is to reduce risks for the youngest generation of online users.



Labels: Age Restriction, Australia, Facebook, Instagram, Meta, Social Media, Technology