Meta has begun removing the accounts of approximately 500,000 Australian teenagers under 16 from Instagram, Facebook, and Threads, roughly a week before the nationwide social media ban for under-16s takes effect on December 10, 2025.
The company began notifying users aged 13 to 15 in November via email, SMS, and in-app messages, giving them time to download their data before deactivations began on December 4.
The proactive step puts Meta in compliance with Australia’s strict new law and has been welcomed by officials, including eSafety Commissioner Julie Inman Grant, as a measure to protect youth mental health, though Meta points to the difficulty of age verification and some parents worry about teens losing social connections.
Platforms like TikTok, Snapchat, YouTube, X, Reddit, and Kick face similar restrictions, with fines of up to 49.5 million Australian dollars for non-compliance.
Law’s Urgent Timeline
Australia’s parliament passed the world’s strictest social media age restrictions, requiring platforms to block users under 16 from holding accounts, with no exemption for parental consent, from December 10.
Meta accelerated enforcement, deactivating accounts from December 4 and blocking new sign-ups by under-16s, with the aim of completing removals by the deadline. The move follows months of debate over online harms affecting young users, including cyberbullying, body image issues, and exposure to inappropriate content.
The policy targets mainstream platforms but exempts Meta’s Messenger app, preserving some communication options for teens. Notifications urge affected users to update their contact details so accounts can be reactivated when they turn 16, and to download their content and data. Prime Minister Anthony Albanese supports the ban as a way to let kids be kids, while lawmakers in the UK and several US states are watching closely.
Scale and Enforcement Challenges
Meta reports more than 150,000 monthly active users aged 13 to 15 on Facebook and 350,000 on Instagram in Australia, totalling around 500,000 accounts across its platforms, including Threads, which is linked to Instagram.
Teens who dispute their removal must prove their age via facial scans or ID, a measure aimed at the widespread use of fake birthdates to evade age restrictions. Antigone Davis, Meta’s global head of safety, called compliance an ongoing, multi-layered process and acknowledged that detection is imperfect.
Mia Garlick, Meta’s regional policy director, emphasised that user data would be preserved: “For all our users aged 15 and under, we understand the importance of the treasured memories, connections, and content within your accounts.”
eSafety Commissioner Julie Inman Grant praised Meta’s initiative but stressed the need for ongoing dialogue with families, educators, and industry to balance safety and digital rights. Around 2 million Australians aged 12 to 17 use social media regularly, highlighting the ban’s broad reach.
Stakeholder Reactions
Officials view Meta’s early action positively amid rising concern over the impact of social media on youth mental health. Parents and advocacy groups worry about isolating teens who rely on social media for friendships, especially in rural areas. Tech firms including Meta initially opposed the measure as a blunt instrument but have pledged compliance to avoid hefty fines.
This marks the first mass shutdown of teen accounts under a government mandate, setting a precedent. Affected users receive two weeks’ notice, and frozen accounts can be reactivated once the holder turns 16. The law responds to research on the risks social media poses to young people and is pushing platforms toward stronger safeguards worldwide.
The Logical Indian’s Perspective
The Logical Indian applauds Australia’s bold step and Meta’s compliance in shielding children from digital pitfalls, fostering environments of kindness, empathy, and mental well-being essential for harmonious growth.
Yet enforcement must be paired with education, parental guidance, and inclusive dialogue to avoid unintended isolation, ensuring tech serves as a tool for positive connections rather than harm.

