
Children across Australia awoke on Wednesday without access to their social media accounts due to the world’s first ban aimed at safeguarding minors under 16 from addictive algorithms, online predators, and digital bullies.
No other nation has enacted such extensive measures, and the implementation of this strict new law is being closely observed by lawmakers globally.
The 10 prohibited platforms—Instagram, Facebook, Threads, Snapchat, YouTube, TikTok, Kick, Reddit, Twitch, and X—state they will adhere to the ban by employing age verification technologies to identify and block under-16 users’ accounts, though they don’t believe this will make children safer.
Prime Minister Anthony Albanese called it a “proud day” for Australia.
“Today is a day where Australian families reclaim power from the big tech companies. They are standing up for the right of children to be children and for parents to have greater peace of mind,” Albanese told public broadcaster ABC on Wednesday. But he conceded “it won’t be simple.”
Under the legislation, platforms must prove they have taken “reasonable steps” to deactivate accounts used by individuals under 16 and to prevent new accounts from being opened, or face penalties of up to A$49.5 million ($32 million).
Some children—and their parents—are expected to defy the prohibition, but neither faces any repercussions.
YouTube account holders will be automatically logged out on December 10. Their channels will no longer be visible; however, their data will be preserved so they can reactivate their accounts upon turning 16. Children will still be able to view YouTube without logging in.
TikTok reports that all accounts utilized by minors under 16 will be deactivated on December 10. It asserts that it doesn’t matter what email is used or whose name is on the account—age verification technology will determine the user. Content previously posted by young users will no longer be viewable. The platform is also urging parents who suspect their children might have lied about their age when setting up accounts to report them.
Twitch states that from December 10, minors under 16 will be barred from creating new accounts on the site popular with gamers, but existing accounts for those under 16 will only be deactivated on January 9. The company did not respond to a request to explain the delay.
Meta began removing the accounts of teens under 16 across Instagram, Facebook, and Threads on December 4. Users were invited to download their content, which will remain there should they decide to reclaim their accounts at age 16.
Facebook and Instagram prompts display on mobile phones as Meta prepares for the new regulation banning social networks for under-16 users in Australia on December 6, 2025. Holly Adams/Reuters
Reddit said it will suspend accounts of users under 16 and block new sign-ups.
X did not reply to inquiries about how it will comply with the ban but strongly objects to the law as an infringement on free expression.
Kick, a live-streaming service similar to Twitch, did not respond to a request for comment.
Which platforms are excluded?
Alongside the roster of banned sites is a list of platforms not presently considered part of the prohibition. These include Discord, GitHub, Google Classroom, LEGO Play, Messenger, Pinterest, Roblox, Steam and Steam Chat, WhatsApp, and YouTube Kids.
The decision to exclude Roblox has struck many Australians as puzzling, given recent reports of children being targeted by adult predators within its games.
eSafety Commissioner Julie Inman Grant said discussions with Roblox began in June, and the company agreed to introduce new controls that will roll out this month in Australia, New Zealand, and the Netherlands, and elsewhere in January.
Users will be required to verify their age to enable chat features, and they will only be able to communicate with individuals around their same age.
How do platforms determine accounts under 16?
The restricted platforms already had a good sense of who was using their services from the birthdates users entered when opening an account, but the new legislation requires them to actively verify age.
This has provoked pushback from some adult users who worry about being asked to confirm their age. An Age Assurance Technology trial conducted earlier this year convinced the government that age checks could proceed without compromising privacy.
Platforms are verifying age through live video selfies, email addresses, or official documents. According to Yoti, the age verification firm that counts Meta among its clients, most users opt for video selfies using facial data to estimate age.
How are kids responding?
Some are seeking alternative services offering comparable functions that aren’t prohibited.
Yope, a photo-sharing platform, reported attracting 100,000 new Australian users through word-of-mouth as the ban approached. Lemon8, a TikTok-like platform also owned by ByteDance, has also been promoted among teens as a backup.
Both platforms were put on notice by the eSafety Commissioner. Lemon8 says it will abide by Australia’s new statutes, while Yope told CNN the ban does not apply to it because it allows no messaging with strangers.
The eSafety Commissioner notes that the roster of prohibited sites is dynamic, and new sites may be added as they gain traction or offer novel services.
The fluid nature of the list, and the incentive for other operators to serve millions of teens seeking alternatives, have sparked criticism that the government has created a game of “whack-a-mole” it may never win.
Youth counselors and support groups are apprehensive that children relying on social media for inclusion will end up in unregulated digital arenas with even fewer safeguards, and are monitoring where these avenues lead.
What happens next?
Part of the rationale for the ban was to keep children offline and more engaged in real-world activities, which is what officials plan to measure.
“We’ll be looking at everything: Are kids sleeping more, are they socializing more? Are they taking less antidepressants? Are they reading more books? Are they going outside, playing sports?” Commissioner Inman Grant shared last week in an interview with the Sydney Dialogue.
But she added they will also monitor for unintended consequences.
“Are they migrating to darker corners of the internet, and what is the outcome?”
Six experts from the Stanford Internet Observatory will collaborate with the eSafety Commissioner to gather data, and the entire process will be reviewed by an independent Academic Advisory Panel comprising 11 scholars from the US, UK, and Australia.
Stanford University stated its approach, methods, and findings will be made public for review by researchers, the public, and policymakers worldwide.
“We hope the evidence generated can directly support and influence the decision-making of other nations striving to advance online child safety in their jurisdictions,” the university said in a statement.