Australia Bans Social Media for Users Under 16
Australia has implemented the world’s first nationwide social media age restriction, requiring ten major platforms to prevent users under 16 from holding accounts starting December 10, 2025. The eSafety Commissioner has designated Facebook, Instagram, Snapchat, Threads, TikTok, Twitch, X, YouTube, Kick, and Reddit as age-restricted platforms under the legislation.

The new law requires these social media companies to take reasonable steps to prevent Australians under 16 from creating or maintaining accounts. Platforms that fail to comply face substantial civil penalties, with courts able to impose fines up to 49.5 million Australian dollars, equivalent to approximately 33 million US dollars. This represents one of the most significant regulatory actions taken against social media companies globally.
The restrictions target platforms that meet three specific criteria: enabling online social interaction between two or more users, allowing users to link to or interact with other users, and permitting users to post material on the service. The Australian government has explicitly excluded online gaming platforms and standalone messaging applications from these restrictions, though messaging services with social media-style features may still fall under the age requirement.
The legislation clarifies that this measure is not a ban but rather a delay in account creation for younger Australians. Children under 16 who access age-restricted platforms will not face penalties, nor will their parents or guardians. The responsibility lies entirely with the platforms themselves to implement effective age verification and prevention measures.
The eSafety Commissioner has published regulatory guidance to help platforms determine which age assurance methods comply with the Online Safety Act. This guidance draws from the Australian Government’s Age Assurance Technology Trial, which evaluated various age verification technologies. The trial’s findings, released in August 2025, indicated that age assurance technologies can be private, robust, and effective when deployed correctly and potentially combined with other verification methods.
The Australian government conducted extensive consultations before implementing these restrictions. The eSafety Commissioner engaged with industry representatives, civil society organizations, academics, researchers, education sector professionals, and children themselves. These consultations examined the use of age assurance technologies, potential impacts on users including privacy and digital rights concerns, possible circumvention methods, and strategies for effective communication about the changes.
The legislative timeline moved swiftly through the Australian Parliament. The Social Media Minimum Age Bill was introduced in November 2024, followed by a brief public consultation period before passage. The bill received Royal Assent in December 2024, becoming law. Throughout 2025, the government released research findings, developed regulatory frameworks, and prepared platforms and the public for implementation.
Australian authorities position these restrictions as part of a broader child safety and mental health strategy. The eSafety Commissioner emphasizes that the restrictions aim to protect young Australians from pressures and risks associated with social media use, particularly design features that encourage excessive screen time and exposure to potentially harmful content. The government argues that delaying social media access during critical developmental years will benefit young people’s wellbeing.
The Australian government frames this initiative as addressing the mental health crisis among young people, linking social media use to increased anxiety, depression, and other psychological challenges. Prime Minister Anthony Albanese characterized the reform as a profound social and cultural change for the nation, suggesting that young Australians should engage in alternative activities such as sports, music, or reading instead of spending time on social media platforms.
Critics have noted inconsistencies in the platform selection criteria. Gaming-focused applications like Discord, Roblox, and Steam remain accessible to users under 16 despite documented concerns about child exploitation and safety issues on these platforms. The exclusion of gaming services has raised questions about the comprehensiveness and effectiveness of the age restrictions in achieving their stated child protection goals.
The enforcement mechanism relies on platform compliance rather than user-level policing. The eSafety Commissioner can obtain information from service providers about their compliance efforts and take enforcement action when necessary. The law mandates an independent review of the restrictions’ operation within two years of taking effect, allowing for assessment and potential adjustments based on real-world implementation results.
International observers are closely monitoring Australia’s approach as other countries consider similar measures. The Australian model represents the most comprehensive attempt to restrict youth access to social media at a national level, potentially influencing regulatory approaches in other jurisdictions. However, questions remain about practical implementation, the effectiveness of age verification technologies, and potential unintended consequences.
The restrictions intersect with other Australian online safety initiatives, including the Basic Online Safety Expectations that set standards for online service providers, Age-Restricted Material Codes preventing children from accessing inappropriate content, and the Safety by Design framework encouraging services to embed safety features from initial development. Together, these measures create a comprehensive regulatory environment aimed at protecting Australians, particularly children, in digital spaces.
The government has launched a national awareness campaign titled For the Good of Their Wellbeing to inform the public about the changes. The eSafety Commissioner continues to update the list of age-restricted platforms as new services emerge and existing platforms evolve. This ongoing assessment ensures the restrictions adapt to the changing digital landscape.
As implementation proceeds, platforms must balance compliance requirements with user privacy protections. The Information Commissioner monitors privacy compliance under both the Social Media Minimum Age law and the broader Privacy Act. The eSafety Commissioner has emphasized that safety and privacy need not be mutually exclusive goals in developing age verification systems.