What Australia’s Under-16 Social Media Rules Mean for Young People and Families
In December 2025, Australia introduced new rules restricting social media account access for people under 16. The changes sit within amendments to the Online Safety Act 2021 and are designed to reduce harm by placing responsibility on platforms rather than on young people or their parents.
The rules are often described as a “ban”. In practice, they are more specific than that.
From 10 December 2025, certain platforms must take “reasonable steps” to prevent Australians under 16 from creating or maintaining accounts. Young people can still view publicly available content while logged out. There are no penalties for teenagers or their families, but platforms that fail to comply face significant civil penalties.
So what does this mean in real terms for young people and the adults who care for them?
Why Were the Rules Introduced?
Concern about the impact of social media on young people’s mental health has grown steadily over the past decade.
National research shows that social media use among teenagers was already near universal before the new rules came into effect. Around 95% of 13- to 15-year-olds reported using social media in 2024. Even among children aged 8 to 12, use was widespread, with many accessing platforms via their parent or carer's account.
At the same time, mental health indicators for young Australians have worsened. The Australian Institute of Health and Welfare reports a significant increase in rates of anxiety and depression among young people over recent years. The World Health Organization estimates that globally, one in seven adolescents experiences a mental health condition.
Research suggests that high levels of social media use, particularly more than three hours per day, are associated with increased psychological distress, loneliness and sleep disruption. Cyberbullying remains a serious concern, with almost half of young Australians reporting negative online experiences in some studies.
The government’s position is that delaying access to full account participation may reduce exposure to these risks during a vulnerable stage of brain development.
What Has Changed, and What Has Not?
It is important to understand the limits of the policy.
The rules apply to platforms that enable social interaction, content posting and user engagement. Services such as Instagram, TikTok, Snapchat and X fall within scope. Messaging services, online gaming platforms and education or health support services are excluded under the legislative rules.
Young people under 16:
- Cannot hold accounts on in-scope platforms.
- Can still view publicly available content while logged out.
- Will not be fined or penalised.
This means exposure to social media content does not disappear. It may continue through shared devices, adult accounts or migration to exempt services.
Early reporting from the eSafety Commissioner indicated that millions of under-16 accounts were restricted in the first weeks of implementation. However, this reflects account action, not mental health outcomes. It is too early to determine whether the rules will reduce distress, improve sleep, or decrease bullying at a population level.
The Mental Health Balance
The evidence on social media and wellbeing is complex.
High-intensity use is consistently associated with poorer outcomes. However, moderate use can be neutral or even supportive for some young people. Social media can provide:
- Connection for young people who feel isolated.
- Community for LGBTQIA+ youth and culturally diverse groups.
- Access to mental health information and peer support.
- Opportunities for creativity and identity exploration.
Organisations such as headspace and ReachOut have cautioned that while reducing harm is important, access to supportive online spaces also plays a protective role for some young people.
This creates a genuine policy tension. The same platforms that can amplify bullying and comparison can also provide belonging and support.
Families looking for practical guidance on supporting young people’s wellbeing can explore the Sir David Martin Foundation resources page, which brings together tools, information and links to trusted support services (including mental health and alcohol and other drug support).
Privacy and Age Verification
A significant practical question is how platforms verify age.
The amended legislation restricts platforms from relying solely on government-issued identification or accredited digital ID services. They must offer reasonable alternatives. Personal information collected for age assurance must be destroyed after use in accordance with privacy requirements under the Privacy Act 1988.
Even with these safeguards, age verification introduces new privacy considerations. Large-scale data collection, biometric estimation tools and document checks can create anxiety for families and raise questions about data security.
For young people, repeated verification requests or mistaken age assessments may feel stressful or exclusionary. Fairness and accuracy in these systems will be critical to maintaining trust.
Child Rights and Participation
Under international human rights law, including the Convention on the Rights of the Child, young people have rights to freedom of expression, access to information and participation in decisions affecting them.
Human rights organisations have questioned whether a blanket minimum age of 16 is proportionate, arguing that the process moved quickly and involved limited consultation with young people.
Others argue that strong intervention is necessary given the scale of online harm and the documented inability of platforms to enforce existing age limits effectively.
The central question is whether the policy strikes the right balance between protection and participation.
What This Means for Families
For families, the new rules are unlikely to eliminate online exposure. Instead, they shift the conversation.
Rather than focusing solely on access, families may need to consider:
- How devices are used in shared spaces.
- Late-night screen habits and sleep.
- The role of messaging and gaming platforms.
- Where young people access mental health support.
- How digital literacy is being developed before full account access at 16.
Importantly, young people are not “in trouble” under the new framework. Open conversations remain more effective than surveillance or punishment.
What We Still Do Not Know
The legislation requires an independent review within two years of implementation. Meaningful evaluation will need to examine:
- Changes in cyberbullying rates.
- Sleep and wellbeing indicators.
- Mental health service demand.
- Migration to exempt platforms.
- Equity impacts for marginalised youth.
It is possible the rules will reduce harm. It is also possible that harms could shift rather than disappear.
What is clear is that digital life remains a central part of adolescence. Reducing risk requires more than age gates. It requires supportive adults, strong privacy protections, thoughtful product design, and investment in youth mental health services.
For young people and families, the new rules mark the beginning of a transition. How well that transition supports wellbeing will depend not only on enforcement, but on the conversations and supports that surround it.