Imagine you use Instagram, Snapchat or TikTok, and then one day you’re told you can’t anymore because you’re under 16. That’s what’s happening in Australia. The government has passed a law under which, from 10 December 2025, people under 16 must not hold accounts on major social media platforms. Meta, TikTok and Snapchat have all said they disagree with the ban, but they will follow the law.
Why is this happening? What does “comply but disagree” mean? And how will this affect young users, parents and the whole social-media industry? Let’s find out.
1. What’s the Law?
Australia’s law, the Online Safety Amendment (Social Media Minimum Age) Act 2024, amends the existing Online Safety Act 2021. Here’s the gist:
- Users under 16 years old will be banned from having accounts on large social media platforms.
- Platforms must take “reasonable steps” to identify and prevent under-16s from holding accounts.
- If a platform fails to comply, it may face fines of up to A$49.5 million (around US$32.5 million).
- Enforcement begins on 10 December 2025.
In short, the government is putting age limits on major social media use in a way few nations have done before.
2. How Are Social Platforms Responding?
Here’s how the major players are reacting:
- Meta (owner of Facebook & Instagram) says it disagrees with the law — but will comply.
- TikTok also opposes the law, arguing it may not protect young people in the way the government intends, but it too says: “we will comply”.
- Snap, Snapchat’s parent company, said much the same: it argues Snapchat is primarily a messaging platform (not “social media” in the same sense) and that the law is misguided as applied to it, but it will follow the rules.
So all three big companies dispute the reasoning behind the law, yet all three are preparing to deliver on it. They will begin detecting under-16 users, deactivating or restricting their accounts, and setting up age-verification and behaviour-based age-estimation tools.
3. What’s Changing for Young Users & Parents?
Here are practical changes you may see — and what they mean.
A. Accounts Deactivated or Restricted
From December 10, if you’re under 16 and on one of these platforms in Australia:
- The platform may deactivate your account.
- You may be asked to verify your age (via ID, a birth-date check, or behaviour patterns) to keep the account or get it back.
B. Age Verification & Behaviour Monitoring
Platforms will introduce or ramp up tools such as the following (a simplified sketch appears after this list):
- Automated checks that estimate whether a user is under 16, based on behaviour signals or the age the user has claimed.
- Age-assurance mechanisms (e.g., uploading an ID document, or submitting a selfie for facial age estimation).
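To make this concrete, here is a minimal sketch in Python of what such a check might look like. Everything in it is hypothetical: the behavioural_age_estimate input stands in for the output of whatever proprietary model a platform might run, and the block/review/allow outcomes are illustrative choices, not anything the Act prescribes.

```python
from datetime import date

MINIMUM_AGE = 16  # the age floor set by the Act

def age_from_birthdate(birthdate: date, today: date) -> int:
    """Age in whole years from a user-supplied birth date."""
    years = today.year - birthdate.year
    # Subtract one if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years

def assess_account(birthdate: date | None,
                   behavioural_age_estimate: float | None,
                   today: date) -> str:
    """Return 'block', 'review', or 'allow' for one account.

    behavioural_age_estimate is a stand-in for a hypothetical model
    that guesses a user's age from activity patterns.
    """
    if birthdate is not None and age_from_birthdate(birthdate, today) < MINIMUM_AGE:
        return "block"   # declared age is already below the floor
    if behavioural_age_estimate is not None and behavioural_age_estimate < MINIMUM_AGE:
        return "review"  # signals contradict the declared age; ask for age assurance
    return "allow"

print(assess_account(date(2012, 5, 1), None, date(2025, 12, 10)))   # block
print(assess_account(date(2000, 1, 1), 14.2, date(2025, 12, 10)))  # review
print(assess_account(date(2000, 1, 1), 25.0, date(2025, 12, 10)))  # allow
```

Note the deliberate asymmetry in the sketch: a failed behavioural check triggers a review rather than an instant block, because those estimates are noisy (more on that in section 5).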
C. Data & Optional Account Choices
Young users may be given choices (a toy sketch of the flow follows this list):
- Delete their account and content.
- Or let the platform store their data until they turn 16 and then restore access.
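One way to picture the keep-or-delete choice is as a simple account state machine. The sketch below is purely an illustration of the flow the law describes, not how any platform actually implements it; the class and field names are invented.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class AccountState(Enum):
    ACTIVE = "active"
    FROZEN = "frozen"    # data retained, access suspended until the user turns 16
    DELETED = "deleted"  # account and content removed at the user's request

@dataclass
class Account:
    birthdate: date
    state: AccountState = AccountState.ACTIVE

    def apply_user_choice(self, delete: bool) -> None:
        """An under-16 user chooses deletion or retention."""
        self.state = AccountState.DELETED if delete else AccountState.FROZEN

    def maybe_restore(self, today: date) -> None:
        """Reactivate a frozen account once the user has turned 16."""
        # (ignores the 29 February edge case for brevity)
        sixteenth_birthday = self.birthdate.replace(year=self.birthdate.year + 16)
        if self.state is AccountState.FROZEN and today >= sixteenth_birthday:
            self.state = AccountState.ACTIVE

# Example: a 15-year-old keeps their data and regains access at 16.
acct = Account(birthdate=date(2010, 3, 1))
acct.apply_user_choice(delete=False)
acct.maybe_restore(date(2026, 3, 1))  # their 16th birthday
assert acct.state is AccountState.ACTIVE
```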
D. Impact on Families
For parents, this raises questions of supervision, digital literacy, and alternatives: young people might simply shift to platforms or apps outside the law’s scope.
4. Why The Government Did This
Here are key reasons cited by Australia’s government:
- Concerns over youth mental health and social media’s role in increasing anxiety, loneliness, and harm.
- A desire to give children more of a “childhood” with less online pressure. The Prime Minister said he wants kids playing sports or socialising offline more than being glued to their phones.
- Setting a global precedent: Australia aims to lead on online safety for young people.
5. What Are The Big Challenges?
Even though the law is clear, many issues remain:
- Enforcement & accuracy: how reliably can platforms verify age, and how often will checks wrongly flag adults? (A simulation of this trade-off follows the list.)
- Privacy trade-offs: If platforms demand ID or biometric checks, that could affect privacy rights.
- Unintended consequences: Some argue banning under-16s may push them toward less-safe platforms or offline groups where risks might be greater.
- Global ripple effect: Other nations are watching. How Australia executes this could influence worldwide regulation.
- Technical cost and complexity: building reliable age checks, monitoring, and deactivation pipelines is a massive engineering task, even for the largest platforms.
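To see why false positives are hard to avoid, imagine an age estimator whose guesses carry some random error. The simulation below uses entirely made-up numbers (a uniform 10–30 age population and a ±3-year Gaussian error), purely to show the threshold trade-off.

```python
import random

random.seed(0)

def simulate_error_rates(threshold: float, noise_sd: float = 3.0, n: int = 100_000):
    """Simulate a noisy age estimator (all numbers are made up).

    Returns (false_positive_rate, false_negative_rate):
      false positive = an adult (16+) wrongly flagged as under 16
      false negative = an under-16 user the check misses
    """
    adults_flagged = minors_missed = n_adults = n_minors = 0
    for _ in range(n):
        true_age = random.uniform(10, 30)                # hypothetical population
        estimate = true_age + random.gauss(0, noise_sd)  # noisy model output
        flagged = estimate < threshold
        if true_age >= 16:
            n_adults += 1
            adults_flagged += flagged
        else:
            n_minors += 1
            minors_missed += not flagged
    return adults_flagged / n_adults, minors_missed / n_minors

for threshold in (14, 16, 18):
    fpr, fnr = simulate_error_rates(threshold)
    print(f"threshold={threshold}: {fpr:.1%} of adults wrongly flagged, "
          f"{fnr:.1%} of under-16s missed")
```

Whatever threshold is chosen, wrongly flagged adults trade off against missed under-16s; the Act’s “reasonable steps” wording leaves platforms to decide where on that curve to sit.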
6. What This Means for the Social Media Companies
Platforms like Meta, TikTok and Snap will have to:
- Build or scale large age-assurance/verification systems.
- Monitor user behaviour and detect under-16 accounts.
- Face potential fines (up to A$49.5 million) for non-compliance.
- Rework how they market, moderate, and design features for younger audiences (or exclude them).
- Possibly lose a portion of their user base (in Australia, under-16s) and adapt to new rules globally if other countries follow.
7. Why It Matters Beyond Australia
- This is one of the first laws of its kind globally — a country banning under-16s entirely from major social media platforms.
- If other countries follow, the design of social media (age verification, guarding minors) may shift heavily.
- Tech companies may face fragmented regulation: the features a platform offers in one country may differ from those it offers in another.
- It raises a question: How to balance online safety for kids with freedom of access, communication and expression?
Conclusion
Australia’s under-16 social media ban is bold and unprecedented. While Meta, TikTok and Snapchat don’t believe the ban is the best way to protect young people, they have chosen to comply rather than fight it — signalling that global tech regulation is changing.
For young users, parents, educators and tech companies, many questions remain: How will age verification work? Where will under-16s go online? Will the new rules succeed in safeguarding youth? Will other countries adopt similar models?
Change is afoot, and Australia’s ban may be the first move in a larger global shift. For every young person, it’s a reminder: the digital world keeps evolving, and so must our understanding of it.
