For Australia’s youth, the social media landscape is getting rockier, as the companies behind platforms like Instagram, TikTok, Snapchat and YouTube begin to comply with the country’s imminent landmark legislation banning social media for children and younger teens.
Those under 16 — or suspected to be in that age cohort — are likely starting to see pop-up notifications that their accounts will be deactivated or put on hold until they turn 16 or can prove they are of age.
It’s a bold effort at protecting youth from online harm that’s being keenly watched worldwide.
But some critics have raised questions about the methods and technology being used, as well as whether this will truly make digital spaces safer.
Here’s a quick look at how it’s rolling out and the potential for ripple effects in Canada.
The basics
As of Dec. 10, platforms included in the ban must deactivate accounts belonging to users under 16 and prevent users in this age group from creating new accounts. The social media platforms on the government’s initial list include:
- Threads
- YouTube
- TikTok
- Snapchat
- Twitch
- X (formerly Twitter)
- Kick
Tapping AI to verify
Each company decides which process to use, but in general, ages are verified using the information provided at registration, combined with artificial intelligence that analyzes account activity, tech analyst Carmi Levy said.
For example, Meta — which owns several social media platforms, including Instagram and Facebook — has said it will use AI, Levy noted.

“If [an account is] following a lot of kids, if they’re engaging in ways that suggest that they’re under 16, then they will be flagged as an underage account and they will be removed from the platform,” he said from London, Ont.
Meta users can appeal for reinstatement by submitting government-issued identification or recording a “video selfie” for analysis, Levy said.
Meanwhile, Snapchat will similarly verify age through government-issued photo ID, third-party software that analyzes a submitted selfie, or software that links to users’ banking information.
Not every company has weighed in with specifics, Levy said, “but we can expect largely the same thing: using AI behind the scenes to essentially creep you online and guess how old you are.”
Verification technology an ‘unmitigated disaster’
Australia’s ban is a nationwide test for age-verification technology, which hasn’t been tremendously successful elsewhere, Levy said.
“Account authentication and age verification has been used by other platforms, in other parts of the world, and it’s safe to say that it’s been an unmitigated disaster,” he said.

Levy pointed to the countless users who have already been wrongly flagged or removed from platforms for myriad reasons, and the difficulty they have proving their legitimacy.
“A lot of these processes are largely based on automated technologies: There aren’t humans sitting in the background at Meta or at Google or at X who are ready to handle these kinds of complaints and kind of manage them over time,” he said.
Levy said he’s also leery of the privacy and data integrity implications of submitting sensitive and valuable personal information, whether it’s ID like a driver’s licence or a clear shot of one’s face.
“We’re going to have to trust that as they collect these huge piles of personal information from their users, they’re going to keep it safe,” he said. “And we’ve seen time and again that doesn’t always happen.”
Lawsuit challenges Australia’s ban
Australia has suggested that other places where youth congregate and communicate — from gaming platforms like Roblox and Fortnite to spaces like Discord — could be added to its list. Digital rights advocate Matt Hatfield questions whether this approach will effectively address harms like cyberbullying.
“I just really worry Australia’s going to wind up sort of chasing young people around the internet from space to space and imposing age blocks and potential censorship as they go,” Hatfield, executive director of digital rights group OpenMedia, said from the Gulf Islands in British Columbia.

“We should be realistic about what this kind of legislation can or can’t do and bear in mind that dealing directly with bullying — both discouraging it and imposing consequences when it happens — is really the more effective way.”
That’s echoed by Noah Jones, a 15-year-old student in Sydney, Australia, who is currently a co-plaintiff in a lawsuit seeking to overturn the ban. He said it infringes young Australians’ constitutional rights and cuts off a vital avenue of communication.
“It just doesn’t make sense why we’re disconnected from the world, and the harmful people and the harmful explicit content isn’t the one suffering from this ban, it’s us,” he told Reuters on Monday.
His peers who skirt the ban will be even less likely to share if they do encounter harm, Jones said. “How are they going to report that to police or parents? Because they weren’t supposed to be on the social media platforms in the first place.”
Potential ripple effect
With multiple nations and the European Union itself pondering new legislation on teen social media use, Levy said he thinks what platforms are unveiling now in Australia “will largely be used as a template for what they ultimately roll out in other countries.”
Hatfield expects Canada will watch how this plays out before moving forward. Legislation to improve online safety for Canadians was first introduced in 2021 and adjusted over the years following cross-country consultations with stakeholders, he said. But the most recent version, the Online Harms Act, did not pass before the federal election held in April.

Still, he thinks a revised bill could be reintroduced relatively easily in Canada, ideally one that emphasizes social media companies accounting for impacts on young users rather than simply following Australia’s lead.
“It would be a terrible thing if this government suddenly forgot all the lessons that were learned during that [consultation] process and introduced something that didn’t include all of those reflections — something maybe more like Australia: very bold, but not very thoughtful about a balance of rights and protections,” Hatfield said.
Levy said now is a perfect time for Canadian parents to introduce, or revisit, best practices and behaviour for online spaces with their kids.
“The government isn’t going to save your kid digitally,” he said. “These are still conversations that parents need to be having with their kids.”
