
More than 80% of children aged 12 or under in Australia used social media or messaging services last year that are meant only for over-13s.
The country’s internet regulator, eSafety, found YouTube, TikTok and Snapchat were the most popular platforms among young children.
It comes as Australia plans to implement a social media ban for under-16s, expected by the end of this year.
The companies examined – Discord, Google (YouTube), Meta (Facebook and Instagram), Reddit, Snap, TikTok and Twitch – did not immediately respond to a request for comment.
In the main, users of all of these platforms must be 13 or over to have an account, but there are some exceptions.
For example, YouTube has Family Link – which allows children under 13 to use an account under the supervision of a guardian – and the separate YouTube Kids app, which is made specifically for children.
Usage of YouTube Kids was not included in the report for this reason.
“The findings of this report will be a helpful input to guide next steps,” said eSafety commissioner Julie Inman Grant.
She said the report found online safety for children was a “shared responsibility” between various parties, including social media platforms, the companies that create devices and apps, parents, teachers and politicians.
‘84% use social media’
Researchers questioned more than 1,500 children aged between eight and 12 across Australia about their use of social media and messaging platforms.
They found 84% of the children surveyed had used at least one social media or messaging service since the beginning of last year.
Over half of them accessed these services via the account of a parent or carer.
A third of the children who had used social media or messaging services had their own account, and 80% of those had help from a parent or carer to set it up.
The study also found only 13% of children who had an account had it shut down by the social media or messaging services for being under the age of 13.
‘Inconsistency’
“These findings indicate there is inconsistency across industry regarding the steps taken to assess the age of end-users at various points in the user experience,” the report’s authors said.
“However, there is one thing they have in common: a lack of robust interventions at the point of account sign-up to a service to prevent someone under 13 from providing a false age or birthdate to set up an account.”
For its report, the regulator also surveyed the platforms themselves, asking how they verify the ages of younger users.
Snapchat, TikTok, Twitch and YouTube told the authors they deployed tools and technology to detect whether a user may be under the age of 13 once they were using the service.
“Proactive tools and technologies may rely on a user actively engaging with a service (such as connecting with others, communicating with others, sharing and creating content) to detect relevant signals,” the report said.
“This may require time and engagement to detect a child under 13, and in that time the child may be exposed to risks and harms.”