In a startling revelation, more than 80% of Australian children aged 12 and under are reported to have accessed social media or messaging services last year, despite those platforms technically requiring users to be at least 13 years old. The data was compiled by Australia's eSafety regulator, which identified YouTube, TikTok, and Snapchat as the services most frequently used by young users. With a proposed ban on social media for those under 16 expected to roll out by year-end, the findings add urgency to the issue.

The investigation focused on several major platforms, including Discord, Google (YouTube), Meta (Facebook and Instagram), Reddit, Snap, TikTok, and Twitch. None of these companies immediately responded to requests for comment on the report's findings. Most of the platforms enforce a minimum age of 13 for account creation, although exceptions exist — such as YouTube's Family Link feature, which allows supervised access for younger children.

eSafety Commissioner Julie Inman Grant described the report as a vital tool for shaping the next steps toward enhanced online safety for children. She emphasized that the responsibility for safeguarding young internet users is a collective one, involving social media companies, technology creators, parents, educators, and lawmakers.

Surveying more than 1,500 Australian children between the ages of 8 and 12, researchers found that a staggering 84% had used at least one social media or messaging platform since early last year. Notably, many accessed these platforms through a parent or guardian's account, while around a third had set up their own accounts, typically with parental assistance.

Inconsistent age verification across the industry was another key finding. The report raised concerns about inadequate checks at the point of account registration, which make it easy for children — often with a parent's help — to supply a false age and sign up anyway.

The report also canvassed the platforms about their age verification processes, with Snapchat, TikTok, Twitch, and YouTube saying they employ technologies to detect underage users. However, these measures typically work by analyzing a user's behavior over time, meaning children may be exposed to online risks well before any detection occurs.

The pressing need for reform and stricter age verification is clear as Australia prepares to enforce its under-16 ban amid a rapidly changing social media landscape.