Published on April 4, 2026
Australia’s eSafety regulator has expressed serious concerns about the compliance of major social media platforms with the country’s ban on under-16s accessing their services. In a recent statement, the eSafety Commissioner called for these companies, which include Facebook, Instagram, Snapchat, TikTok, and YouTube, to improve their enforcement measures aimed at safeguarding younger users.
The regulator noted that while these platforms have implemented some age verification processes, those checks are often ineffective. Many young users are still able to bypass restrictions, raising alarms about exposure to harmful content and online predators.
The eSafety Commissioner highlighted the need for stronger mechanisms to prevent underage access. This includes more robust age checks and better monitoring of user-generated content, as well as proactive measures to educate both parents and children about the dangers of unsupervised social media use.
In recent years, there has been increasing attention on the impact of social media on mental health, particularly among children and teenagers. Research has linked heavy social media use to anxiety, depression, and poor self-esteem. As these platforms continue to grow in popularity, ensuring the safety of their youngest users has become a pressing issue for regulators worldwide.
The call for better enforcement has garnered support from child protection advocates, who argue for a more accountable digital environment. They emphasize that technology companies must take their responsibilities seriously and prioritize the safety of young users over business interests.
As discussions continue, the eSafety regulator is expected to increase scrutiny of these platforms to ensure compliance with the under-16 ban. The outcome of these efforts could shape the future of social media regulation in Australia, aiming to foster a safer online space for children and adolescents.