Meta Faces EU Scrutiny for Failing to Protect Children on Social Media

Published on April 29, 2026

The European Commission recently assessed Meta's compliance with the Digital Services Act (DSA). The company has long permitted users of all ages on platforms such as Facebook and Instagram, a practice that has drawn significant concern over the protection of minors online.

Authorities have indicated that Meta is failing in its duty to keep underage children off its services. This marks a shift from earlier regulatory efforts, which focused mainly on adult content; the spotlight is now on established social networks to uphold stricter safety standards.

According to the preliminary findings, Meta's measures to restrict underage access have been inadequate, and the Commission's investigation reveals gaps in the enforcement of its age-verification protocols. These findings place Meta's user-safety policies under intense scrutiny.

Potential consequences include hefty fines and tighter regulation of Meta across Europe, which could pressure the company to redesign its approach to user safety. The outcome may reshape how social media platforms manage underage users going forward.