EU Regulators Say Meta Failing to Block Underage Users on Facebook and Instagram


The European Commission has concluded that Meta may be breaching European law by failing to adequately prevent children under 13 from accessing its social media platforms.

In preliminary findings released Wednesday, regulators said Meta’s enforcement of the minimum age requirement on Instagram and Facebook is insufficient, raising concerns under the bloc’s Digital Services Act.

Weak Age Verification Measures

According to the Commission, children can easily bypass age restrictions by entering false birth dates during account registration.

Investigators found no effective verification systems in place to confirm users’ real ages.

The Commission also criticized the platforms' reporting tools, describing them as overly complex.

Reporting an underage account can require multiple steps, and even when flagged, such accounts are not always removed promptly or effectively.

Regulators are now calling on Meta to overhaul how it assesses and mitigates risks faced by minors across its platforms in the European Union.

Meta Pushes Back

Meta has rejected the findings, saying it already enforces age limits and actively removes accounts belonging to underage users.

A company spokesperson said the firm is continuing to invest in detection technologies and plans to introduce additional safety measures soon.

Meta also emphasized that age verification remains a broader industry challenge requiring coordinated solutions.

Potential Financial Penalties

If the preliminary conclusions are upheld, Meta could face significant consequences. Under EU law, the company risks fines of up to 6% of its global annual revenue.

The company now has an opportunity to respond formally before regulators issue a final decision.

Mounting Global Pressure

The EU investigation adds to growing scrutiny of Meta’s approach to child safety.

Recent court rulings in the United States have also questioned the company’s platform design and its impact on young users.

Meanwhile, governments worldwide are tightening regulations.

Countries such as Australia have already introduced restrictions on social media use for minors, while others, including the United Kingdom, Spain, and France, are considering similar measures.

Regulators are increasingly pushing for stronger age verification systems, including biometric tools and digital identity checks, as concerns grow over children’s exposure to online risks.

The outcome of the EU’s final ruling could set a major precedent for how global tech companies manage child safety on their platforms.
