European Union authorities have opened formal investigations into social media giants Meta and TikTok, alleging that both platforms are failing to adequately protect minors and to curb the spread of harmful online content.
The probes mark a significant escalation in the bloc’s efforts to enforce its landmark Digital Services Act (DSA), which came into full effect last year.
The European Commission, the EU’s executive arm responsible for overseeing compliance, announced the actions today, focusing on whether the companies are doing enough to mitigate risks to young users.
Under the DSA, platforms designated as “very large online platforms” – those with more than 45 million monthly active users in the EU, a category that includes Meta’s Facebook and Instagram as well as TikTok – are required to conduct thorough risk assessments, implement safeguards against exploitative practices, and swiftly address illegal or dangerous material.
For TikTok, the scrutiny centers on its age verification processes and the effectiveness of parental controls. Regulators suspect the app may not be verifying users’ ages accurately, potentially exposing children to inappropriate content or ads.
The investigation also examines TikTok’s recommender systems, which regulators fear may amplify addictive or harmful videos to minors.
Meta faces similar scrutiny, with the EU questioning whether its platforms sufficiently shield teenagers from bullying, the promotion of self-harm, and sexual exploitation.
Officials are particularly concerned about the company’s data practices, including how user information from under-18s is collected and used for personalized recommendations.
This probe builds on prior penalties Meta has received for privacy lapses – including a record €1.2 billion fine under the EU’s General Data Protection Regulation in 2023 – and signals a broader push to hold tech firms accountable, this time for content moderation and child safety.
“This is about putting safety first, especially for our children,” said EU Commissioner for Justice Didier Reynders in a statement. “Platforms must step up their game to prevent the digital world from becoming a playground for risks.”
The investigations could lead to hefty penalties if violations are confirmed – up to 6% of a company’s total worldwide annual turnover. For context, TikTok’s parent ByteDance reported over $100 billion in revenue last year, while Meta exceeded $130 billion, meaning a maximum fine could run to more than $6 billion for ByteDance and nearly $8 billion for Meta.
Non-compliance might also force operational changes, such as enhanced content filters or stricter age gates across the EU.
This move comes amid growing parental advocacy and a wave of similar regulatory actions worldwide.
Just last month, the UK’s Information Commissioner’s Office fined TikTok £12.7 million for mishandling children’s data, and U.S. states are pursuing lawsuits against Meta over its impact on youth mental health.
EU officials emphasized that these probes are just the beginning, with more platforms potentially in the crosshairs as the DSA’s second anniversary approaches.
Both companies have pledged cooperation. A TikTok spokesperson said, “We take our responsibilities seriously and are committed to working with the Commission to ensure a safe experience.”
Meta echoed the sentiment, noting its ongoing investments in youth safety tools.
As the digital economy expands, these cases underscore the EU’s role as a global standard-setter for tech regulation, influencing policies from California to Delhi. Further updates are expected as the probes unfold over the coming months.