Europe’s digital rules are being tested on child safety
Meta has been found in preliminary breach of European Union law over its handling of underage users on Facebook and Instagram, in a decision that could become one of the most consequential child-safety enforcement actions yet under the bloc’s Digital Services Act.
The European Commission said Meta did not have effective measures in place to stop children under 13 from accessing its services. According to the commission’s initial assessment, the company failed to meet the standards required by the DSA, which obliges major platforms to identify and mitigate risks diligently, including risks tied to children using services that are not meant for them.
What the Commission is alleging
The investigation has been running for nearly two years. Its preliminary findings, issued on April 29, say Meta was unable to enforce its own stated minimum age of 13 for Facebook and Instagram. That gap is central to the commission’s argument. The issue is not just what the platforms claim in their terms and conditions, but whether they can back those claims with effective operational controls.
The commission also stressed that its findings do not prejudge the final outcome. Meta will be allowed to examine the investigation file and present a defense before any final decision is made.
Why the case matters
The case goes beyond one company. Europe has made platform accountability a major regulatory priority, and child protection is among the most politically potent areas of enforcement. A ruling against Meta would signal that regulators are willing to judge age limits not as symbolic statements, but as obligations requiring real implementation.
That matters because age assurance remains difficult across the internet. Platforms routinely say they are not intended for younger children, yet regulators are increasingly asking whether the technical and policy measures behind those statements are credible.
Meta’s response
Meta said it disagrees with the preliminary findings. A company spokesperson said Facebook and Instagram are intended for people aged 13 and older and that Meta already has measures in place to detect and remove accounts held by younger users. The company also said it continues to invest in technologies designed to find and remove underage users and plans to announce additional measures soon.
Meta further argued that understanding users’ ages is an industry-wide challenge requiring industry-wide solutions. That defense may resonate at a technical level, but it does not necessarily answer the legal question the commission is now pressing: whether Meta’s existing systems meet the obligations imposed by the DSA.
The financial stakes
If the preliminary finding is upheld, the consequences could be significant. Under the Digital Services Act, penalties can reach up to 6% of a company’s global annual turnover. The Guardian noted that Meta reported $201 billion in revenue for 2025, which underscores the scale of possible exposure even if any final fine were to come in below the legal maximum.
At the same time, the biggest impact may be operational rather than financial. A final enforcement outcome could push Meta to make more aggressive changes to age detection, account flows, or youth protections across its services in Europe.
Part of a broader European push
The commission’s move lands amid wider political momentum across Europe for stronger limits on children’s access to social platforms. Spain has called for a ban on social media for under-16s, while French lawmakers have backed similar restrictions for under-15s. In the United Kingdom, the government has said it is considering age or functionality restrictions for children under 16.
That wider context matters because it shows the Meta case is not an isolated compliance dispute. It sits within a growing belief among European policymakers that existing platform safeguards are insufficient for minors and that stricter intervention may be necessary.
What comes next
For now, the finding remains preliminary, and the process is still open. But the commission has already made its central view clear: setting a minimum age is not enough if a platform cannot enforce it effectively.
That principle could shape the next phase of internet regulation in Europe. The deeper question is whether digital platforms can continue to rely on self-declared ages and limited enforcement, or whether regulators will compel more robust age controls despite the tradeoffs such systems may create.
Meta’s case may become an early answer. If the commission ultimately confirms the breach, it will strengthen the DSA’s reputation not just as a framework for transparency, but as a tool for forcing concrete product changes on the world’s largest technology companies.
This article is based on reporting by The Guardian, originally published on theguardian.com.