After a major payout, the next fight is over how Meta operates

Meta is returning to court in New Mexico for a second and potentially more consequential phase of a child safety case that has already produced a $375 million loss for the company. This next stage is not primarily about money. It is about whether a judge should force changes to the way Facebook, Instagram, and WhatsApp work for users in the state.

New Mexico Attorney General Raúl Torrez has framed the case as an effort to alter business practices, not simply collect damages. That distinction is important. Large financial penalties can hurt, but they do not always change product design or corporate incentives. The state’s argument is that remedies aimed at core platform features could have broader significance for how social media companies approach minors’ safety.

The proposed remedies reach into product design and enforcement

The trial beginning in Santa Fe is expected to run for three weeks. According to the reported description of the state’s requests, New Mexico wants the court to order Meta to:

  • Implement age verification for users in the state.
  • Prohibit end-to-end encryption for users under 18.
  • Cap those users’ time on Meta services at 90 hours per month.
  • Limit engagement-amplifying design features such as infinite scroll and autoplay.
  • Detect 99 percent of new child sexual abuse material (CSAM).

Those requested changes touch nearly every major fault line in digital policy: age assurance, privacy, algorithmic engagement, moderation standards, and platform accountability. Even though any eventual order would apply only to Meta and only within New Mexico, the litigation could become a testing ground for remedies that lawmakers and regulators elsewhere have debated but rarely imposed through court action.

Why this phase may matter more than the dollar amount

The state’s position, as described by Torrez, is blunt. A large company may absorb a nine-figure penalty as a manageable cost, but mandated operational changes can be much harder to contain. Product redesigns affect engineering priorities, legal risk, and user experience all at once. They can also create precedents that invite similar demands from other states or countries.

That is what makes this case worth watching beyond New Mexico. The first phase established financial liability. The second phase asks whether a court can reach into the mechanics of a social platform and dictate changes meant to reduce harm to minors. If the court endorses even part of the state’s approach, the case could influence the shape of future actions against other major technology companies.

Core issues at stake in the remedies trial

  • Whether age verification can be imposed at the state level on major social platforms.
  • Whether design features tied to engagement can be limited for minors.
  • Whether end-to-end encryption can be restricted for under-18 users in the name of child safety.
  • Whether courts are willing to require extremely high detection standards for new CSAM.

Meta’s broader challenge extends beyond one courthouse

The company is not just defending a single case. It is also defending a model in which product features are rolled out at scale across jurisdictions with different political and legal expectations. That model has come under pressure as lawsuits, legislative proposals, and public scrutiny increasingly focus on the relationship between platform design and youth mental health, addictive use patterns, and illegal content.

New Mexico’s case stands out because it seeks a particularly direct form of intervention. Instead of asking only for compensation or general compliance promises, the state is asking for specific operating constraints. That raises difficult questions about enforceability and technical feasibility, but it also reflects growing impatience among regulators who believe softer approaches have not produced enough change.

The requested requirement to detect 99 percent of new child sexual abuse material is especially notable because it sets a very high bar. Similarly, proposals to curb autoplay and infinite scroll point to a policy shift from content-focused regulation toward interface-focused regulation. In other words, the state is not only challenging what appears on the platforms; it is challenging how the platforms keep users engaged.

A child safety case that could shape future platform governance

There are still significant unknowns. The final remedy, if any, may look very different from the full list New Mexico is seeking. Appeals and narrower implementation could also reduce the immediate impact. But the case already shows how tech regulation is evolving. Courts are increasingly being asked not just to punish platform operators after the fact, but to define what safer product design should look like in practice.

For Meta, that creates risk beyond the $375 million judgment. A company can budget for fines. It is much harder to budget for a legal environment in which courts may begin specifying age checks, engagement limits, privacy exceptions, and detection targets. The Santa Fe trial is therefore less about a single state dispute than about whether social media governance is moving into a new phase of hands-on judicial intervention.

That is why this proceeding may prove more consequential than the initial verdict. The financial hit grabbed attention, but the remedies fight could determine whether child-safety litigation becomes a route to restructuring how major platforms operate.

This article is based on reporting by The Verge.