The UK government is moving closer to intervention

The British government has signaled that it is preparing to take a harder line on social media design features it believes are engineered to keep children and teenagers hooked. Prime Minister Keir Starmer said the UK will “have to act” on addictive platform mechanics, marking one of his strongest public statements yet on potential new restrictions.

Starmer specifically pointed to features such as scrolling systems and streaks that encourage repeated daily use. His argument was direct: if platforms are deliberately trying to hold children’s attention for longer in ways that foster dependency, he does not see a case for allowing those features to remain untouched.

The intervention matters because it shifts the public conversation from content moderation alone to product design. Rather than focusing only on what children see online, the UK government is now openly questioning whether some of the core engagement tools of social media products should be permitted at all.

From online harms to interface design

That is an important policy evolution. For years, governments have debated harmful posts, age verification, and platform accountability. But addictive design introduces a different regulatory lens. It asks whether the architecture of the service itself, not just the material flowing through it, can create measurable harm for young users.

Education Secretary Bridget Phillipson reinforced that line of thinking. She said social media is “designed to keep you there” and that the government’s consultation will examine how addictive features can be tackled. She framed the issue as especially serious for younger users, noting that adults may be better able to interpret these attention-maximizing systems than children whose brains are still developing.

Together, the remarks from Starmer and Phillipson suggest the government is trying to build a case that platform engagement mechanics are not neutral design choices. Instead, they may be treated as intentional systems for capturing and extending user attention, with potentially different consequences for minors than for adults.

The emerging debate mirrors a broader international trend in which governments are reassessing how much autonomy tech companies should have in optimizing user retention among children. The UK has not yet announced final rules, but its language indicates that inaction is no longer the preferred stance.

Consultation now, change later

Starmer also said he was open to an under-16 social media ban, similar to measures enacted in Australia, though he stopped short of endorsing that outcome outright. Instead, he emphasized that the current consultation process will guide the government's next steps. Even with that caveat, his message was unmistakable: the status quo will not continue.

He argued that the next generation would not forgive policymakers if they failed to act now. That statement is politically notable because it frames regulation not as a speculative intervention but as a duty of care. The implication is that governments now have enough evidence of risk to justify redesigning the legal relationship between young users and attention-driven platforms.

The debate has also gained momentum from legal developments outside the UK. The comments came after a U.S. case in which Meta and Google were found liable in connection with a woman’s childhood social media addiction, with damages awarded. The companies plan to appeal, but the ruling adds to the pressure on lawmakers elsewhere by reinforcing the idea that addictive product design can carry legal as well as reputational consequences.

Why the focus on features matters

If the UK ultimately targets mechanics like infinite scroll, streaks, or similar retention tools, it would be addressing the business logic behind many social platforms, not merely their edge cases. Such features are deeply tied to engagement metrics, habit formation, and advertising economics. Restricting them for minors could force companies to rethink how youth-facing experiences are designed and monetized.

That does not guarantee a simple regulatory path. Policymakers will have to define what counts as an addictive feature, determine how age-based enforcement would work, and decide whether certain design elements are banned, limited, or subject to default protections. But the political direction is becoming clearer.

The UK government is moving from general concern about children online to a more specific challenge: whether platforms should be allowed to engineer compulsive use patterns in the first place. That is a sharper, more structural question than debates over screen time alone.

For the technology industry, the warning is significant. Design decisions once defended as standard growth tactics are increasingly being recast as public-policy targets. For families and schools, the debate suggests that future regulation may focus less on user discipline and more on constraining the systems competing for children's attention.

Starmer’s message, paired with Phillipson’s, leaves little doubt about the policy trajectory. The consultation is still underway, but the government is already signaling that major changes are likely. In the UK, addictive social media features are no longer being discussed as an unavoidable byproduct of modern apps. They are being discussed as something the state may decide should not be allowed.

This article is based on reporting by The Guardian.