Parliamentary committee tells ministers online safety regime is failing children and warns 'no action is not an option'
British MPs are urging the government to tighten online safety laws, arguing social media companies should face the same kind of scrutiny as other products linked to serious harm.
In a letter to Liz Kendall and Kanishka Narayan, shared with The Register, the UK's Science, Innovation and Technology Committee said there is now "strong and consistent evidence" linking social media use to harms affecting young people and warned that "no action is not an option."
The committee, chaired by Chi Onwurah, said the current system leaves social media companies free to grow their youth user bases while avoiding meaningful responsibility for the subsequent fallout.
"The status quo, where social media companies are neither accountable nor responsible for preventing harms, isn't acceptable," Onwurah said. "If any other consumer product caused these harms, it would've been recalled or changed."
The intervention forms part of the government's "Growing up in the online world" consultation and follows a March evidence session examining arguments for and against restricting social media access for under-16s.
The committee said it heard evidence from clinicians, bereaved parents, academics, child safety groups, and experts studying Australia's social media age limits, as well as accounts from young people and families concerned about harmful content and the effect social media is having on children's wellbeing.
While the MPs stopped short of explicitly endorsing a blanket social media ban for teenagers, the letter makes clear the committee believes ministers have relied too long on voluntary action from platforms whose business models still reward engagement above almost everything else.
The committee said existing age restrictions should be properly enforced using "effective and privacy-preserving" age verification systems – rather than checks that can be bypassed by a drawn-on mustache – and called for stronger legal obligations requiring companies to filter illegal content and to block children from viewing harmful material.
The letter also revisits the committee's earlier concerns about recommendation algorithms and how platforms deal with harmful and illegal posts, areas where MPs say previous proposals for reform went nowhere.
MPs are now urging ministers to revisit those recommendations and bring forward fresh online safety legislation in the next parliamentary session.
Particular attention was paid to algorithms and addictive design features. The committee argued that infinite scrolling and similar engagement mechanics should be designed out of platforms entirely, and warned that social media companies cannot keep pretending they are passive hosts while their recommendation systems actively shape what users see.
The letter also warned that gaps in the UK's Online Safety Act mean some AI chatbots operating on closed databases currently fall outside the regime, something MPs said must be fixed before the next generation of online platforms disappears into yet another regulatory blind spot. ®