Social media executives from Meta, Snap, YouTube, TikTok and X have been summoned to Downing Street on Thursday for a crucial meeting with Prime Minister Sir Keir Starmer and Technology Secretary Liz Kendall over online safety for children. The tech bosses will be questioned about the steps they are taking to protect young users and address parents’ concerns, as the government continues its consultation on whether to follow Australia’s lead and impose an outright ban on social media for under-16s. Sir Keir has stressed that the meeting will centre on ensuring “social media companies accept and demonstrate responsibility”, warning that “the consequences of not taking action are severe” and that the government owes it to parents and the next generation to prioritise children’s safety.
The Number 10 Showdown
Thursday’s meeting represents a pivotal moment in the government’s push to hold tech giants accountable for their role in protecting vulnerable young users. The gathering comes at a crucial juncture, with Parliament having rejected calls for an outright ban on social media for under-16s just hours earlier, despite support from the House of Lords. Rather than imposing a broad prohibition, MPs chose to grant ministers powers to introduce their own restrictions, signalling the government’s preference for a more tailored regulatory approach over a comprehensive legislative ban.
The scheduling of the Downing Street summit highlights the government’s determination to appear decisive on online safety whilst navigating complex political and commercial pressures. Professor Gina Neff of the University of Cambridge’s Minderoo Centre for Technology and Democracy said the meeting allows the government to demonstrate it is acting proactively on online harms. Downing Street has previously acknowledged that some services have made progress, introducing measures such as disabling autoplay for children by default and offering parents greater control over screen time, though observers argue significantly more must be done.
- Tech executives questioned on child safety protections and how they address parents’ concerns
- Government weighing restrictions on social media for under-16s, following the Australian model
- MPs rejected a full ban but granted ministers powers to introduce controls
- Some companies have already put safeguards in place, such as disabling autoplay for young users
Parliamentary Rejection and the Wider Discussion
Wednesday evening’s Commons vote proved a blow to campaigners advocating a comprehensive social media ban for under-16s, marking the second occasion on which MPs have rejected such measures despite considerable backing from the upper chamber. The government’s choice to prioritise ministerial discretion over formal legislation reflects a more cautious strategy, with officials contending that a complete prohibition would be premature while policy considerations are ongoing. This approach gives the government room for manoeuvre in designing tailored controls rather than imposing a blanket prohibition that some worry would be hard to enforce and oversee effectively across different platforms.
The rejection has intensified debate about whether the UK is doing enough to protect its young people from digital dangers. Whilst the government argues that giving ministers powers to introduce bespoke rules represents a more sensible solution, critics contend this approach falls short of the decisive intervention the situation demands. Recent research from Australia, where a social media ban for those under 16 came into force in December 2025, found that more than 60 per cent of minors continue to use the platforms regardless, raising serious doubts about the effectiveness of legislative bans and suggesting the challenge goes well beyond simple prohibition.
Cross-Party Criticism
The parliamentary vote has provoked sharp criticism from opposition benches. Conservative shadow education secretary Laura Trott accused Labour MPs of failing parents and children by rejecting the ban, arguing that other nations are waking up to social media’s harmful effects whilst the UK falls behind under the current government. Liberal Democrat education spokeswoman Munira Wilson echoed these concerns, declaring that “the time for half-measures is over” and calling for immediate steps to restrict the most harmful platforms for young users rather than incremental regulatory adjustments.
Australia’s Cautionary Tale
Australia’s experience with social media restrictions offers a cautionary case study for policymakers weighing a comparable approach in the UK. When the country’s ban on social media for those under 16 took effect in December 2025, it was hailed as a significant milestone in protecting young people from online harms. However, new research from the Molly Rose Foundation has uncovered a troubling reality: more than 60 per cent of underage Australians continue to use social media despite the legislative prohibition. This substantial rate of non-compliance suggests that legal prohibitions alone may be insufficient to stop determined young users from accessing the services they want.
The Australian findings hold significant implications for the UK’s continuing policy debate. If a comparable ban were introduced in Britain, the evidence suggests enforcement would present formidable challenges, with young people likely to find ways to circumvent age-verification systems and restrictions through various technical means. The data undermines the argument that a simple legislative prohibition offers a quick fix for digital safety, instead highlighting the need for a more comprehensive approach combining regulation, platform responsibility, parental oversight tools, and digital literacy education to meaningfully address the risks young people face online.
| Key Finding | Implication |
|---|---|
| Over 60% of underage Australians still access social media despite ban | Legislative prohibitions alone cannot effectively prevent determined young users from accessing platforms |
| Ban introduced in December 2025 has failed to achieve widespread compliance | Enforcement mechanisms remain weak and young people find workarounds to restrictions |
| Blanket bans do not address underlying appeal of social media to young people | Multi-faceted approach combining regulation, platform accountability, and education is necessary |
Experts Push for Concrete Steps
Child safety advocates and online protection specialists have stepped up demands for tech companies to take meaningful action beyond self-regulation. The Molly Rose Foundation, established in memory of 14-year-old Molly Russell, who died by suicide after viewing harmful content online, has been particularly vocal in demanding systemic change. Rather than pursuing sweeping prohibitions that prove hard to police, campaigners argue the priority should shift towards holding companies responsible for the systems that push dangerous material to at-risk individuals.
Andy Burrows, chief executive of the Molly Rose Foundation, has stressed that Thursday’s meeting at Downing Street represents a pivotal moment for government action. The charity has consistently argued that social media companies have the technological means to implement strong protections, yet often prioritise engagement metrics over user wellbeing. Experts stress that genuine protection requires platforms to overhaul their algorithmic recommendations, strengthen moderation practices, and provide parents with meaningful tools to monitor their children’s online activity effectively.
The Algorithmic Challenge
At the heart of these concerns lie the algorithmic systems that determine what content younger audiences see. These algorithms are designed to maximise engagement, often pushing sensational, harmful, or addictive content to at-risk groups. Reforming them is one of the most critical challenges in digital safety, requiring platforms to be transparent about how their recommendation engines operate and what safeguards exist.
- Algorithms favour user engagement over user wellbeing and safety
- Platforms must increase transparency about algorithmic recommendation processes
- Independent audits of algorithmic harms are essential for accountability
What Happens Next
Thursday’s summit at Downing Street will set the tone for the government’s approach to online child safety in the months ahead. Following the meeting, Sir Keir Starmer and Liz Kendall are expected to set out their findings and determine whether tech companies’ current voluntary measures suffice or whether more robust legal requirements are needed. The government remains in the midst of its public consultation on whether to introduce an Australia-style ban on social media for under-16s, and the outcome of these discussions is likely to shape the final policy direction.
Ministers have signalled their preference for granting themselves powers to introduce constraints rather than implementing an outright ban, citing concerns about enforceability and impact. However, increasing pressure from opposition MPs, child safety groups, and parents suggests the government may face continued demands for stronger action. The weeks ahead will be crucial in establishing whether digital platforms can show real commitment to safeguarding young people or whether the government will introduce new laws to force compliance with more stringent safety standards.