Thursday 19th March 2026

Commons Chamber
Liz Twist (Blaydon and Consett) (Lab)

I congratulate the hon. Member for St Neots and Mid Cambridgeshire (Ian Sollom) on securing the debate and I thank the Backbench Business Committee for granting it.

Online harm is one of the biggest struggles that our young people face daily, from toxic influencers trying to push a certain way of life or ideology to those who encourage eating disorders. However, social harms extend a long way beyond that, from aggressive algorithms designed so that young people get addicted and trapped online, to forums encouraging self-harm and suicide. Many hon. Members will be aware that I have been raising this issue in this place over a number of years.

Earlier this month, I chaired a roundtable with the Mental Health Foundation to look at the evidence about the banning of social media for under-16s. As it happened, it took place on the same day that the consultation was launched, and I thank my hon. Friend the Minister for attending our meeting. We heard from mental health experts, affected parents, the Minister and young people themselves. While views on “how” we should protect young people are diverse, the consensus on the “now” was absolute.

There is disagreement about whether an absolute social media ban for the under-16s is the right answer. Should we have a more nuanced approach where we look at a wider range of issues such as the architecture of social media platforms? Following the consultation, the Government must design any proposed policy alongside young people. We will not find an effective solution without including the young people who operate in this world and who are most affected, and we must look to social media companies to start getting their act together and protecting people.

The links between young people using social media and increasing levels of loneliness and poor mental health are well documented. We have a youth mental health crisis, with nearly one in five children aged eight to 16 having a probable mental health disorder. That is a staggering number. As we have heard, the Molly Rose Foundation found that 95% of recommended posts on certain teenage accounts contained content related to suicide or self-harm. As Members know, Molly Russell tragically took her own life at the age of just 14, after social media algorithms continuously served her with self-harm and depression material, which created a rabbit hole for her to go down. She saw more than 2,000 harmful posts in the last six months of her life.

The rise of forums where groups encourage extreme forms of violence, self-harm, suicide, animal cruelty and political extremism is extremely worrying. These groups look to target impressionable young people, pipelining sadistic and hateful ideas and content straight to them. UK law enforcement has identified several cases in which perpetrators coerced girls as young as 11 into seriously harming or sexually abusing themselves, their siblings or their pets.

These forums encourage or blackmail users into committing serious acts of self-harm or suicide. One specific pro-suicide forum has been linked to more than 135 deaths in the UK alone. That is 135 empty chairs at dinner tables. These are extreme forms of online harms, and I am glad that the National Crime Agency and international partners are taking them seriously, but do they have the powers and resources to protect young and impressionable people from serious online harm?

This debate on online harms is not new. We have been talking about how to protect people from the online world for many years in this House, and there is a real danger that we are permanently running to catch up with online operators. Experts have described the Online Safety Act as a ceiling for safety, not a floor. In many cases, social media companies are doing little more than their statutory duty. In December last year, not a single platform determined itself to be "high risk" for suicide or self-harm content. We cannot let companies mark their own homework.

Platforms are systematically downplaying harms, and they are incentivised to allow the problem to continue so long as advertisers still deem the platform safe. Ofcom needs the power to assess these companies honestly, and if they are failing, it needs to be able to act. Does the Minister believe that Ofcom has the powers it needs to act, and for that action to be followed through on?

Let me talk quickly about a meeting that the Molly Rose Foundation and affected parents recently had with Ofcom to highlight significant concerns, which included strengthening the Online Safety Act and extending it to cover children’s wellbeing. I am told that the meeting was very unsatisfactory. Will the Minister agree to meet with me and those parents to discuss the situation further? This is important to them; they have lost their children, and they want to do more to protect others.