Online Harm: Child Protection

Matt Rodda Excerpts
Tuesday 24th February 2026

Commons Chamber
Dame Chi Onwurah

I agree that AI chatbots are a further evolution, and I think we should learn from the lack of effective regulation under the Conservatives during that critical period in the evolution of the internet in how we approach AI. I agree with the hon. Lady that AI chatbots should be brought into the regulatory environment of the Online Safety Act.

Matt Rodda (Reading Central) (Lab)

My hon. Friend the Chair of the Select Committee is making an excellent speech. Her background in this area is really showing in the detail with which she is exploring these issues. Part of the challenge here is that we as parents are struggling to catch up with this revolution, which is gaining speed all the time. Perhaps my hon. Friend would highlight some of the challenges that parents face. For me, part of the importance of the consultation is to allow parents to think more deeply about this difficult issue; there are often different opinions from campaigners who have had the most painful experiences.

Dame Chi Onwurah

My hon. Friend makes an excellent point. It is for that exact reason that I support a consultation: this is part of a debate, and we all need to improve our understanding of the impacts of this technology. Parents are in a difficult position. I do not believe parents should have to be technology experts in order to give their children the best start in life, but unfortunately there is so much pressure in the online world that that seems to be the case right now, and that is why it is right that Government take action and consult on the action they take.

Let us think about the evolution of these technologies. I remember that when I joined Facebook in 2005 I had to use my university email address to join—that meant I had to be over 18. Some 20 years later, 13-year-olds and younger are having their lives and brains formed by almost uninhibited access to social media. In the UK, the number of social media users has gone from practically zero to four fifths of the population. I have worked with the Molly Rose Foundation, a charity established by the Russell family after their daughter Molly took her own life at the age of 14 following exposure to self-harm content online; I have spoken to the bereaved parents of children bullied to death online; and I have spoken to the Internet Watch Foundation about the horrendous images its staff see of child exploitation. The fact that the Conservatives did nothing in all those years in government is, in my view, a form of political negligence of the highest order.

As part of my Committee’s inquiry into social media and algorithms, Google, Meta, TikTok and X told us that they accepted their responsibility to be accountable to the British people through Parliament, which I thought was quite a step forward from previous utterances, and ongoing utterances, by some tech billionaires who shall remain nameless. Our inquiry found that our online safety regime should be based on principles that remain sound in the face of technological development. Social media has many important and positive contributions, including helping to democratise access to a public voice and to connect people far and wide, but it also has significant risks—and those risks can evolve with the technology. We spoke about AI as an evolution, and one of the main failings of the Online Safety Act is that it regulates particular services rather than establishing principles that remain true and can be part of a social consensus as technology evolves.

--- Later in debate ---
Emily Darlington (Milton Keynes Central) (Lab)

This week is Eating Disorders Awareness Week, and we must remember the acceleration of online harms. We have heard horrific accounts of ChatGPT giving young people diets of 600 calories per day, which is just appalling. We know the suffering and pain caused by seeing images tagged with the terms “ana”, “thinspiration” and other terms that should go. The promotion of such content is now a category 1 offence, and Ofcom should be weeding it out. The hon. Member for Winchester (Dr Chambers) is absolutely right to say that that measure should be extended to bots.

I thank the Chair of the Science, Innovation and Technology Committee, my hon. Friend the Member for Newcastle upon Tyne Central and West (Dame Chi Onwurah), for her fantastic speech. We have taken this matter seriously since the very beginning of the parliamentary Session, and we have done a lot of work on it. I echo her call for Ministers to look again at the recommendations in our Committee’s “Social media, misinformation and harmful algorithms” report, which goes well beyond misinformation and into how the damage is done.

Protecting our children and young people online is extremely important. The Online Safety Act was an important step forward, but it has not been fully implemented by Ofcom, it is not proactive enough, and it is too dependent on what social media companies themselves tell Ofcom. In the spirit of consultation—I know that we will get to that—I have done my own consultation with 500-plus 14 to 16-year-olds across my Milton Keynes Central constituency. Some 91% of them have a phone, and 80% have social media profiles. However, what will surprise the House is what young people consider social media profiles to be. We consider them to be Facebook or Instagram, while they consider them to be YouTube and Roblox—two organisations not covered by the Australian model. Additionally, 74% of those 14 to 16-year-olds spend two to seven hours online a day. Let me remind the House that, at that age, the brain development of young women is close to finished, while for young men, whose brain development does not finish until they are about 25, it is nowhere near complete. We know that from the science—just to be clear, that is not an opinion. Brain development in young women and girls happens differently, so should we therefore have different rules for young women and men?

Fifty-nine per cent of the 14 to 16-year-olds have been contacted by strangers, and more than a third of those contacts were through Roblox, which is not covered by the Australian social media ban. Thirty-three per cent have been bullied, and a third of those cases were on Roblox. The Australian social media ban—which I assume is what the Liberal Democrats are talking about when they say they are in favour of a ban—does not cover YouTube or Roblox, and we have not even looked at whether it is effective. A ban is a blunt tool that essentially raises the flag of surrender to social media platforms and declares that there is no way of making social media safe. That is essentially what the Conservatives did when the Online Safety Act 2023 was passed: they said, “We cannot go far enough, so we are going to roll back. It is about free speech.” No, it is not about free speech. Freedom of speech was written into law in this country and spread around the world, so we understand how to protect it and limit its harm. The Online Safety Act was a missed opportunity. It also took seven years to get through this House, but we do not have seven years to wait.

There would also be unintended consequences to a ban. I had the pleasure of meeting Ian Russell the other night, and we had a really powerful discussion. My heart goes out to him, as one parent to another, given what his family have been through. He does not jump to the easy solution of a social media ban. The Molly Rose Foundation has done a brilliant briefing paper, which every MP should read, about why it does not support a ban: it wants the online world to be safe for children, but a ban does not make it so.

Matt Rodda

My hon. Friend is making an excellent speech. I commend her work in reaching out to young people; it sounds superb. The lesson may be that we should all do exactly that. I am running a survey myself. She mentioned the Molly Rose Foundation, and I have met some of its staff to discuss its work. A family in my constituency of Reading suffered a terrible incident—their son was murdered in an incident of online bullying—and they have a different view. Does my hon. Friend agree that it is important that we properly listen to the families and consider the different views in the consultation?

Emily Darlington

I absolutely do. My full sympathy goes to that family in my hon. Friend’s constituency—it is the worst thing in the world for a parent to lose a child. But we have to get this right, which is why it is right that we have a consultation. It does no child any good if we jump to a conclusion that does not actually protect children.

Although I maintain an open mind, I worry about a full ban. Some children rely on social media for connection, often including those who are exploring their sexuality—LGBTQ+ young people—and those who are neurodivergent. The consequences for them could be devastating, so we need to consider their views. If young people get around the ban, as they do in Australia, they are less likely to report it when they see harmful content or are targeted on social media, because they worry that they will get into trouble for breaking the law.

A ban would create a cliff edge at 16. No matter the person’s maturity—I have already talked about the different brain development in young women and men—their skills or what they have been taught, there is a cut-off at 16. All of a sudden it does not matter, and they go into a world that is not safe. Younger children do not have their own social media profiles; they use their parents’ devices. Often, they start with a video of Peppa Pig, and all of a sudden—who knows where it ends up? A ban would not address that. So, what is the solution? Doing nothing is not an option—I think the whole House can agree on that.