Online Harm: Child Protection

Monica Harding Excerpts
Tuesday 24th February 2026


Commons Chamber
Dame Chi Onwurah (Newcastle upon Tyne Central and West) (Lab)

I am grateful to the Liberal Democrats for bringing forward this debate on protecting children from online harms, although I remain uncertain as to the measures they are proposing. This debate is happening up and down the country, in homes and at school gates—indeed, wherever people gather—so it is right that we debate it here. If the Conservatives had done something during their critical 14 years of power, our children would be better protected now, but they did not, so it falls to us to take action.

I am going to speak about three things: online platforms, their history and approach; the work of my Select Committee, the Science, Innovation and Technology Committee, on algorithms; and the work of the Committee on digital childhood, all within the context of protecting children from online harms.

The key online players range in age from pre-teen—TikTok was founded in 2016—to their late 20s, as Google was founded in 1998. In human terms, these platforms are just entering or leaving adolescence, and it shows.

As hon. Members across the House may have heard me mention, I am an engineer—chartered, as it happens; thanks for asking—and my last job before entering this place was head of telecoms technology at Ofcom. I remember meeting people from a US platform, which shall remain nameless, around 2005. The company executive commented that they had come to the UK from Silicon Valley on a six-month contract to sort out Government affairs, and they could not understand why, two years later, discussions were still ongoing. Did we not realise that Government had no role in what they did?

I say that to illustrate that tech platforms have their origins in a libertarian, small-or-no-government tech bro bubble that has spread globally. TikTok, as a Chinese company, has a different background, but public accountability is not necessarily part of it either. Unfortunately for all of us, the Conservative-Lib Dem Government of 2010 and their successors shared the view that Government should not be a part of it, which is how we arrived in 2024—20 years later—without online harms regulation, even as the use of social media and life online exploded. That is why I consider the Tory position in this debate to be a superb example of hypocrisy.

Monica Harding (Esher and Walton) (LD)

The hon. Lady is making a powerful speech about the evolution of social media platforms. I have four children; the first was born in 2004 and the last was born in 2011, so their births have spanned that evolution. Facebook began in 2004; TikTok began in 2016. If that evolution were the industrial revolution, we would be around the spinning jenny stage, with AI chatbots the next destination. Those chatbots are terribly dangerous for our children, and we need to regulate them now. That regulation should sit within the Online Safety Act.

Dame Chi Onwurah

I agree that AI chatbots are a further evolution, and I think that, in how we approach AI, we should learn from the lack of effective regulation under the Conservatives during that critical period in the evolution of the internet. I agree with the hon. Lady that AI chatbots should be brought into the regulatory environment of the Online Safety Act.

--- Later in debate ---
Emily Darlington

I absolutely do. My full sympathy goes to that family in my hon. Friend’s constituency—it is the worst thing in the world for a parent to lose a child. But we have to get this right, which is why it is right that we have a consultation. It does no child any good if we jump to a conclusion that does not actually protect children.

Although I maintain an open mind, I worry about a full ban. Some children rely on social media for connection, often including those who are exploring their sexuality—LGBTQ+ people—and those who are neurodivergent. The consequences for them could be devastating, so we need to consider their views. If young people get around the ban, as they do in Australia, they are less likely to report when they see harmful content or are being targeted on social media, because they worry that they will get in trouble for breaking the law.

A ban would create a cliff edge at 16. No matter the person’s maturity—I have already talked about the different brain development in young women and men—their skills or what they have been taught, there is a cut-off at 16. All of a sudden it does not matter, and they go into a world that is not safe. Younger children do not have their own social media profiles; they use their parents’ devices. Often, they start with a video of Peppa Pig, and all of a sudden—who knows where it ends up? A ban would not address that. So, what is the solution? Doing nothing is not an option—I think the whole House can agree on that.

Monica Harding

I was interested in the hon. Member’s survey. I have done my own very unscientific survey of young people, and all of them seem to want some form of regulation. With that in mind, we must hurry up—does the hon. Member agree?