Online Harm: Child Protection

Sam Carling Excerpts
Tuesday 24th February 2026

Commons Chamber
Julia Lopez

I respect the hon. Member’s intervention for its politeness, but I do not think the answer is suddenly to encourage all children who are finding it hard to find purposeful and meaningful activities in the real world to retreat to their bedrooms. One of the challenges we have seen is that children have felt that the online space is the most stimulating for them. Unfortunately, that has led to an even greater retreat from the real world, and I think we can all recognise that that has been a negative for society.

Sam Carling (North West Cambridgeshire) (Lab)

The hon. Lady has been very clear that she wished the Government had just charged forward in some direction or other. I have had hundreds of constituents email me about this, from various perspectives and various concerns about the workability of certain solutions. I would like to listen to them, and I think it would be really helpful if the opposition parties tried to do likewise and to engage with this process, rather than just criticising whatever approach we take.

Julia Lopez

I appreciate where the hon. Member is coming from. I do not think it is wrong to seek evidence and ask for people’s views, but the Prime Minister should be honest about what he wants to do. The problem is that he has been floating various opinions, and he is being buffeted by Labour MPs and by the Opposition and others. If he does not think this is the right approach, he should feel confident in saying so. He has said a whole range of different things about this, and the Government are seeking to launch a consultation, but nobody actually knows what precisely is being consulted on.

If Labour MPs were honest with themselves, I think they would recognise that. I suspect they are having very serious conversations with the party’s Whips, saying, “Well, actually, we would like to know what the Prime Minister does think about this issue, because we’re not convinced by this consultation—we think it’s kicking the issue into the long grass, and we’re worried about the length of time that will mean before we get legislation to protect children from various challenges online.” That is the very reason why the Minister has stood up before them today to say, “We are probably going to do something—very definitely, maybe—in the summer.” He is saying that because the pressure is growing from Labour MPs. It is being briefed out that the Government are going to bring forward amendments to the Bill because they are being buffeted into doing so.

The problem is that nobody knows what this Prime Minister believes. On every single issue for the Government at the moment, and despite the very large Labour majority, this Prime Minister is being buffeted around, and that is the problem.

--- Later in debate ---
Sam Carling (North West Cambridgeshire) (Lab)

I commend the hon. Member for Twickenham (Munira Wilson) on bringing forward this debate, which is a really valuable opportunity to talk about this issue. I also thank the many hundreds of my own constituents who have written to me about this from a variety of perspectives—if I have not got back to them yet, I will do so shortly.

Social media has rightly been described as a wild west. I come to this debate as someone who grew up with it—it has been there all my life—but who thoroughly dislikes traditional social media. Were it not for the importance of it in my job, I would spend very little time looking at it.

We must start by clarifying what problem we are trying to solve when we talk about online harms. The way I see it, there are three main categories. First, there is harmful online content itself. Algorithms are feeding people things that they never asked for, and the evidence that misogynistic and other extreme and deeply wrong content is being pushed on to people is overwhelming.

Secondly, there is the online grooming of children. Everyone knows how serious an issue that is, particularly on some online gaming platforms aimed at younger children, such as Roblox, which has been mentioned previously. It is so bad, and the reaction of the relevant company is so poor, that vigilantes are now active on some of these platforms, conducting sting operations to catch paedophiles. Appallingly, Roblox responded to one such user, who has a YouTube channel under the name Schlep and got six child predators arrested, by banning him from the platform and threatening legal action. Clearly we cannot encourage vigilantism, but if that is the platform’s response when someone is trying to deal with their own failures, something is deeply wrong.

There is also the problem of addictive content. That has got far worse in recent years, with the rise of short-form content and the algorithms that fuel it. Apps such as YouTube can in some circumstances automatically default to their “shorts” function when opened, in order to be maximally addictive. Other addictive features are rampant, such as Snapchat streaks, which encourage children to open the app first thing in the morning and last thing at night to keep them going.

As the hon. Member for Bath (Wera Hobhouse) just mentioned, there are rampant drug-dealing problems on Snapchat. Some of that is due to the way that it recommends friends to people. There are accounts with the most obvious pseudonyms imaginable, such as “snowforsale”, which clearly signal, “Add this account if you want to buy drugs from someone.” So little action is being taken on that issue.

Snapchat is not the only platform with this problem; it is rampant on Instagram as well. Quite recently, I came across an account that was very clearly selling marijuana-infused food. I reported it, and Instagram did absolutely nothing. There is a real complacency in these companies, and a lack of willingness to act, that we have to deal with.

I should mention the rise of AI-generated fake content, designed either to mislead people or to keep them hooked by showing fantastical things that do not work in reality. There is also the related issue of faked content more generally. There are horrendous examples of viral fake cooking recipes that do not work and could cause serious harm to people, such as by encouraging them to use a microwave in a way that could create something explosive. Online content creators such as Ann Reardon, an Australian YouTube creator, are doing amazing work to call that out and try to educate people, but the platforms do not have their backs. In many cases, the platforms are actively undermining those creators’ work, because the content they are trying to deal with is what generates the most money for the platforms, due to its addictive nature.

I was very impressed by a video that my hon. Friend the Member for Bangor Aberconwy (Claire Hughes) put out the other day, exposing—if memory serves—a situation in which people working for an estate agent were recording videos in the homes of people from ethnic minorities, then packaging the videos to make it look as though the residents were asylum seekers and saying, “Look at the great lives that asylum seekers have.” I encourage everyone to have a look at that video. The way that that content has been able to propagate online is atrocious, and I am so glad that my hon. Friend has been able to call it out.

All of these issues point to a situation that cannot go on. However, like my hon. Friend the Member for Milton Keynes Central (Emily Darlington), I am concerned that trying to solve them in one fell swoop with a ban for young people will not work, and could make some of the issues worse. Young people are incredibly digitally literate and digitally agile, and I am afraid to say that when a platform becomes unavailable to them, they can rapidly switch to another. A recent letter—I was glad to hear the Minister mention it in his speech—signed by the NSPCC, the Centre of Expertise on Child Sexual Abuse, the Molly Rose Foundation and a long list of other child safety experts raised exactly that concern, and referred to blanket bans as

“a blunt response that fails to address the successive shortcomings of tech companies and governments to act decisively and sooner.”

In my view, taking a named-platform approach to a ban is unworkable. I fear that young people and Ofcom will end up in a perpetual game of whack-a-mole, with children moving to other apps as Ofcom tries to follow and shut them down. I know which side I would have my money on in that game; our regulator is nowhere near quick enough. In March last year, I spoke in this Chamber about an app that I was aware of others using when I was a teenager, which essentially functions as a dating app for children but masquerades as social media. The app I refer to has now finally had its age limit increased to 18, but it took 10 years.

This is not just about Ofcom, either—I do not wish to criticise just Ofcom. Regardless of how quickly a regulator moves, I guarantee that our country’s children will move more quickly. Their digital literacy is far higher than they are often given credit for, which will make it much harder to regulate platforms and deal with harms. It will likely become easier for groomers to hide and to find victims, while it will become harder to regulate addictive features and to take action on harmful content. Even if that turns out to be less of an issue than I fear, arbitrarily setting the age at 16 just shifts the cliff edge that we already have to deal with—it does not teach people to deal with and recognise the problems. Frankly, the problems that social media creates are by no means limited to young people. I talked previously about AI-generated and faked content; it is primarily older people who are struggling to identify that content and are not equipped with some of the necessary skills.

In my view, blanket bans also risk serious damage to children aged between 13 and 16, for whom the ability to connect with others online is particularly important. Let me give a very personal example. Around the age of 13, I started to realise that my sexuality was not like that of those around me—I was not straight—but I was living in a rural community where there was not really anyone else to talk to about that who would understand. It was made worse by some of my early childhood being immersed in a deeply homophobic religious community. As such, finding people online with similar experiences to talk to, and being able to provide mutual support, was incredibly valuable for me, as it is for other LGBT people, as well as for neurodiverse children and others.

So what do I believe is the solution? We need a functionality-based approach. Through limiting certain functions and features, I am of the view that we can deal with harms without creating a situation where children—they are going to seek out ways to connect online regardless of the law—move on to less regulated platforms. In my view, we should look to restrict addictive functionality on those platforms; that might be linked to age, or it might be something we want to consider for people of all ages, because as I said, it is affecting people of all ages. The explosion of addictive, algorithmically driven short-form content over the past few years, as well as features such as Snapchat streaks that are actively designed to keep people hooked, provides no discernible benefit to society. Social media companies have proven unwilling to act on that front, so we must.

We also need to enforce existing age restrictions much better. We all know that plenty of people under 13—which is usually accepted as the current limit—are already using social media platforms that they should not be using. Earlier, my hon. Friend the Member for Milton Keynes Central mentioned the example of children on their parents’ accounts, which is so widespread.

There is a definitional problem about what social media actually is. Are we counting online gaming platforms such as Roblox? That can be joined from age five. There are ways to prevent children from accessing chat functions—parents can prevent younger children from doing so—but with the continuing prevalence of child abusers on the platform, those measures are clearly not working. We need to be very clear on what we mean by social media, because of how much the definitions differ.

To conclude, I really agree with the Government’s approach in opening a meaningful consultation—a national conversation—on how we tackle online harms and on where the pitfalls in workability are, so that we can identify and deal with them. I look forward to engaging further with it alongside my constituents.

Several hon. Members rose—