Online Harms Debate
Commons Chamber
Dr Lauren Sullivan (Gravesham) (Lab)
I thank the hon. Member for St Neots and Mid Bedfordshire for securing this important debate.
Online harms are systemic, they are scaled, and they are producing real-world consequences, as we have seen. Social media is now the environment in which young people grow up, and its use is almost universal by the time children enter secondary school. According to a consultation by the Department for Science, Innovation and Technology, 81% of 10 to 12-year-olds are on social media, and 86% have accounts. The Youth Select Committee also conducted a study on youth violence and social media back in 2024, and found that 97% of 13 to 17-year-olds were online and that 70% of them see real-world violence online. Those are a lot of statistics, but they demonstrate one fact: social media is now in every young person’s bedroom, in their hand and in their pocket.
Professor Sarah-Jayne Blakemore from the University of Cambridge told me that adolescent brains are highly sensitive to the social environment, and the social media companies are probably aware of this. Adolescents’ brains have heightened neuroplasticity, and this will continue until their mid or late 20s. During adolescence, young people are trying to find identity and belonging, and I fear that the tech companies are exploiting this.
Where can we see evidence of harm? The National Education Union did a study called “Big Tech’s Little Victims”, in which researchers created fictional accounts and spent half an hour each day on Instagram, TikTok, Snapchat and YouTube. They found that harmful content appeared within three minutes, and often immediately. Young people in my constituency say, “I do not want to see this harmful content anymore,” yet they are still shown it, so what is going on?
The hon. Member for St Neots and Mid Bedfordshire mentioned the “Inside the Rage Machine” documentary, which I have seen a number of times. I am absolutely horrified at what the whistleblowers have revealed.
The hon. Lady is making a very powerful speech about how young people, whose brains are still being formed, are being bombarded with online content. May I just let her know that my hon. Friend is actually the hon. Member for St Neots and Mid Cambridgeshire (Ian Sollom)? When she mentions him again, she might correct that.
Dr Sullivan
My apologies to the hon. Member for St Neots and Mid Cambridgeshire (Ian Sollom).
I was speaking about “Inside the Rage Machine”. What people have witnessed is remarkable. The documentary makers found that serious exploitation cases were not being prioritised by TikTok, and that algorithms were repeatedly pushing harmful content.
It is not as simple as saying that we must ban children from social media; we need a suite of measures. The core issue is that young people, who are forming their identities, are vulnerable. Addictive algorithms are designed to maximise time and engagement, and they prioritise provocation instead of the truth. Louis Theroux’s Netflix documentary on the manosphere is an incredibly powerful and timely contribution to the debate, and he shows us that the online world is like a gold rush in the wild west. The approach of “hook, identity, monetise” drives profits, with streaming platforms like YouTube rewarding people who spout abominable things. There is a business model behind this, and I think we are all very much aware that we need to do something about it.
Harmful content spreads across platforms, so we need to be very clear about the scope of any ban on social media. Last week, the Science, Innovation and Technology Committee looked at the ban in Australia. We learned that because Australia defined which social media companies were included in the ban, other platforms simply took their place. We can learn from that, and it can feed into the Government’s consultation. We have to make the legislation stronger. Bans have limits, because they can be bypassed, as we see in Australia. They also shift the responsibility on to the user. Why can we not shift the responsibility on to the companies? We should not be banning children from social media; we should be banning the companies from exploiting our children.
Gregory Stafford (Farnham and Bordon) (Con)
I support a number of the things that the hon. Lady is saying about the dangers of online harms, especially for children, but I am unclear about her position on a social media ban for those under 16. Although I accept her overall point, which is that social media companies have a responsibility, we could send them a really clear signal, and protect children, by bringing in an immediate ban on under-16s using social media. Does she support that or not?
Dr Sullivan
I welcome that intervention. Action certainly needs to be taken, but I am not sure that a ban would be clear-cut enough, because there are so many ways to get around it. How do we verify whether a person is 16? The emphasis is placed on the young person—the user—who is trying to access the service. As long as the tech company can say, “We have done facial recognition—we have done all that is reasonably possible”, the liability falls on the young person. It should be the other way around, with the responsibility resting on the tech company. The hon. Member may well agree that the tech companies need to be doing more, and that is where the Government consultation on strengthening the regulations needs to come in.
These online harms are not isolated occurrences; they are being designed into platforms, they are being amplified at scale and they are shaping the real world. We must be serious about protecting our young people. We must address the systems and the incentives that are driving this harm, and hold the tech companies to account. The question is, should we be banning children from social media or should we be banning social media from exploiting our children?