I call Ian Sollom, who will speak for up to 15 minutes.
Ian Sollom (St Neots and Mid Cambridgeshire) (LD)
I beg to move,
That this House believes that current legislation is falling short in preventing online harms; and calls on the Government to review whether it is necessary to introduce new legislation that is centred around harm reduction in this Parliament.
I thank the Backbench Business Committee for granting this debate. Not long after my election in 2024, I visited the Internet Watch Foundation in Cambridgeshire. That organisation is on the frontline of the fight against child sexual abuse material, and is one of only a handful of non-law enforcement bodies worldwide with the legal power to proactively seek out and remove online images and videos of such abuse. During my visit, the IWF told me that, in the preceding five years alone, it had taken down more than 1 million webpages that showed at least one child sexual abuse image—often, they showed hundreds or thousands. The IWF’s annual report last year revealed that 2025 was the worst year on record for child sexual abuse material. Its analysts confirmed 312,000 reports—a 7% rise on the year before. Most starkly, in 2024 they discovered 13 AI-generated videos of child sexual abuse, but in 2025 the figure was 3,440—a rise of over 26,000%, for those who are interested in numbers. Nearly two thirds of those videos were category A material, which is the most extreme classification.
A little while after my visit, I began to work with the Molly Rose Foundation on the proposal in this motion. At the time, the Online Safety Act 2023 had been in law for nearly two years, and the protection of children codes of practice that came from it, which promised to improve user safety dramatically, had just been published and implemented. The text of those codes was heavily criticised by civil society, and even by the Children’s Commissioner, who said they would simply not be strong enough to protect children from the
“multitude of harms they are exposed to online every day.”
It seemed timely for a motion to be brought before the House so that we could scrutinise the Online Safety Act and its resultant codes, as they now are being used in practice, and highlight to the Government the need to take action in this Parliament to protect young people. After the codes were implemented in mid-2025, the Mental Health Foundation published research stating that 68% of young people had experienced harmful content online. It described the harm as one of
“the biggest looming threats to young people’s mental health”.
In October 2025, the Molly Rose Foundation found that over a third of children reported that they had been exposed to at least one type of high-risk content in the past week. In a classroom of 30 children, that is 11 who, in the past week alone, were shown content that promotes suicide and self-harm or that romanticises depression and eating disorders. That is the exact “primary priority content” that the UK’s flagship piece of online safety legislation explicitly promised it would protect them from. Just this week, the BBC aired “Inside the Rage Machine”, which used whistleblower testimony and evidence to lay bare how social media giants such as Meta and TikTok are consistently and deliberately pushing harmful content to users, having found that users’ outrage fuels engagement.
All of that is to say that if the motion for this debate seemed appropriate at the beginning of this Parliament, when I first visited the IWF, it is now urgent. Every week, I hear from parents, young people and organisations who are fighting a losing battle against the proliferation of online harms because, despite its noble aims, the current legislation is falling short of what Parliament envisaged it would do.
Leigh Ingham (Stafford) (Lab)
Last week, I ran a supermarket surgery in my constituency. I had a flip chart that asked whether people felt that social media should be banned for under-16s. It is rare to get this level of agreement, but 78% of my constituents of all ages—older people, young people and even children—said yes. What was consistent was the fear they felt about this space and the belief that it is doing damage to young people as they grow up. I am not 100% sure of my position yet, but does the hon. Member agree that the Government are right to consult to work out the best option to protect young people from social media?
Ian Sollom
The text of the motion asks for a review, and that is certainly what I want to see.
I have not come here today to stir up panic or to imply that the wellbeing of our children, or indeed our adults, is doomed. There is hope and we should not have to accept harm as a reality of life on the internet. As the Molly Rose Foundation chief executive officer, Andy Burrows, noted this week after campaigning pushed both TikTok and Meta to row back on plans for end-to-end encryption in direct messaging,
“tech firms are not immune to pressure”.
However, pressure on its own is not enough. The Government must urgently look at strengthening the Online Safety Act to ensure that pressure has robust legislative backing behind it, and that Ofcom actually has the power to enforce the regulations that will protect us all from harm.
Online harm comes in three forms. First, there is harmful content: the outright illegal and the extreme, posted and peddled by bad actors across social media platforms. Then we have harmful interactions with bad actors, including grooming, cyber-bullying and extortion. I am sure that Members across the House will share many stories of the impact of both types of harm today; it is a tragedy just how many there are. I want to focus on the third form of online harm, which is the harm that arises from not just the type of content encountered online, but the intensity with which it is repeatedly pushed on to young people by the platforms themselves.
This week, I was pleased to participate in the Royal Society pairing scheme. I was paired with Dr Lizzy Winstone, a researcher from the University of Bristol whose work focuses on how young people use social media and its impact on their mental health. Her most recent research investigates the algorithmic recommendation of content as one of the primary mechanisms that shapes young people’s digital mental health. She and others have found that a large part of online harm is structural, arising from not just individual bad actors, but business models designed at their very core to maximise attention and to profit from provocation.
Social media is built to be addictive. Hooking users in and keeping them engaged is at the very heart of almost every platform’s business model. Algorithmic models cause harm through both overtly harmful content and content that is harmless on the face of it. There are attention deficit harms caused by passive screen watching and health harms associated with an increasingly sedentary lifestyle. Higher social media use has been directly linked to shorter sleep duration and difficulties with sleep onset. Gambling harm is often overlooked, but a recent Guardian investigation found that Meta AI was pointing vulnerable social media users to illegal online casinos and even suggesting ways to bypass UK gambling safeguards. Regulation is clearly not keeping pace with the evolving digital landscape.
Often, it is the directly harmful, even illegal, content that is caught up in these algorithms. The shock, disgust and strong emotion inevitably caused by this content create engagement: we watch for longer, we engage more, and the algorithm takes this as permission to show us even more of it to keep us hooked. Endless scrolling functionalities allow already vulnerable users to fall into a world where there is no escape from this cycle. Members will be aware that we Liberal Democrats have long called for platforms to implement built-in caps on social media doomscrolling.
In 2022, a coroner concluded for the first time ever that content on social media had contributed to the death of a young person: teenager Molly Russell, who tragically took her own life in 2017. Before she died, she had viewed thousands of suicide and self-harm videos and images on Pinterest and Instagram, some of which were pushed to her without her asking to see them. In the coroner’s words, Molly was able—even encouraged by the platforms—to “binge” on this content.
The normalisation of these recommendation mechanisms has created an awful, self-perpetuating cycle. One case study from the University of Bristol described a 17-year-old girl who was forcing herself to repeatedly watch graphic content of a gory accident on TikTok to try to desensitise herself to violence. She knew that she would be regularly exposed to this kind of content online and wanted to train herself to be able to watch it and not feel sick. We can only assume that due to her increased attention, she was shown even more of this horrific content.
Recommendation systems in and of themselves are no bad thing. They create a personalised space to explore interests and sometimes do filter out content that a user has no interest in. The problem is that a user’s engagement with content does not always indicate their actual interest in it. Another young person from the University of Bristol study—a trans man—described feeling compelled to intervene in homophobic and transphobic comments sections, to try to support his community and challenge prejudice. He was understood by the platform to have engaged, and subsequently he was bombarded with more and more of the same hateful content. The tension between knowing that his algorithm would register his intervention as interest and wanting to actively challenge hateful views was a constant source of stress online.
Problems also arise from a lack of transparency. Not only are social media platforms under no obligation to publish their algorithms, but with AI increasingly being used to build and continually iterate those algorithms, the platforms themselves are often unaware of the exact mechanisms that shape users’ experience. Harm is occurring as a result of an unaccountable black box. Young people are not entirely passive in this system—they know it is happening—but platform tools provide very limited control over what the algorithm continues to recommend.
Looking at Ofcom’s summary of the protection of children codes of practice, we can see how a weak interpretation of the Online Safety Act is allowing such harm to be perpetuated. Volume 4, section 17 says that platforms must
“Ensure content recommender systems are designed and operated so that content indicated potentially to be PPC”—
primary priority content, which is suicide, self-harm, eating disorders and mental health content—
“is excluded from the recommender feeds of children”.
Research shows that children were most likely to report having seen harmful content through feeds with recommender systems—very few actively seek it out—so the intention behind this measure seems good. But then we see that it applies only to “child-accessible” parts of a service that are
“medium or high risk for one or more specific kinds of PPC”.
In Ofcom’s December review, not a single social media platform rated itself high risk for suicide or self-harm content. There is a clear gap between the intention of the legislation and how it is being implemented. That is because the Online Safety Act and its codes are ultimately built around compliance and not harm reduction. Rules-based legislation means that platforms can happily meet their legal duties if measures in the codes are followed, and they are under no obligation to effectively and proactively address the harms identified in their risk assessments. Putting only a moral duty on platforms to protect young people from harm is not going to work—we have seen for years that it does not work.
How can we expect the very same platforms that have been shown to deliberately and knowingly peddle harmful content to young people to essentially police themselves? Why would they bother when it is so much more profitable to tick already loosely defined boxes? A full review of the current legislation must investigate the barriers that Ofcom says are preventing it from delivering on the intentions of Parliament. That includes the safe harbour principle, which allows platforms to claim compliance and skirt enforcement action on harms about which they are already aware, and the complete lack of any obligation in the Act that platforms take active steps to reduce the risk of harm to users. In practice, that means that a platform can follow Ofcom’s codes to the letter, even while its own risk assessment shows that it is aware of serious ongoing harm, and face no enforcement consequences.
Amendments could be passed within months to introduce the robust, risk-based minimum age limits that we Liberal Democrats have been calling for. Minimum joining ages should be determined by a platform-specific assessment of age appropriateness and risk. That would incentivise the market to adopt lower-risk functionalities if platforms wish to open themselves to a wider pool of users.
We could argue that a review of sorts has already taken place: every coroner’s report, every tragic story told in the Chamber and every investigation by charities and organisations makes up that review. The evidence is plainly there, but the harm is being allowed to continue. We are here as Members of Parliament to scrutinise, and we have done that. There have been 12 debates with the words “online safety” in the title this Parliament and there have been hundreds of references to “online harm”, yet there has been little indication that the Government are addressing the core issues raised in this debate.
I hope that Members will use this debate to raise the full range of harms we hear about in our work. I ask the Minister to respond specifically to these questions: will the Government examine whether the safe harbour principle is serving Parliament’s original intentions or has become a mechanism that platforms use to avoid accountability for harms about which they are already aware? Will the Government commit to ensuring that any new legislation this Parliament brings forward is built around harm reduction and not compliance?
I will now introduce a time limit of six minutes.
I congratulate the hon. Member for St Neots and Mid Cambridgeshire (Ian Sollom) on securing the debate and I thank the Backbench Business Committee for granting it.
Harm online is one of the biggest struggles that our young people face daily, from toxic influencers trying to push a certain way of life or ideology to those who encourage eating disorders. However, these harms extend a long way beyond that, from aggressive algorithms designed so that young people get addicted and trapped online, to forums encouraging self-harm and suicide. Many hon. Members will be aware that I have been raising this issue in this place over a number of years.
Earlier this month, I chaired a roundtable with the Mental Health Foundation to look at the evidence about the banning of social media for under-16s. As it happened, it took place on the same day that the consultation was launched, and I thank my hon. Friend the Minister for attending our meeting. We heard from mental health experts, affected parents, the Minister and young people themselves. While views on “how” we should protect young people are diverse, the consensus on the “now” was absolute.
There is disagreement about whether an absolute social media ban for the under-16s is the right answer. Should we have a more nuanced approach where we look at a wider range of issues such as the architecture of social media platforms? Following the consultation, the Government must design any proposed policy alongside young people. We will not find an effective solution without including the young people who operate in this world and who are most affected, and we must look to social media companies to start getting their act together and protecting people.
The links between young people using social media and increasing levels of loneliness and poor mental health are well documented. We have a youth mental health crisis, with nearly one in five children aged eight to 16 having a probable mental health disorder. That is a staggering number. As we have heard, the Molly Rose Foundation found that 95% of recommended posts on certain teenage accounts contained content related to suicide or self-harm. As Members know, Molly Russell tragically took her own life at the age of just 14, after social media algorithms continuously served her with self-harm and depression material, which created a rabbit hole for her to go down. She saw more than 2,000 harmful posts in the last six months of her life.
The rise of forums where groups encourage extreme forms of violence, self-harm, suicide, animal cruelty and political extremism is extremely worrying. They look to target impressionable young people, pipelining sadistic and hateful ideas and content straight to them. UK law enforcement identified several cases in which perpetrators coerced girls as young as 11 into seriously harming or sexually abusing themselves, their siblings or their pets.
These forums encourage or blackmail users into committing serious acts of self-harm or suicide. One specific pro-suicide forum has been linked to more than 135 deaths in the UK alone. That is 135 empty chairs at dinner tables. These are extreme forms of online harms, and I am glad that the National Crime Agency and international partners are taking them seriously, but do they have the powers and resources to protect young and impressionable people from serious online harm?
This debate on online harms is not new. We have been talking about how to protect people from the online world over many years in this House, and there is a real danger that we are permanently running to catch up with online operators. Experts described the Online Safety Act as a ceiling for safety, not a floor. In many cases, social media companies are doing little more than their statutory duty. In December last year, not a single platform determined itself to be “high risk” for suicide or self-harm content. We cannot let companies mark their own homework.
Platforms are systematically downplaying harms, and are incentivised to allow the problem to go on if advertisers still deem the platform safe. Ofcom needs to have the power to mark these companies honestly, and if they are failing, it needs to be able to act. Does the Minister believe that Ofcom has the powers that it needs to act and for that to be followed through on?
Let me talk quickly about a meeting that the Molly Rose Foundation and affected parents recently had with Ofcom to highlight significant concerns, including the need to strengthen the Online Safety Act and to extend it to cover children’s wellbeing. I am told that the meeting was very unsatisfactory. Will the Minister agree to meet with me and those parents to discuss the situation further? This is important to them; they have lost their children, and they want to do more to protect others.
I congratulate my hon. Friend the Member for St Neots and Mid Cambridgeshire (Ian Sollom) on securing this debate, and I thank the Backbench Business Committee for granting it.
There is no shortage of online harms demanding our attention. I have spoken before about children buying illegal drugs that are openly advertised on social media, the flood of harmful eating disorder content reaching young people, and Ofcom not holding social media companies to account, although it has increased powers to do so under the Online Safety Act. Today, though, I want to focus on another deeply disturbing trend. Men are secretly filming women on nights out and profiting by posting the videos online. These accounts mask themselves as “nightlife content” or “walking tours”, but the videos tell a completely different story; they fixate on women in dresses and skirts, often filmed from behind and from low or intrusive angles. These women have not consented—in most cases, they do not know that they are being filmed—and the scale is staggering. The BBC found that videos such as these have been viewed more than 3 billion times in just three years.
Once they have been uploaded, the abuse begins, with comment after comment dripping with misogyny:
“Look at how these ladies are dressed, no wonder they get attacked”
followed by a laughing emoji,
“They belong to the streets”,
and “Easy meat”. Hundreds of misogynistic comments like these flood the replies beneath nearly every video. This vile practice has victims, and the impact is real. Women who have been filmed in this way say that they no longer feel safe to go out; they feel watched, exposed, vulnerable, distressed and harassed. They no longer enjoy a night out or being in public. We must be clear that secretly filming women in this way is deeply degrading and predatory and must be stopped.
Right now, the law is failing. In 2024, a man was arrested on suspicion of stalking and harassment for this kind of behaviour, but no further action was taken due to limitations in the current legislation. As of now, there is no provision in law to prosecute for covert filming of this nature. This abuse sits in a legal grey area between several different crimes, including voyeurism and harassment, giving this type of video the space to grow. Existing voyeurism offences are framed around private acts or taking intimate images. Harassment laws were not designed to address the recording and mass distribution of this kind of content, so perpetrators slip through the cracks and the problem grows.
My Liberal Democrat colleagues in the other place tabled an amendment to the Crime and Policing Bill that would have created a specific criminal offence of secretly filming someone without their consent for sexual gratification, or to humiliate or distress them. The Government’s view was that this amendment was too broad. Yes, we must protect the freedom to film in public and legitimate journalism, but we cannot allow that to become an excuse for inaction, because right now women are being targeted, filmed and broadcast to millions without protection. Something must change.
We should look at harassment laws, including how the Public Order Act 1986 can be strengthened to tackle sex-based harassment, both offline and online, because harassment does not stop on the streets—it continues online, often indefinitely. It should be an offence to record and distribute footage of someone without their consent when they are targeted because of their sex and that material is used to objectify and humiliate them and subject them to misogynistic abuse. Women should not have to wonder every time they go out whether they will wake up the next morning to find themselves plastered across the internet in the most distressing and degrading way. It is not beyond the Government’s power to fix this issue, and I urge them to listen and to close this gap in the law, to protect women from this vile misogynistic harassment.
Dr Lauren Sullivan (Gravesham) (Lab)
I thank the hon. Member for St Neots and Mid Bedfordshire for securing this important debate.
Online harms are systemic, they are scaled, and they are producing real-world consequences, as we have seen. Social media is now the environment in which young people grow up—it is almost universal when children enter secondary school. According to a consultation by the Department for Science, Innovation and Technology, 81% of 10 to 12-year-olds are on social media, and 86% have accounts. The Youth Select Committee also did a study on youth violence and social media back in 2024, and found that 97% of 13 to 17-year-olds were online and that 70% of them see real-world violence online. Those are a lot of statistics, but they demonstrate that social media is now in every young person’s bedroom, in their hand and in their pocket.
Professor Sarah-Jayne Blakemore from the University of Cambridge told me that adolescent brains are highly sensitive to the social environment, and the social media companies are probably aware of this. Adolescents’ brains have heightened neuroplasticity, and this will continue until their mid or late 20s. During adolescence, young people are trying to find identity and belonging, and I fear that the tech companies are exploiting this.
Where can we see evidence of harm? The National Education Union did a study called “Big Tech’s Little Victims”, in which researchers created fictional accounts and spent half an hour each day on Instagram, TikTok, Snapchat and YouTube. They found that harmful content appeared within three minutes, and often immediately. Young people in my constituency say, “I do not want to see this harmful content anymore,” yet they are still shown it, so what is going on?
The hon. Member for St Neots and Mid Bedfordshire mentioned the “Inside the Rage Machine” documentary, which I have seen a number of times. I am absolutely horrified at what the whistleblowers have revealed.
The hon. Lady is making a very powerful speech about how young people, whose brains are still being formed, are being bombarded with online content. May I just let her know that my hon. Friend is actually the hon. Member for St Neots and Mid Cambridgeshire (Ian Sollom)? When she mentions him again, she might correct that.
Dr Sullivan
My apologies to the hon. Member for St Neots and Mid Cambridgeshire (Ian Sollom).
I was speaking about “Inside the Rage Machine”. What the whistleblowers have witnessed is remarkable. The documentary makers found that serious exploitation cases were not being prioritised by TikTok, and that algorithms were repeatedly pushing harmful content.
It is not as simple as saying that we must ban children from social media; we need a suite of measures. The core issue is that young people, who are forming their identities, are vulnerable. Addictive algorithms are designed to maximise time and engagement, and they prioritise provocation instead of the truth. Louis Theroux’s Netflix documentary on the manosphere is an incredibly powerful and timely contribution to the debate, and he shows us that the online world is like a gold rush in the wild west. The approach of “hook, identity, monetise” drives profits, with streaming platforms like YouTube rewarding people who spout abominable things. There is a business model behind this, and I think we are all very much aware that we need to do something about it.
Harmful content spreads across platforms, so we need to be very clear about any ban on social media. Last week, the Science, Innovation and Technology Committee looked at the ban in Australia. We learned that because Australia defined which social media companies were to be included, other companies took their place. We can learn from that and it can feed into the Government’s consultation. We have to make the legislation stronger. Bans have limits, because they can be bypassed, as we see in Australia. They also shift the responsibility to the user. Why can we not shift the responsibility to the companies? We should not be banning children from social media; we should be banning the companies from exploiting our children.
Gregory Stafford (Farnham and Bordon) (Con)
I support a number of the things that the hon. Lady is saying about the dangers of online harms, especially for children, but I am unclear about her position on a social media ban for those under 16. Although I accept her overall point, which is that social media companies have a responsibility, we could send them a really clear signal, and protect children, by bringing in an immediate ban on under-16s using social media. Does she support that or not?
Dr Sullivan
I welcome that intervention. Initially, action needs to be taken, but I am not sure whether a ban would be clear-cut enough, because there are so many ways to get around it. How do we verify whether a person is 16? The emphasis is being put on the young person—the user—who is trying to access that service. As long as the tech company can say, “We have done facial recognition—we have done all that is reasonably possible”, the liability is on the young person. It should be the other way around, with the responsibility being on the tech company. The hon. Member may well agree that the tech companies need to be doing more, and that is where the Government consultation on strengthening the regulations needs to come in.
These online harms are not isolated occurrences; they are being designed into platforms, they are being amplified at scale and they are shaping the real world. We must be serious about protecting our young people. We must address the systems and the incentives that are driving this harm, and hold the tech companies to account. The question is, should we be banning children from social media or should we be banning social media from exploiting our children?
Mrs Elsie Blundell (Heywood and Middleton North) (Lab)
I thank the hon. Member for St Neots and Mid Cambridgeshire (Ian Sollom) for securing this crucial debate. Since my election, constituents in Heywood and Middleton North have repeatedly raised issues about online harms, especially as they see those who control the platforms seeking to shirk accountability at every turn. That is why we cannot discount the significance of the Online Safety Act. That critical piece of legislation—the first of its kind in putting a range of new duties on social media companies and search engines to mitigate the harms that the online world can pose to our constituents—was a welcome step taken by the previous Government and implemented by this Labour Government.
Perhaps to a greater extent than in any other area of policy, we must recognise that the frontiers of online media are constantly expanding, technology is evolving, and our daily life is increasingly determined by what takes place on phones, laptops and tablets. Though the Act was immensely welcome—it goes some way towards dealing with this complex set of challenges—we cannot wait another 20 years before we come to substantively revisit this topic.
To underscore why constant adaptation to these threats is necessary, I would like to touch on three themes. First, there is the proliferation of misinformation and disinformation. Everything from the integrity of our democracy and the tone of our discourse to our continued belief in facts, evidence and science is on the line in the war being waged unrelentingly in these digital spaces, where online actors are determined to amplify falsehoods to erode a sense of public trust that has taken generations to foster. The meteoric rise of AI has made the challenge all the more pressing.
People’s behaviour is being tracked on apps, and algorithms responding to them are driving misleading and sensationalist content into the most impressionable, vulnerable and isolated minds—so many of them are young people who are growing up unable to tell fact from fiction. We know that adults are also susceptible to such trends.
This week—of all weeks, when we have seen a deeply concerning outbreak of meningitis in Canterbury and east Kent—we see misinformation and blatantly anti-science positioning rear their ugly heads once again, as we saw in the covid-19 pandemic and have seen countless times since. It is a really obvious thing to say, but the onus is on us to speak with one voice as MPs on such a critical topic as public health and to confront those harmful narratives at their source.
A great deal more thinking needs to be done in digital spaces when it comes to misinformation, whether medical or otherwise. That requires strengthened regulation and real intent from the Government, Ofcom and the platforms. I am pleased that the Online Safety Act has provisions to capture misinformation and disinformation where they are illegal or harmful to children, but we have much further to go in curtailing the weaponisation of online platforms to spread lies, conspiracies and harmful falsehoods to millions across the country.
Secondly, I would like to speak about the protection of children. I have raised the issue of technology-assisted child sexual abuse (TACSA) on several occasions in this place. It needs to be tackled from both sides—the judicial and the digital—so I wholly welcome the Online Safety Act and the Government’s wider work in this area. From stopping companies like X, or AI tools like Grok, generating vile, sexualised images of children and non-consensual, intimate deepfakes to the commitment to ban nudification apps and to introduce a legal duty requiring tech platforms to remove non-consensual intimate images within 48 hours of being posted, it is clear that the Government stand firmly against those who would do our children harm.
That being said, TACSA also has further dimensions that warrant serious consideration. It can take many forms, such as the distribution of child sexual abuse material, sexual harassment, exposure to sexually explicit materials and grooming, to name a few. Despite the prevalence and seriousness of these crimes, there is an over-reliance on non-custodial sentences across our judicial landscape, with magistrates’ courts dominating outcomes, and gaps in the unduly lenient sentencing scheme. Online or technology-assisted child sexual abuse has profound and lasting impacts on children for their whole lives, comparable to those of physical abuse. Digital regulation and our justice system must reflect the insidiousness and seriousness of such crimes, and I would welcome the Minister’s comments on that when he concludes the debate.
Finally, I will briefly touch on how discourse in digital spaces is increasingly affecting our communities. Following the Manchester synagogue attack last year, the Centre for Countering Digital Hate identified a troubling rise in antisemitism online, where violence against the Jewish community was celebrated and further encouraged. We need only open X, Facebook or other platforms to see a disgraceful barrage of abuse levelled at our Muslim community too, with platforms giving previously fringe far-right voices the means to amplify their dangerous and divisive rhetoric to millions. The harm that these actors can inflict on the capacity of our communities to come together is being played out each and every day. All too often they can hide behind anonymous accounts, and real people—my constituents and people across the country—are having to face the consequences. I am proud to represent a diverse constituency, but I fear the power that those online have to direct actions and attitudes in real life. I hope that the Minister will touch on that pertinent topic.
I welcome this Government’s efforts to curtail online harms. Indeed, I welcome the work of any Government in doing so. Things, however, are moving at a staggering rate. We therefore cannot view the Online Safety Act 2023 simply as a job well done; rather, we should see it as another rung on a growing ladder. To keep our constituents—especially children—and our communities safe, we need to ensure that our thinking is consistent with the expanding nature of these digital spaces. Ultimately, that means recognising that, for all their utility in connecting us with one another, these platforms also have a near unlimited capacity to do people harm. I truly fear the consequences of failing to recognise that.
Melanie Ward (Cowdenbeath and Kirkcaldy) (Lab)
It is an understatement to say that the internet and social media have changed everything. The early optimism of internet pioneers was that we would all benefit from a world in which all information was at our fingertips. In many respects, they were not wrong, and rapid technological advancement has massively improved our lives, whether that is significant developments in healthcare, easier communication with friends and family, or online banking, which is a real benefit to many people here in the UK. We have also seen the benefit across the world—in humanitarian crises, for example, where cash transfers are increasingly used as part of the humanitarian response. It is much safer and easier to make those happen from a laptop or someone’s mobile phone, rather than having to helicopter huge sums of cash through war zones or refugee camps, which is what happens without that ability.
The fact that the online world has amplified everything means just that: almost everything, no matter how sinister or extreme, is available to us and, most distressingly of all, to our children. Not only is it available to us, but algorithms designed to push extreme content mean that violent, misogynistic, racist, antisemitic, Islamophobic and other hateful content is winning the battle for our attention and causing real harm. It is no longer just in ideological echo chambers. Algorithms and the introduction of suggested content that is pushed at the user mean that such content has permeated youth culture and taken over many of the spaces where young people communicate with each other and the language that they use. It is now just as easy—if not easier—to tune in to extremist content online as it is to watch cartoons, go to the park or go to a house party, and that has real-life impacts in our constituencies.
In Cowdenbeath in my constituency, antisocial behaviour is a real issue. Tomorrow, I will hold a second meeting on antisocial behaviour, following an antisocial behaviour summit I held in December. We have found that social media is having a real impact by encouraging more extreme behaviour between young people, because it is filmed and shared online. Local headteachers also report the impact of apps like Snapchat as a real factor in bullying between schoolchildren.
We know that this is a global problem. Radical and violent groups profit from the recruitment to their online causes of young men in particular, pushing violence and very real threats to our democracy, including ISIS in the middle east, the Proud Boys in the United States and Yoon Suk Yeol, whose misogynistic platform was a factor in his election as President of South Korea and the attempted insurrection in 2024 for which he is now serving a life sentence. The truth is that the big tech companies are so obsessed with outdoing each other to profit from the attention of our children or other vulnerable people that they have ignored their responsibility to keep them and our communities safe, and to prevent people from being exposed unwittingly to the most horrific material.
Madam Deputy Speaker, I am about to mention another hon. Member, who is not present, and I just want to confirm that I have notified him in advance. Too many people have been prepared to sacrifice the safety and cohesion of our communities for the right price. This week, an investigation showed that the leader of Reform has been paid to take extreme political positions on the Cameo app. According to The Guardian, the hon. Member for Clacton (Nigel Farage) took money to call for the release of P. Diddy and of a Honduran drug trafficker, to support a rioter, to repeat extremist slogans and to endorse a neo-Nazi event. Members of the public will be able to draw their own conclusions from that kind of behaviour.
Too often, action to prevent harmful content is too slow. In March last year, when new powers in the Online Safety Act came into force, I wrote to Ofcom requesting that action be taken, using that Act, against a website that actively encourages its users to die by suicide—I will not name the site for obvious reasons. Ofcom launched an investigation of the site, but as of last month it had still taken only a provisional decision against it. I promise hon. Members that spending five minutes on the site would tell them immediately that it has no place in our country and no place online at all. It is shocking that action has not been taken. Tragically, since the illegal harms code came into force last year, the deaths of two more people have been linked to that site. Does the Minister agree that Ofcom is far too slow in responding to sites like this, and will he please take that up with Ofcom?
There are so many reasons why I am glad that our Government are taking steps to consult on a social media ban for under-16s. To be clear, I support such a ban.
Adam Jogee (Newcastle-under-Lyme) (Lab)
We are enjoying my hon. Friend’s speech; she has a number of pages left, and we want to hear all of it. She rightly talks about the potential ban for under-16s. I was at Newcastle academy last week, and a number of young people said that they would feel much safer if such a ban were imposed, so I would like to add my support to hers.
Melanie Ward
I thank my hon. Friend for his intervention. I would add that I was recently at a primary school in my constituency, and I asked the young people—a class of 10 and 11-year-olds—how many of them were on social media; almost all of them were. However, when I talked to them about how social media work, I found that everybody had different rules for what they were allowed to do and when they were allowed to go on social media. It was clear that their parents are trying really hard to regulate their children’s access to social media.
The reasons why I want us to act by banning social media for under-16s include not only the impact on young people, which I have tried to lay out, but the struggle parents face because social media companies cannot behave properly. I saw a survey showing that one third of parents had cried because of the stress of trying to manage their children’s access to social media and online content. To me, this is about backing parents as well as about keeping our young people safe online and in the real world.
We banned the sale of alcohol to under-18s in 1923, and we banned the sale of tobacco to those under 16 in 1908. I very much hope that future generations will look on our Parliament as the Parliament that finally took action to prevent the public health risk and the real-life harm that is addictive social media and extremist content in the hands of children, as well as in the hands of so many vulnerable young people. We must act now. The safety and wellbeing of our children is at risk.
Several hon. Members rose—
Order. Before we move on to the next speaker, I remind Members to use extreme caution when imputing motives to other Members. I think the hon. Lady probably just about stayed on the correct side of the line.
It is a pleasure to follow my hon. Friend the Member for Cowdenbeath and Kirkcaldy (Melanie Ward). I have been very impressed and moved by the quality of the speeches from across the House. I really do appreciate the hon. Member for St Neots and Mid Cambridgeshire (Ian Sollom) securing today’s important debate.
I want to touch on two specific aspects of this issue: first, to try to explain the awful impact of these cases, based on the case of a constituent of mine who was sadly killed as a result of online bullying; and secondly, to address some of the wider issues—my hon. Friend the Member for Cowdenbeath and Kirkcaldy made very good points about the enormous difficulties that parents face—and to ask the Minister to give us some indication of the Government’s direction of travel.
First, I will explain the case in Reading, which some Members may know about but others may not. My constituent Olly Stephens had just turned 13 years old when he was stabbed and brutally murdered by two other boys in a local park just yards from his house on the outskirts of the town. The attack arose from online bullying and was heavily linked to the sharing of images of knives online. None of us can imagine the impact on his parents, Stuart and Amanda Stephens, and what they have been through. They are now incredibly powerful and determined campaigners against online harms. They have worked with Ian Russell and many other families. They have been able to explain some of the horror of what happens in these dreadful incidents. It is worth explaining a little about their views on regulating social media.
I want to highlight the point at which the attack on Olly happened: it was before the Online Safety Act became law. However, some of the same issues still appear to be taking place. The two boys who carried out the attack, who were 13 and 14 years old at the time—it happened in 2021—had both seen videos and other images of knives on 11 different social media platforms. They had seen them repeatedly, and none of the companies responsible for those platforms had taken any of that content down. These young people had been bombarded with these images and were sharing them. They were sharing pictures of knives and teenagers playing with knives in bedrooms. That may have influenced their behaviour. It is the most awful thing.
Stuart and Amanda have tried very hard to raise awareness of the different aspects of this issue: the huge dangers of knife crime, the dangers of online bullying, the dangers of social media, and the effect of social media on young people. I know them very well as constituents. They have talked to me very powerfully about the way in which their son was addicted to his phone—they tried to take it off him and he threatened to run away. They believe he was being groomed through all sorts of other things that were happening online. It is absolutely shocking to see it from their perspective.
Their experience is different from some of the other cases we have heard about today. We have heard some very powerful stories from other colleagues about issues in their own constituencies, or others they have come across—in particular from my hon. Friend the Member for Gravesham (Dr Sullivan), with her work in relation to suicide. I have also come across that issue, which is absolutely appalling. I had the privilege—although it was a very difficult thing to do—of attending an event run by the Molly Rose Foundation, at which people were shown videos of some of the content Molly had been exposed to. It was quite shocking: the deluge of content, its repetitive nature as the algorithms targeted a vulnerable young person, and—as my hon. Friend the Member for Gravesham rightly said—the way that young people are particularly vulnerable to these terrible images. However, we need to think very carefully—and this is the other point I ask the Minister to address—about the difficulty of responding to all of that.
I have listened with great interest, and I totally understand why Stuart and Amanda would like to see a complete ban on social media for under-16s. There is a powerful case for that. I am not completely convinced, however, because I know that the Russell family take a different view and that, as my hon. Friend the Member for Gravesham said, there are practical issues around the risk of companies being able to circumvent some of those measures.
I hope that when the Minister responds, he can give an early indication of some of the issues that are being discussed in the consultation. That is important work being led by the Government and it is extremely difficult. It is great that Australia and other countries have already taken some action. Hopefully we can learn from their experience, build on what they have done and take things even further in our country to do even more to protect vulnerable young people and, indeed, vulnerable adults—the hon. Member for Bath (Wera Hobhouse) spoke about some of the appalling things involving adults as well.
A specific aspect that is particularly challenging for many of us, as parents, is that this area is evolving so rapidly and it is very difficult for many to keep up. In fact, the point made by my hon. Friend the Member for Cowdenbeath and Kirkcaldy about the need for parents to be reassured that they were doing the right thing and about the difficulty of finding the right way forward was very powerful. We need to think about how we can help parents, schools and other places where young people are.
Gregory Stafford
Unlike the hon. Gentleman, I am very convinced of the need for a social media ban. That is why I welcome the Leader of the Opposition’s stance on that. On his point about communities, schools and parents, if we do not go for a full ban, there are some technologies that could be used. I think of Jason in my constituency, who runs a company called Orbiri. He is looking to set up communities, where a school—maybe a class or a whole school—can set the parameters for usage time and the sites and apps that are used, so parents do not feel that they are alone but are part of a wider community, all working together to limit and control the social media usage of their children. Does the hon. Gentleman agree that something like that would help?
The hon. Gentleman makes an excellent point. The other thing to consider is that there would be a risk to older teenagers—those over 16—if the ban for under-16s were imposed. We may need to look at a number of complementary, but different, measures, as my hon. Friend the Member for Gravesham also mentioned. I thank the hon. Gentleman for his intervention, and the Minister might want to reflect on the work done by the company in his constituency.
To conclude, it has been a privilege to speak today. This is an extremely difficult subject. It is wonderful that the House has been able to discuss it in some detail this afternoon, and I look forward to the Minister responding in a little bit more detail. I realise that the consultation is under way. When he looks into this further, can he take submissions from MPs, where we have been carrying out our own, local work? I have done that, with a local consultation that is a mini version of the Government’s one. A very high proportion of people who responded wanted to see firm action. There is a range of views on what that might be, but there is certainly a serious intent to change things.
Paul Waugh (Rochdale) (Lab/Co-op)
The latest Louis Theroux documentary for Netflix, “Inside the Manosphere”, was deeply shocking to many of us who watched it. But it was not remotely shocking for the millions of teenagers to whom his subjects are well known. It was not shocking to my three twenty-something sons; it was not shocking to the boys in the playground; it was not shocking to Gen Z or Gen Alpha; and it was not shocking for children in primary schools, let alone in secondary schools.
That is why this online harms debate should involve everyone, particularly the young people in whose name and on whose behalf we often make laws in this place. Their synapses are dulled to this stuff and their feeds are full of it, which in turn means that the premium for even more shock is higher. Outrage and extremism are hardwired into this business model.
“Inside the Manosphere” exposed that many of these social media influencers are themselves deeply damaged boys, often with a resentment about fathers who were either absent or violent, or both. They project themselves as pro-men, but in doing so they feel the need to project themselves as anti-women. And they are not just anti-women—that is a mild term—they are virulently, disgustingly misogynistic. They feed off the pornography that, sadly, is seen by all too many of our young boys these days.
What also shocked me, however, as my hon. Friend the Member for Heywood and Middleton North (Mrs Blundell) pointed out, was just how casual the antisemitism propagated by many of those in the manosphere was.
We saw a chap called Myron Gaines say,
“LOUIS IS A DIRTY J-E-W.”
Louis Theroux is not Jewish, by the way—not that that matters. At one point, another manosphere influencer, Harrison Sullivan, imitates Louis Theroux and leers that he is
“just sat there with his Jew fingers.”
Another of the manosphere influencers blames Jews for feminism, homosexuality and even
“vibrations that are going to negatively bring you down”.
In the conspiracy theory-ridden rabbit hole of the internet, all this is normalised. I thank the Antisemitism Policy Trust for its work in exposing just how much this vile racism has exploded online, and Elon Musk and X share responsibility for much of that. We must take much tougher action against tech giants who are literally profiting from this hatred. Antisemitism is often described as the oldest hatred, but misogyny is just as ancient a hatred. That is why I am proud to be part of a Labour Government who stood up to Grok and Musk when they flouted British laws and put British women and children at risk with those nudification apps.
I am equally proud that my party has been calling out Reform—none of whose Members is present today—for its pledge to repeal the Online Safety Act. I would like to know which protections for children Reform MPs would remove and what, if anything, they would put in their place.
I would also like to know why George Galloway’s Workers party took £5,000 in political donations at the last general election—an election in which I took part in Rochdale—from Andrew Tate’s brother, Tristan.
Can I quickly take the hon. Gentleman back to when he said he was proud of the action his Labour Government have taken? For a long time while they were in opposition, his colleagues advocated making misogyny a hate crime. I assume it was in their manifesto, but I am not quite clear about that. He mentions misogyny as one of the vile things that happen all the time in the manosphere. Why does he not press his Government more to make it a hate crime?
Paul Waugh
The Minister for Safeguarding, my hon. Friend the Member for Birmingham Yardley (Jess Phillips), has repeatedly emphasised the need to crack down on and outlaw misogyny, as have many of my colleagues. There is definitely more work to do on that, but it is a key part of our violence against women and girls strategy.
It was a pleasure to meet the Smartphone Free Childhood campaign last week—including Zack George, aka Steel from “Gladiators”, whom many Members will also have met—to hear why we need further action to protect our kids from the harm that social media can cause. As the hon. Member for St Neots and Mid Cambridgeshire (Ian Sollom) has already mentioned, harm arises not only from content, but from design features such as algorithmic amplification and endless scroll—features that go beyond a simple age-based ban.
We need to help parents who are desperate for support in combating the daily nightmare of wresting back control from their children’s phones and computers. Suicidal ideation, self-harm, pornography, animal cruelty, child sex abuse, anti-Muslim hatred and anti-Jewish hatred are all things that we want to protect our youngsters from seeing online, but we feel powerless in the face of the outrage economy. It is time to end that sense of powerlessness.
Like the hon. Member for St Neots and Mid Cambridgeshire, I want to praise the BBC’s recent documentary “Inside the Rage Machine”, which reported whistleblowers claiming that Meta made decisions to allow more harmful content on people’s feeds simply because internal research into its algorithms showed that outrage fuelled engagement and monetisation. A TikTok employee gave the BBC rare access to the company’s internal user complaints dashboards, as well as other evidence of staff being instructed to prioritise several cases involving politicians rather than a series of reports of harmful posts featuring children.
I would like to promote the great work that Rochdale borough safeguarding children partnership does to allow parents to access the right tools to protect their children. Other councils across the country are doing similarly great work—solutions are at hand. The Government’s new media literacy action plan should help us all to build resilience against hatred, and the Education Secretary’s recent guidance to schools to be phone free was very welcome indeed.
The Government’s consultation on social media is another huge step forward in creating a healthy relationship between children and the internet. We need to test all the options presented in the consultation so that decisions can be truly evidence based and delivery can be rolled out as effectively as possible. We need to balance the upsides of life online for young people—the friendship groups, the specialised help, and the need to protect free speech—against the very clear downsides.
Finally, we also need to address the offline issues that are often turbo-charged online. For example, why is it that these guys in the manosphere are so popular in the first place? There is the provocation, the riskiness, the sophisticated editing, the addictive nature of their output, the justification that it is “just jokes”, and the get-rich-quick con merchantry of it all. We need to ask how we can provide alternative role models for our boys and young men. How can we help their mental health? How can we repair their trauma? How can we tackle the lack of fulfilling jobs, careers and housing that is so often at the root of scapegoating—whether that is the scapegoating of women, Jews, Muslims, migrants, or their own lack of opportunities?
I call the Liberal Democrat spokesperson.
Dr Danny Chambers (Winchester) (LD)
I thank my hon. Friend the Member for St Neots and Mid Cambridgeshire (Ian Sollom) for securing this debate. His introduction was eloquent and his knowledge of the subject very evident.
I will be honest with the House: when I first saw the title of this debate, I was not quite sure what to focus on or where to start. Everyone here has raised different issues. Do we start with addictive algorithms, underage children being able to access pornographic content, non-consensual image editing, financial scams or medical misinformation? My hon. Friend the Member for Bath (Wera Hobhouse) pointed to the harassment of people being filmed without their consent on the street. This is an absolute wild west, and we have not even mentioned electoral interference by foreign powers.
Part of this wild west is AI chatbots, which I have spoken about on several occasions in this House. It surprised me to learn that a third of adults have relied on an AI chatbot for mental health advice or support, or for advice on a life choice. It is also concerning that one in four teenagers has done the same. That is not helped by the fact that over a million people are on NHS waiting lists for mental health treatment; those people are looking for other options. Although these chatbots could be useful if designed in the right way, the concern is that they are unregulated, and the medical advice they give can be unsafe, even dangerous. We know that some people with eating disorders are getting advice on low-calorie diets and on how to access weight-loss drugs, which are completely inappropriate for them. My hon. Friend the Member for St Neots and Mid Cambridgeshire spoke about children using chatbots to get around the safeguards of online gambling companies. That is hugely dangerous.
Then we come back to the general social media that people have been talking about. When I speak to people in Winchester, I hear that parents want action on social media, teachers want action on social media, and even many young people, especially teenagers, say that they think social media is dangerous and damaging. Many of them actually want us to take action on it as well.
The Liberal Democrats have been working hard for the past few months to develop a position on this. My hon. Friends the Members for Harpenden and Berkhamsted (Victoria Collins) and for Twickenham (Munira Wilson) have spent months engaging with experts, charities and other organisations to come up with robust, evidence-based positions to help tackle this issue. What we propose is an age rating, like those we see for films and video games, for platforms with addictive algorithms, platforms that allow children to make contact with strangers or strangers to make contact with them, and platforms that show inappropriate content. If we ban specific platforms, others simply pop up and we end up in a sort of regulatory whack-a-mole where we cannot keep on top of it all. A principles-based approach, grounded in the harm that can be caused, can be applied to current and future platforms alike.
To be clear, those proposals, if applied today, would effectively result in a social media ban, because current social media platforms would not be suitable for children under 16 years old. However, there is nothing to stop technology companies creating online spaces that do not have dangerous, addictive algorithms, that do not show inappropriate content, and that do not allow strangers to contact children. They could create useful spaces where children can connect and help each other with their homework.
One reason that we do not support the Conservatives’ headline-grabbing proposal of a complete ban on social media is that it would remove children’s ability to use Wikipedia to do their homework. It would also mean that children under 16 were not allowed to use WhatsApp, so they would be kicked out of the family WhatsApp group, and we know how many families rely on the WhatsApp group to run their lives.
We need intelligent, proactive regulation that is fit for purpose. It is not just a matter of announcing policies that chase headlines and taking quick political positions to get a hit in the media. This is such an entrenched problem, and we need cross-party support to tackle it in a meaningful way.
I think we all agree that scrolling is the new smoking. As with smoking, we already know the dangers. With smoking, we knew for decades that it was harmful and addictive, and especially harmful to children. We know what happens: the risks are downplayed by lobbyists and the big companies, and the debate ends up shaped by misinformation and industry lobbying—and it is happening again. These new technologies, which sit at the heart of people’s emotional lives, are still subject to remarkably little scrutiny.
One day we may look back on the unregulated social media landscape and AI chatbots in the same way that we now look back at smoking. We will say that we knew the risks and we knew that action had to be taken, but we waited too long and people suffered, even died—and it was preventable. These are not abstract concerns. Poorly regulated social media and AI are some of the most pressing emerging public health threats.
I really would urge the Minister to agree to meet me and my Liberal Democrat colleagues to discuss the proposals we have come up with. They are backed by organisations such as the National Society for the Prevention of Cruelty to Children. They are powerful yet nuanced enough to have a genuine impact in this area, and they could be implemented immediately.
Dr Ben Spencer (Runnymede and Weybridge) (Con)
I congratulate the hon. Member for St Neots and Mid Cambridgeshire (Ian Sollom) on securing this debate. As with so many debates over recent months, it has shown that online harms are a matter of paramount importance to Members on both sides of the House and in the other place. As was to be expected, every Member who spoke focused on online harms; I think only one Member, the hon. Member for Cowdenbeath and Kirkcaldy (Melanie Ward), spoke about some of the positives of the internet age.
I would usually say that it has been a pleasure to listen to and take part in the debate, but in this case it has not, because the subject matter has been pretty grim. We have had a comprehensive tour of all the horrors that young people and adults are exposed to on the internet, and we have heard how important it is for our society and our country that we tackle them.
I am very proud of the steps that His Majesty’s official Opposition took in government to make the online environment a safer place, from bringing the Online Safety Act into force to the commendable and tenacious work of my noble Friends in the other place, especially Baroness Owen and Baroness Bertin, who are the staunchest of advocates for protecting women and girls from digital forms of abuse. Members will know that Baroness Owen secured important amendments to the Sexual Offences Act 2003 to criminalise the solicitation of sexually explicit deepfake images. Baroness Bertin’s report and campaigning have resulted in amendments being tabled to the Crime and Policing Bill concerning nudification apps. That is by no means the extent of their important work.
The aim of the Conservatives’ Online Safety Act was to build an environment in which adults could access legal content freely and children would enjoy greater protections. I welcome in particular the entry into force last year of Ofcom’s protection of children codes. I also welcome the enforcement action that Ofcom has already taken under the Act against file-sharing sites disseminating child sexual abuse material and against pornography sites that have failed to put in place highly effective age-assurance measures to prevent children from accessing their content. However, we know that concerns regarding children’s social media use go well beyond content that is explicitly harmful and subject to restrictions under the Online Safety Act.
As a result of addictive algorithms that drive excessive use and unhealthy patterns of behaviour, parents across the country are rightly concerned about their children’s social and emotional development. That is why we called for a social media ban for children under the age of 16. This month, the Government regrettably voted down amendments to the Children’s Wellbeing and Schools Bill, which were secured in the other place by my noble Friend Lord Nash, to bring such a ban into effect. In response to pressure from His Majesty’s loyal Opposition and other Members across the House, the Government have now launched their own consultation on measures to restrict access to social media for under 16s, alongside several other online safety matters. If the Government had accepted our amendment to the Data (Use and Access) Act 2025, such a review would be well under way by now, and we would be closer to a solution on this generationally important issue, but we are where we are. Consultation is no substitute for action, and I sincerely hope that the Government will deliver on the timescales set out for responding and bringing in measures after their consultation concludes in May.
As with any rapidly evolving technology, social media and other online tools spawn new apps and sites that pose novel threats and demand a response from Government and regulators. We have seen this most recently with AI chatbots, some of which may fall outside the scope of the regulatory framework while promoting self-harm content to young people. A particular harm that I have raised with Ministers and Ofcom, and in which there has been a disturbing increase, is the use of AI chatbots to obtain medical or other advice that should be sought from regulated professionals. The hon. Member for Winchester (Dr Chambers) mentioned that in his speech. Last year, the Medicines and Healthcare products Regulatory Agency established a national commission, which ran a call for evidence on the suitability of the UK’s framework for regulating AI in healthcare. The call for evidence closed last month, and I look forward to seeing the commission’s conclusions and the Government’s proposals for dealing with the risks that will no doubt be highlighted.
A fundamentally important aspect of online harm that has attracted comparatively little media attention and debating time in Parliament is the threat to democracy from online disinformation campaigns perpetrated by hostile state actors and their affiliates. The Science, Innovation and Technology Committee reported last year that online foreign interference and disinformation campaigns are putting UK citizens at risk. We also had credible reports last year of Iranian state-backed digital interference in the 2014 Scottish independence referendum. The risk posed by that type of activity is intensifying, as artificial intelligence tools provide the capability to generate deepfake content purporting to represent politicians or campaigns, amplified by foreign, hostile, state-controlled bot accounts.
As people—particularly young people—increasingly obtain their news online, it is more important than ever that we consider the potential of digital watermarking tools to demonstrate the provenance and authenticity of content published on the internet. The danger posed by such disinformation is only likely to grow as geopolitical tensions rise. In their report on artificial intelligence and copyright, published yesterday, the Government recognised the risks posed by digital replicas and deepfakes in spreading convincing disinformation online, and committed to exploring options to address the growing problem. The Government also discussed the need to label AI-generated content to make disinformation easier for users to spot.
Will the Minister provide timescales for that further work, and an update on the Government’s broader strategy for countering AI-generated democratic interference material? What role does the Minister think digital watermarking tools will play in countering the proliferation and impact of deepfake videos and content?
The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Kanishka Narayan)
I thank the hon. Member for St Neots and Mid Cambridgeshire (Ian Sollom) for bringing this important debate to the House. A number of hon. Members have mentioned bereaved families, and I want to pay tribute to all those families. Ian Russell—with whom I have had a series of meetings, including this morning—Stuart and Amanda Stephens, Ellen Roome and so many others have gone through the most horrific of tragedies, and despite that, they have consistently fought for appropriate action for other families. I carry them in my heart and mind when I think about the prospect of online safety regulation doing justice to future generations of children in this country.
I am grateful to the hon. Member for St Neots and Mid Cambridgeshire and to the other Members who made contributions on this important topic. In the interest of time, I propose to respond to them individually before talking about the wider context. First and foremost, I thank the hon. Member for St Neots and Mid Cambridgeshire for taking stock of progress so far on the child safety and illegal content duties. He will be aware that Ofcom is due to report this year on content harmful to children and on progress on that question; I understand the report is due by October, and I look forward to its findings so that we can assess where we can go further still.
The only other thing I will flag to the hon. Member for St Neots and Mid Cambridgeshire is that the national consultation we have launched on children’s wellbeing includes the question of functionality limitations. The functionalities that he talked about—algorithmic recommendations and the structural aspects that make parts of social media particularly harmful to children—will be in scope. I would very much welcome his submissions on that as well.
I thank my hon. Friend the Member for Blaydon and Consett (Liz Twist) for her consistent advocacy on this question, and for the roundtable she held with the Mental Health Foundation and the Molly Rose Foundation, which I was glad to attend. I thank her for not just shining a light and keeping a consistent focus across the House on the scale of the problem, but flagging the diversity of views on how we should tackle it most effectively. I have been in schools pretty much every week since the launch of the consultation. I was with young people just this morning, and I will be in a school next week. She is right to raise the diversity and depth of views held on how we act, not whether we act.
My hon. Friend the Member for Blaydon and Consett raised concerns about the suicide forum, which my hon. Friend the Member for Cowdenbeath and Kirkcaldy (Melanie Ward) also mentioned. I share those concerns, and I have engaged with Ofcom to ensure that it is acting quickly and robustly. I had a meeting with one of the bereaved families just this morning. I will continue to ensure that Ofcom does everything it can with the powers it has, and that we continue to look at any further powers required to ensure we act robustly to prevent any such incidents happening again. I would, of course, be delighted to meet my hon. Friend the Member for Blaydon and Consett to continue that conversation.
I have had the privilege of engaging with the hon. Member for Bath (Wera Hobhouse) on the illegal sale of drugs; I know that she has been, quite rightly, actively advocating on that question. She will be aware that it has been deemed a priority offence, and Ofcom is closely monitoring compliance. I know there is more to do; she has made that point very firmly to me. I can also inform her that the National Crime Agency is looking to identify offenders operating online, both nationally and internationally. She made a very important point on covert filming, and we will take what she raised into consideration. Systems designed to remove non-consensual intimate images will now have to do so within 48 hours of such images being posted online, and I will continue to look at the implementation of that measure once it comes into force.
My hon. Friend the Member for Gravesham (Dr Sullivan) raised very important points about the impact of social media usage on brain development, which is one motivating factor for our consultation. We are looking not just at acute harms, but at the chronic impact over time of engagement on social media. I am grateful to her for raising the point that there is a suite of options that might be appropriate. I very much share her intent that, at the heart of it, the action we take should make platforms, not young people, responsible for the harms occurring online.
I thank my hon. Friend the Member for Heywood and Middleton North (Mrs Blundell) for advocating on the questions of misinformation and community cohesion, both in her community and nationally. On her point about misinformation and the erosion of public trust, which was also made by the shadow Minister, the hon. Member for Runnymede and Weybridge (Dr Spencer), there is a very clear foreign interference offence in the Online Safety Act. I will continue to look at the implementation of that. Alongside that, I serve on the defending democracy taskforce with the Security Minister. This is a priority question that we have been looking at. I will continue to ensure that we do more to press the enforcement of existing law and to look at where we can go further still.
Both my hon. Friend the Member for Heywood and Middleton North and my hon. Friend the Member for Rochdale (Paul Waugh) raised important points about community cohesion, and how we must use online experiences not to divide but to unite our communities. In that context, we have taken a series of initiatives on media literacy to support the ability to sift fact from fiction across our communities. The foreign interference provisions in the Online Safety Act are also a key vector of enforcement against the causes described.
On antisocial behaviour, I would be interested, in the light of the consultation, to hear from my hon. Friend the Member for Cowdenbeath and Kirkcaldy about where the headteachers and young people she has engaged with think we ought to go. I agree with her on the divisive impacts, and we will continue to look not just at illegal content but at how we empower users in relation to divisive content that, individually, might be legal but, collectively, ends up being deeply harmful to community cohesion, as well as to democratic integrity.
My hon. Friend the Member for Reading Central (Matt Rodda) reaffirmed the point that he has made to me in person about this issue. I pay tribute again to Stuart and Amanda Stephens, who have gone through the most horrific tragedy in their family. I am deeply grateful for their grit and resilience through it, and for my hon. Friend’s advocacy alongside them. He asked me for a sense of direction on where the consultation is going. I will not pre-empt its substantive content, but we have had almost 25,000 responses—I hope and expect that this will be the most engaged-with consultation in the history of British national consultations—including thousands of young people. We have designed a dedicated version of the consultation for young people as well as one for parents and carers. I am keen to hear my hon. Friend’s views from his engagement, as well as those of other Members.
My hon. Friend the Member for Rochdale raised very important points about the documentary “Inside the Rage Machine”, the growing problem of misogyny in this country and this Government’s priority of tackling violence against women and girls. He will be aware that in December, we published our landmark cross-Government violence against women and girls strategy. That was the underpinning force for our making cyber-flashing and intimate image abuse priority offences in this country, for banning the creation of nudification apps, and for banning people from creating and sharing such content, and it is why we are going further still in ensuring that such content is taken down robustly and quickly, within 48 hours. On the point that he and my hon. Friend the Member for Heywood and Middleton North raised about the growing prevalence of antisemitism and division online, I look forward to an imminent meeting with the Antisemitism Policy Trust to figure out how we can go further, not just in law but in raising awareness across our communities.
I turn to the contribution from the Liberal Democrat spokesperson, the hon. Member for Winchester (Dr Chambers). I have met the Liberal Democrat Front-Bench team to talk about their suggestions on functionalities and age ratings. I would of course be happy to continue the conversation, and I encourage them to contribute to the consultation.
Finally, the shadow Minister, the hon. Member for Runnymede and Weybridge, raised a very important point about chatbots. I hope it is very clear that chatbots ought never to replace professional support. We will continue to look at that, and I will update the House when we have decided on specific steps. We announced just yesterday that we are looking at the issues of labelling and personality rights, and I hope to update the House on them soon.
Ian Sollom
I thank all Members who have contributed to the debate. The hon. Member for Blaydon and Consett (Liz Twist) told us about the 135 deaths linked to one pro-suicide forum—135 people who are not with us. It is really stark and powerful to share that sort of statistic. My hon. Friend the Member for Bath (Wera Hobhouse) shared stories of the new frontiers in misogyny and abuse online.
The hon. Member for Gravesham (Dr Sullivan) highlighted the science, as I would expect from the chair of the Parliamentary Office of Science and Technology—though she is maybe not quite so hot on geography. The hon. Member for Heywood and Middleton North (Mrs Blundell) made some really powerful points on online discourse and how hate, Islamophobia and antisemitism proliferate.
I wish the hon. Member for Cowdenbeath and Kirkcaldy (Melanie Ward) luck with tackling antisocial behaviour. She highlighted the link between what is happening in the online space and real-world antisocial behaviour and how they reinforce each other; it is toxic. I thank the hon. Member for Reading Central (Matt Rodda) for sharing Olly Stephens’s story again. I pay tribute to Stuart and Amanda for the campaigning they do.
The hon. Member for Rochdale (Paul Waugh) talked about the manosphere and highlighted the connection to the real world, but in a more positive light, asking what we can do in the real world to make a difference to the online space; I really appreciate that. The hon. Member for Farnham and Bordon (Gregory Stafford) also made some important interventions.
I appreciate the Minister’s effort to respond directly to all Members. We need timely action after the consultation because, as we have heard today, these issues are not going away. Let us keep talking about them.