Online Safety Act 2023: Repeal

Peter Fortune Excerpts
Monday 15th December 2025

Westminster Hall

Lewis Atkinson

My right hon. Friend is absolutely right. I think that, as a society, we want forums such as those she reports to close down—they have been harmful. But I recognise that there were others that, perhaps pre-emptively, decided to shut down. Perhaps the Minister has further information on whether the reported closures were a one-off, pre-emptive event rather than an ongoing, repeated loss of online spaces.

As I have outlined, we are hearing a more nuanced position from owners and operators of bona fide community forums, who are concerned about how to ensure that they are meeting their obligations—in the same way that any person would meet obligations such as those under the Data Protection Act 2018, which has always applied. That position is far from a call for an outright repeal of the OSA; it asks instead how the obligations under the Act can be carried out in a proportionate manner.

Peter Fortune (Bromley and Biggin Hill) (Con)

I thank the hon. Member for introducing the debate—and, as somebody who shares a house with a Newcastle fan, I thank him for a miserable weekend. It is important that we get the safety aspects of the Online Safety Act correct, but does he agree that it should not be used as a blunt tool to stifle freedom of speech online?

Lewis Atkinson

I do, but I will come to some of the issues regarding social media platforms in what I am about to say. I certainly would not want to stifle the freedom of speech of Newcastle fans expressing their genuine heartfelt sorrow about yesterday’s events.

I turn now to wider concerns that have been expressed about the Online Safety Act, which, although they are not the motivations of the petition creator, are undoubtedly held by a number of people who signed the petition. The number of petition signatories notably increased in the immediate aftermath of the implementation of age verification requirements that have been applied to significant parts of the internet, from pornography to some elements of social media. Here, I am afraid I find it significantly harder to provide balance in my introduction to the debate, having read the report by the Children’s Commissioner that was published in advance of the implementation of the OSA, which stated:

“It is normal for children and young people to be exposed to online pornography”,

as 70% of children surveyed responded that they had seen pornography online. The report also found:

“Children are being exposed at very young ages…the average age a child first sees pornography online is 13…More than a quarter…of respondents had seen online pornography by the age of 11.”

--- Later in debate ---
Julia Lopez

I congratulate the hon. Member for Sunderland Central (Lewis Atkinson) on introducing the debate. He made a particularly excellent contribution to last week’s petition debate on mandatory digital identification; although his party’s leadership may not have thanked him, I am sure his constituents did. He is right that the internet allows unprecedented connection, which is for good, but also for ill. Our job is to balance that inherent tension, while recognising that sometimes there is no balance to be found and that we have to make a choice when it comes to children being served a toxic online diet of extreme content.

When we were in government, that choice was the Online Safety Act, about which thousands of petitioners have raised concerns, believing that its breadth and scope are having too restrictive an effect. I have some sympathy with those concerns, because the Act is large and very complex; although it is proving effective in protecting children in many ways, its implementation undoubtedly comes with challenges, whether that is VPN usage or the inadvertent capture of no-risk and low-risk sites by compliance duties.

Peter Fortune

Childnet has discovered that there has been increased downloading of VPNs by children over the last three months, as adolescents use them to circumvent age verification processes. I was interested to hear what the hon. Member for Sunderland Central (Lewis Atkinson) said—I presume he was referring to the Open Rights Group. I just had a quick look at the research, and although it says that it is not the youngest children who do that but children from the age of 13 upwards, these are vulnerable adolescents. Does my hon. Friend agree that, for the Online Safety Act to be successful, the use of VPNs has to be examined further?

Julia Lopez

I agree, and I am interested to hear what the Minister has to say about VPNs—whether they should be age-gated, whether we should look at app store controls so that parents have to consent to children downloading VPN apps, or whether there are other, more effective ways of doing that.

The sites that we have talked about may be smaller community forums or volunteer-run sites, but there are also mid-size tech companies that do not have the resources that the social media giants have to navigate the legal risks and intricacies of the Act without diverting precious capacity that might otherwise be used to innovate and expand their businesses. I have talked to some of those businesses. They may have one person who can do legal compliance; if they are looking at the next stage of the OSA’s implementation, that person may be pulled off other work that is helping to grow the economy. We have to take that very seriously and look into it. A lot of those sites are effectively no risk whatever, and the OSA is probably too burdensome for them. We do not want the Act to stifle our vital tech sector.

Concerns have been raised about freedom of speech and user privacy. I can understand those concerns in principle; particularly when age verification first came into place over the summer, there were instances where there was a practical impact and posts were restricted. However, it seems that those early examples have been recycled many times to suggest that the Act is having a wholesale dampening effect on what people feel confident saying online, and I am not sure that that is actually the case. Those concerns are often conflated with other issues, such as the policing of tweets, non-crime hate incidents, the application of legislation such as the Public Order Act 2023, and outrageous cases of state overreach, such as that of Graham Linehan. We must also be mindful of those who seek to exaggerate those concerns for the sake of big tech’s commercial interests. I note the comments that the hon. Member for Oldham West, Chadderton and Royton (Jim McMahon) made about the power that platforms hold.

On the other side of the ledger are worries that it remains relatively straightforward for under-18s to access pornography. There is no foolproof way to age-gate the internet. We have to see the Online Safety Act as a first step in a pushback against the wholly unacceptable situation that we witnessed previously. Of course some children seek out material and will continue to do so with determination, but far too many had previously been stumbling across explicit or illegal material by complete accident.

Studies that have already been cited today suggest that 41% of children first encountered porn on X, rather than seeking it out. If the Online Safety Act has a material impact on reducing that risk, it will have served its purpose. The hon. Member for Milton Keynes Central (Emily Darlington) provided some truly shocking statistics and rightly said that we are putting porn back on the top shelf. As she was saying that, I thought back to the advent of magazines such as Zoo and Nuts when I was a teenager. They were seen as having dreadfully explicit content that was far too readily available, but they seem quite quaint when we think about what children can access now.

Pornhub reported a 77% reduction in visitors after implementing age verification measures, but it is important to note that the traffic previously going to such sites has not simply disappeared. A chunk of it has shifted to smaller sites that may be riskier because, in some cases, they think they can financially benefit from not implementing age-gating, which is against the law. Those sites need to know that Ofcom fines are coming. No website should assume that it can sidestep its legal duties. The fines are designed to outweigh any short-term commercial advantage gained by ignoring the law.

I do not believe that the best way of dealing with these concerns is to repeal the Online Safety Act, and nobody in the Chamber has advocated for that, but it is for us to review it and to work out how to tighten up child safety while being honest about the aspects of the law that need to change. A very long and winding road led to the previous Conservative Government passing the Act in 2023, but it was one of the first markers on internet regulation to be put down globally that said that the status quo—children having easy access to illegal explicit content—simply could not continue. As the hon. Member for Morecambe and Lunesdale (Lizzi Collinge) suggested, we cannot have untrammelled freedom for adults in this space, because that status quo caused significant harm to children.

It is right that we now expect more from social media platforms, which were given an opportunity to self-regulate and were found wanting. The Act has driven them to make design changes to help parents, children and teenagers, including Meta’s teen accounts and some aspects of Roblox. Without those protections, more children would encounter harmful content, receive unsolicited contact from adults, and access material that encourages self-harm, eating disorders or even suicide. Families would also continue to face barriers when seeking answers from tech companies following tragedies. Pressure on those companies will continue for as long as we see technology’s pernicious effects on our children, including a chatbot’s recent encouragement of a teenager into suicide.

Despite the criticisms levelled at this law, it remains popular with parents. Parents must remain sovereign in how children are raised, and they must have the parenting confidence to deny phones and social media to their children, but no matter how involved or savvy parents are, they need help with these challenges—challenges that creep into the heart of people’s homes in a way that we have not seen before, and find their way to children through what other children may be sharing with them.

During the years that the Act was being drawn up, we made a lot of changes, including removing the provisions relating to legal but harmful content in 2022. That is good because, as we can see, Ofcom has an enormous job on its hands dealing with some of the biggest online problems, such as age-gating pornography. Had we taken an expansionist approach, we would now be facing far greater problems around free speech, and Ofcom would have an even heavier role—not to mention workload—as arbiter of the public square.

Hon. Members need to think about that carefully: the OSA became a Christmas tree on which everybody hung their niche, individual concerns, and it became unimplementable. If the Labour Government wish to go down a more restrictive route in some of these areas, they have to be mindful of that risk. They need something that can actually be implemented in law and they need to resource the regulator to implement it.

Long before the Act was brought in, the status quo was also having a negative impact on journalistic content. There is lots of discussion about freedom of speech, but I recall having discussions as a media Minister with traditional news content providers that were extremely frustrated by west coast content moderators arbitrarily taking down their content with no opportunity to appeal. With this legislation, we introduced must-carry provisions that would give proper news content providers a greater chance of their content being visible, which is important for journalistic credibility and to make sure that we have truth in online spaces.

As I have said, the Online Safety Act represents the start of a journey as countries grapple to find the right framework through which adults can retain their freedom on the internet while children are treated as children. This debate is particularly timely as a new social media ban was introduced in Australia last week. My party will be watching that closely, but concerns about social media and mobile phones go far beyond the ability to access porn or illegal content.

As a party, the Conservatives are concerned about the impact of social media and smartphone use on children’s mental health, education and social development. I suspect that any hon. Member who has recently been to a school in their constituency will have heard about the challenges to the learning environment, the challenges of the social interactions between children, and the challenges that parents are facing at home, which go way beyond the issue of illegal content. We have also heard other concerns. I liked the way the hon. Member for Worcester (Tom Collins) described it as “a veritable chemistry lab of…psychoactive substances”. He also made some interesting observations about safety and how, as a democracy, we must think very carefully about broader harms.

The Act will be statutorily reviewed next year. I would welcome the Minister telling us whether the Government are examining the measures that have been discussed today, including whether GDPR protections on the processing of children’s data might be raised from age 13 to 16. Age-gating also needs attention. Reports of VPN use suggest that children have been circumventing protections, and we must consider whether age-gating should be applied more comprehensively, including to VPN use, via app stores or at device level, to close those loopholes. Adults also need to be assured of the privacy-preserving nature of the age verification tools that they are using; there are concerns about the volume of sensitive data being collected.

As I have described, we also need to ensure that low-risk tech firms are not being disproportionately burdened. They did a huge amount of work before the July phase of Ofcom’s introduction of age verification, and they are worried that they will need to make a further step change for regulatory compliance, given the burden that will place on them.

Regulation has to remain proportionate, targeting high-risk sites and services without undermining innovation or our competitive position in tech. We also need to examine, as has been discussed many times today, emerging technologies such as generative AI and chatbots—as was suggested by the hon. Member for Dewsbury and Batley (Iqbal Mohamed) and the hon. Member for Worcester. The legal position under the OSA is not yet entirely clear, but children are increasingly exposed to AI-generated content, and we need to know whether the Act is flexible enough to deal with innovations of that kind.

As hon. Members have described today, the Online Safety Act is a necessary first step, but it is only the beginning of a much longer fight to protect childhood. We have to create the space for childhood and adolescence away from screens, with all the richness and stimulation that the real world can bring. Parents can never cease their vigilance. The internet cannot be made wholly safe, and we cannot be naive about that. Ultimately, parents have to remain the backstop to make sure that their children are safe online—but they need help. It is our role as legislators to provide some of the tools and assistance to help children through their childhoods.

I hope that we look back at the time before the Online Safety Act and wonder how we ever allowed children to be exposed to the unbridled internet culture that has hitherto been the norm. To return to the opening remarks of the hon. Member for Sunderland Central, the internet has led to the creation of new bonds and a huge multiplication of opportunity, but simultaneously the opportunity for harm. Sometimes we need to make a choice, and while the Online Safety Act is undoubtedly imperfect, the imperative to protect children will always take precedence. If social media platforms are held to greater account, so be it.