Online Safety Act 2023: Repeal

Lewis Atkinson Excerpts
Monday 15th December 2025


Westminster Hall

Lewis Atkinson (Sunderland Central) (Lab)

I beg to move,

That this House has considered e-petition 722903 relating to the Online Safety Act.

It is a pleasure to serve with you in the Chair, Mr Pritchard, and to open this important debate as a member of the Petitions Committee. I start by thanking the 550,138 people who signed the petition for their engagement with the democratic process, and in particular the petition creator, Alex Baynham, whom I had the pleasure of meeting as part of my preparations for this debate; he is in the Public Gallery today. My role as a member of the Petitions Committee is to introduce the petition and the key contours of the issues and considerations that it touches on, hopefully helping to ensure that we have a productive debate that enhances our understanding.

I believe that at the heart of any balanced discussion on this issue is a recognition of two simultaneous features of the online world’s development over the last 30 years. First, there has been the development of incredible opportunities for people to communicate and form bonds online, which go far beyond the previous limitations of geography and have allowed a huge multiplication of opportunities for such interaction—from marketplaces to gaming to dating. We should welcome that in a free society.

Secondly, the opportunities for harm, hate and illegality have also hugely increased, in a way that previous legislation and regulation was totally unequipped to deal with. That is what prompted the introduction of the Online Safety Act 2023. As the Minister at the time said:

“The Bill is world-leading, and the legislative framework established by it will lead to the creation of a profoundly safer online environment in this country. It will kickstart change where that is sorely needed, and ensure that our children are better protected against pornography and other content that is harmful to them.” —[Official Report, 12 September 2023; Vol. 737, c. 799.]

Although some aspects of the Online Safety Act have been more prominent than others since its introduction, it is important in this debate to recall that there are multiple parts of the Act, each of which could separately be subject to amendment or indeed repeal by Parliament. First, the Act introduced a framework placing obligations on in-scope services—for example, social media platforms—to implement systems and processes to reduce the risk of their services being used for illegal activity, including terrorism offences, child sexual exploitation and abuse, and drugs and weapons offences. Those duties have been implemented and enforced since March 2025. Secondly, the Act required services to implement systems and processes to protect under-18s from age-inappropriate content—both content that may be passed from user to user, and content that is published by the service itself, as on pornography sites.

We should recognise that the Online Safety Act implemented measures to regulate a wide range of diverse services: not only social media giants and commercial sites, but also online spaces run by charities, community and voluntary groups, and individuals. As the first substantive attempt at regulating safety online, the OSA has brought into scope many services that were never previously regulated.

Mr Baynham explained to me that those services lay behind his primary motivation in creating the petition. He was spurred by concerns about the impact of the Online Safety Act on online hobby and community forums of the type he uses. They are online spaces created by unpaid ordinary people in their spare time, focused on the discussion of particular shared interests—games, a film or TV series, or football teams. A number of the administrators of such forums have expressed concern that they now face liabilities and obligations under the Online Safety Act that they are not equipped to meet.

I must declare an interest at this stage. For more than a decade, I have regularly used the Ready To Go—RTG—Sunderland AFC fans’ messaging boards. They provide thousands of Mackems with online space to discuss the many ups and downs of our football club and associated issues facing the city, with current topics including club finances, “Match of the Day” tonight and, following a successful Wear-Tyne derby yesterday, “The Mag meltdown” thread.

I heard directly from the administrator of the RTG forum in preparation for this debate. He told me that he came close to shutting the site down when the Online Safety Act came into force and has still not ruled that out completely. He points out that there have been thousands of pages of guidance issued by Ofcom on the implementation of the Act, and that, while tech companies with large compliance teams have the capacity to process that volume of guidance, having volunteers do the same is a huge challenge.

Ofcom has stressed that it will implement the Act in a way that is risk-based and proportionate, and, in response, it has offered a digital toolkit targeted at small services. But even for the smaller sites, the rules seem to require, for example, a separate and documented complaints system beyond the usual reporting functionality that small forums have often had in place. The administration of that system has been described to me as time-consuming and liable to be weaponised by trolls.

Some forum hosts feel that the uncertainty regarding the liability they face under the Online Safety Act is too much. The reassurance offered that prosecution is “unlikely” has not given sufficient confidence to some who have been running community sites as volunteers. To some, the risk of liability, personal financial loss or simply getting it wrong has been too great; when the Act came into force, 300 small forums reportedly exited the online space or lost their status as independent forums and migrated to larger platforms such as Facebook.

Iqbal Mohamed (Dewsbury and Batley) (Ind)

The hon. Member is making an extremely passionate and informed speech. While the unintended consequences of the Online Safety Act on the small forums and specialist groups that he highlights are critical, does he agree that a balance needs to be struck, whereby under-age children are protected from harmful content on whatever forum or website they are exposed to?

--- Later in debate ---
Lewis Atkinson

I absolutely agree with the hon. Gentleman, and he will not be surprised that I will come on in my speech to deal with some wider issues about the Online Safety Act, in particular the protection of children. I think that today’s debate is likely to be more nuanced than simply whether we should maintain or repeal the Online Safety Act, and we will talk about the implementation and potential evolution of the Act over time.

The ask that I have heard from administrators of small forums is that Ofcom take further steps to simplify the record-keeping and risk-assessment burdens for small sites. When I met other organisations, such as the Open Rights Group, in preparation for this debate, they suggested that exemptions be made for small and low-risk sites.

It is clear that a size-only exemption would not be appropriate; unfortunately, there have been small platforms set up specifically to host harmful content, such as forums dedicated to idealising suicide or self-harm. But it is possible that some combination of size and risk could be considered together. These questions go to the heart of how we maintain the positives that come from vibrant and plural internet spaces while also clamping down on online harms.

Anneliese Dodds (Oxford East) (Lab/Co-op)

Like my hon. Friend, I want to pay tribute to site managers and moderators; I am sad indeed that Maggie Lewis, an incredible example of that function from my city of Oxford, has passed away. She was an incredible presence online for the community and did much other community and charity work.

I looked at some of the small websites that had apparently had issues because of the Act. I found one that was an internet forum known for its open discussion and encouragement of suicide and suicide methods. I found another community website that had allegedly shut down, but is still functioning and has a forum where local people can let others know what is happening in the community—just one element of it had had to close. Does my hon. Friend agree that it is important that, when looking at the regulatory burden, we argue on the basis of facts to make the right decision?

Lewis Atkinson

My right hon. Friend is absolutely right. I think that, as a society, we want forums such as the ones she reports to close down—they have been harmful. But I recognise that there were others that, maybe pre-emptively, decided to shut down. Perhaps the Minister has further information on how far the reported closures were a one-off, pre-emptive event rather than an ongoing, repeated loss of online spaces.

As I have outlined, we are hearing a more nuanced position from owners and operators of bona fide community forums, who are concerned about how to ensure that they are meeting their obligations—in the same way that any person would meet obligations such as those under the Data Protection Act 2018, which has always applied. That position is far from asking for an outright repeal of the OSA; rather, it asks how the obligations under the Act can be carried out in a proportionate manner.

Peter Fortune (Bromley and Biggin Hill) (Con)

I thank the hon. Member for introducing the debate—and, as somebody who shares a house with a Newcastle fan, I thank him for a miserable weekend. It is important that we get the safety elements and aspects of the Online Safety Act correct, but does he agree that it should not be used as a blunt tool to stifle freedom of speech online?

Lewis Atkinson

I do, but I will come to some of the issues regarding social media platforms in what I am about to say. I certainly would not want to stifle the freedom of speech of Newcastle fans expressing their genuine heartfelt sorrow about yesterday’s events.

I turn now to wider concerns that have been expressed about the Online Safety Act, which, although they are not the motivations of the petition creator, are undoubtedly held by a number of people who signed the petition. The number of petition signatories notably increased in the immediate aftermath of the implementation of age verification requirements, which have been applied to significant parts of the internet, from pornography sites to some elements of social media. Here, I am afraid, I find it significantly harder to provide balance in my introduction to the debate, having read the report by the Children’s Commissioner that was published in advance of the implementation of the OSA, which stated:

“It is normal for children and young people to be exposed to online pornography”,

as 70% of children surveyed responded that they had seen pornography online. The report also found:

“Children are being exposed at very young ages…the average age a child first sees pornography online is 13…More than a quarter…of respondents had seen online pornography by the age of 11.”

Lola McEvoy (Darlington) (Lab)

My hon. Friend is making a clear and coherent speech. I surveyed 1,000 young people in my constituency, and the forum leads of my online safety forum said that they had found graphic and disturbing content, which they had never searched for, regularly fed to them through the algorithms. Does my hon. Friend agree that that is robbing children of their childhood and that age verification needs to be stronger, not weaker, as a result of the 2023 Act?

Lewis Atkinson

I agree that there is significant work to be done to effectively implement the OSA. I will touch on that, and the Minister may wish to do so in his response.

Crucially, the report by the Children’s Commissioner found that children were most likely to see pornography by accident—a key point that some of the criticism of the Act fails to grasp. The horrifying statistics, showing the scale of online harm to children that the OSA is working to reduce, make it obvious why, in a recent survey, 69% of the public backed the introduction of age verification checks on platforms, and why children’s charities and children’s rights organisations overwhelmingly back the OSA and—to my hon. Friend’s point—want it implemented more rapidly and robustly.

I have heard that some petition signatories are particularly concerned about age verification on platforms, such as X, Reddit or Discord, beyond those specifically designed as pornography sites. However, the report by the Children’s Commissioner shows that eight out of 10 of the main sources where children saw pornography were not porn sites; they were social media or networking sites. Those platforms that choose to allow their users to upload pornographic content—some do not—should be subject to the same age-verification requirements as porn sites in order to keep our children safe.

Following the implementation of those provisions of the Online Safety Act, it was reported that UK traffic to the most popular pornographic websites was notably down. Yes, it was initially reported that there had been a spike in the number of virtual private networks, or VPNs, being downloaded to access those sites, but research increasingly suggests that that trend was driven by adults worried about their anonymity, rather than by children seeking to circumvent the age limitations.

The Online Safety Act addresses harms beyond those done by porn. Content that is especially harmful to children and that children should not have access to includes very violent content and content encouraging limited eating or suicide.

Amanda Hack (North West Leicestershire) (Lab)

Looking at those algorithms is a really important part of the Online Safety Act. When I was a county councillor looking at public health, I did a piece of work on disordered eating, and I was bombarded with content. I am not a vulnerable young person or a vulnerable adult, but my real fear is that that information is seen by people who are not as capable of managing that content. Does my hon. Friend agree that algorithm work is a key part of the Online Safety Act?

Lewis Atkinson

My hon. Friend is right. The proactive duty that the Act places on providers in relation to the nature of their algorithms and their content is crucial because of the type of content to which she refers. It is right that the largest providers, and those most frequently used by kids, have to take active responsibility for keeping children safe. The implementation of the OSA means that algorithms serving harmful content to kids are now being regulated for the first time. There is a long way to go, and I am sure that other Members will say more than I can in this introduction, but I want to be clear to my constituents that I support the action that the OSA is prompting to improve children’s safety and welfare online.

Various surveys set out the impact of the Online Safety Act; Ofcom is publishing its research, and a formal Government review will follow in due course. However, most impactful for me was seeing a teenage boy say in a recent news piece that, now,

“when I’m scrolling TikTok, I’m free from violence.”

That changed for him in the months following the implementation of the Online Safety Act, so it is no wonder that organisations such as the Online Safety Act Network, which I spoke to in preparation for this debate, fully support the Act’s principles. The network points to early evidence that the Act is actively reducing harm to children, and it emphasised that Ofcom must move beyond content filters to ensure safety by design, which would, for example, include addressing features that incentivise pile-ons, in which an individual is targeted with abuse and harassment.

New Ofcom research shows that 58% of parents now believe that measures in the code of practice are beginning to improve the safety of children online. My belief is that we should be considering not whether to repeal the Act, but how we can continue to enforce it in a robust, effective and proportionate manner.

The way in which the Online Safety Act addresses online hate has perhaps not had as much focus as it might have. As well as being a member of the Petitions Committee, I am privileged to be a member of the Home Affairs Committee, which is conducting an inquiry into combating new forms of extremism. It is very clear from the public evidence that we have received so far that, left unregulated and unchallenged, online spaces and services can be used to amplify hate, thus risking a rise in extremist action, including violence.

Analysis by the Antisemitism Policy Trust highlights patterns of co-ordinated and persistent misogynistic, anti-immigrant, anti-Government and antisemitic discourse on social media, with bot accounts repeatedly used to amplify misleading or harmful narratives that fuel hate and may increase the risk of violence. Such content often breaches platforms’ own terms of service, but I understand that, under the Online Safety Act, category 1 services will now be mandated by Ofcom to proactively offer users optional tools to help them reduce the likelihood of encountering legal but harmful content such as that.

There is much to be done to implement those provisions in an appropriate manner. However, I invite anyone calling for full repeal of the Act to consider how we as a society would deal with the rise of extremism in a context where the internet can be used as a sort of free-for-all, fuelled by hate-filled algorithms that thrive on and incentivise division and hatred rather than consensus and civic peace.

I am aware that there are large parts of the Online Safety Act that I have not been able to touch on today; I hope that others will do so during the debate. There are questions about end-to-end encryption, cyber-flashing, the creation of abusive deepfakes, AI moderation and chatbots.

Manuela Perteghella (Stratford-on-Avon) (LD)

The hon. Member is making a strong and thoughtful case. Does he agree that although the Act regulates user-to-user services, it leaves a significant gap around generative AI chatbots, despite the growing evidence of harm caused to children from private interaction with them? And does he share my concern that the speed at which this technology is developing risks outpacing the legislative framework that we have in place?

Lewis Atkinson

I agree with the hon. Lady. In my understanding, when the legislation was drafted, it was not initially clear to those who drafted it that AI would develop at the astonishing pace that it has in recent years. I ask the Minister to reflect on that point in addressing the implementation of the Act and its potential future evolution through primary legislation.

Lola McEvoy

I thank my hon. Friend for giving way and for being so generous with his time. Can we also pass on to the Minister that, going forward, there is the possibility of branding bots? That would require the Online Safety Act to be amended to make sure that any profile that is a bot—generated by AI—is explicitly marked as such, which would protect users as AI advances.

--- Later in debate ---
Lewis Atkinson

My hon. Friend makes that point well, and the Minister will have heard it.

As this discussion continues, I hope that we can find a way of reflecting these two areas of balance—these two features of the online world now. First, there is the absolute primacy of safeguarding children and tackling serious online harms; but it is also important to recognise the real benefits that living in an increasingly connected society brings us all. I think those are very much the motivations of the petition’s creator—we are talking about the work done by good, civic-minded folk: the creators and administrators of online communities and hobby forums across the country. As our learning about the implementation of the Act continues, there must be a way of supporting the efforts of those people without risking such sites being used to further online harms.

The consensus, I think it is fair to say, is that reform of the Act, rather than repeal, is the realistic route forward. That is natural with such groundbreaking legislation, but reform must be sensitive to scale, proportionality and privacy, as well as to the emerging and changing nature of online harms. I thank Members for their time and their interventions, and I look forward to a positive debate.

Mark Pritchard (in the Chair)

I remind colleagues that if they wish to speak, they should bob—quite a few colleagues are bobbing already, so thank you for that.

--- Later in debate ---
Lewis Atkinson

It is a pleasure to see you in the Chair for the conclusion of this debate, Sir John. I thank all Members for their contributions. I think that we had a really constructive and thorough debate, and I certainly learned a lot in the course of it. I only wish that I had heard some of the contributions before I wrote my opening speech. I particularly thank the Minister for being so generous with his time in giving a thorough response and taking interventions, which I think gave us significant insight.

The contributions from the Liberal Democrat spokesperson, the hon. Member for Harpenden and Berkhamsted (Victoria Collins), were thoughtfully made, and the shadow Minister, the hon. Member for Hornchurch and Upminster (Julia Lopez), clearly brought to the debate her expertise on this subject. I think that the points about the wider impacts of the new online world on mental health and wellbeing will be particularly relevant as the Government think more widely about their approach to mental health in their strategy, which I hope will be forthcoming. I can say, as the Minister did, that my only regret is that the few Members of the House who have publicly called for outright repeal of the Online Safety Act were not here this afternoon to give their perspective and to engage with the thoroughness with which I think all Members present have engaged.

I have a final couple of reflections. I think that everyone I have heard in this space and in this debate has been motivated by a desire to preserve rights and our British values, whether the right to freedom of speech, freedom of expression or freedom of association, including through small online spaces and forums. But as my hon. Friend the Member for Worcester (Tom Collins) rightly said in his excellent contribution, safe spaces online open up space for further freedoms, and the freedoms of children cannot be infringed by freedoms for adults. It was really shocking to hear the extent of the harms that children are suffering in the current environment. I think that the motivations behind this petition were not about that at all; the petition was very much about making sure that the freedoms of association that we hold dear in this country can be continued online through small forums. I welcome the Minister’s assurance that the Government see ample space for small and independent providers in the future as part of that.

A reflection that I had in the course of the debate is that we increasingly talk about safety by design, but a lot of online forums came about in a world in which there was no safety by design. Part of the implementation issue is that the message-board technology used by many online forums is probably 10 or 15 years old. If a new message board were being set up today, using new tech standards, I would hope that safety by design would be much more embedded, and that the responsibilities falling to individual volunteers and administrators would be lessened as a result.

It is entirely natural that the first attempt at regulation and legislation will not get everything right and that it will require evolution. The Online Safety Act was a landmark attempt to regulate online harms, but I think it is fair to say that the consensus we have heard today is that it needs to evolve—that we should be looking not to repeal the Act, but to evolve it and ensure implementation at pace, so that we tackle online harms in a way that is consistent with our British values and the freedoms of expression and association that we have heard about.

It only remains for me to thank again Mr Baynham, the creator of the petition, and all the petitioners for their online engagement—I have to say—in the petition process, without which today’s really informative debate would not have occurred.

Question put and agreed to.

Resolved,

That this House has considered e-petition 722903 relating to the Online Safety Act.