Online Safety Act 2023: Repeal

Debate between Iqbal Mohamed and Lewis Atkinson
Monday 15th December 2025


Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Lewis Atkinson (Sunderland Central) (Lab)

I beg to move,

That this House has considered e-petition 722903 relating to the Online Safety Act.

It is a pleasure to serve with you in the Chair, Mr Pritchard, and to open this important debate as a member of the Petitions Committee. I start by thanking the 550,138 people who signed the petition for their engagement with the democratic process, and in particular the petition creator, Alex Baynham, whom I had the pleasure of meeting as part of my preparations for this debate; he is in the Public Gallery today. My role as a member of the Petitions Committee is to introduce the petition and key contours of the issues and considerations that it touches on, hopefully to help ensure that we have a productive debate that enhances our understanding.

I believe that at the heart of any balanced discussion on this issue is a recognition of two simultaneous features of the online world’s development over the last 30 years. First, there has been the development of incredible opportunities for people to communicate and form bonds together online, which go far beyond the previous limitations of geography and have allowed a huge multiplication of opportunities for such interactions—from marketplaces to gaming to dating. We should welcome that in a free society.

Secondly, the opportunities for harm, hate and illegality have also hugely increased, in a way that previous legislation and regulation was totally unequipped to deal with. That is what prompted the introduction of the Online Safety Act 2023. As the Minister at the time said:

“The Bill is world-leading, and the legislative framework established by it will lead to the creation of a profoundly safer online environment in this country. It will kickstart change where that is sorely needed, and ensure that our children are better protected against pornography and other content that is harmful to them.” —[Official Report, 12 September 2023; Vol. 737, c. 799.]

Although some aspects of the Online Safety Act have been more prominent than others since its introduction, it is important in this debate to recall that there are multiple parts of the Act, each of which could separately be subject to amendment or indeed repeal by Parliament. First, the Act introduced a framework placing obligations on in-scope services—for example, social media platforms—to implement systems and processes to reduce the risk of their services being used for illegal activity, including terrorism offences, child sexual exploitation and abuse, and drugs and weapons offences. Those duties have been implemented and enforced since March 2025. Secondly, the Act required services to implement systems and processes to protect under-18s from age-inappropriate content—both content that may be passed from user to user, and content that is published by the service itself, such as on pornography sites.

We should recognise that the Online Safety Act implemented measures to regulate a diverse range of services: not only social media giants and commercial sites, but also online spaces run by charities, community and voluntary groups, and individuals. As the first substantive attempt to regulate safety online, the OSA has brought within scope many services that were not previously regulated.

Mr Baynham explained to me that those services lay behind his primary motivation in creating the petition. He was spurred by concerns about the impact of the Online Safety Act on online hobby and community forums of the type he uses. They are online spaces created by unpaid ordinary people in their spare time, focused on the discussion of particular shared interests—games, a film or TV series, or football teams. A number of the administrators of such forums have expressed concern that they now face liabilities and obligations under the Online Safety Act that they are not equipped to meet.

I must declare an interest at this stage. For more than a decade, I have regularly used the Ready To Go—RTG—Sunderland AFC fans’ message boards. They provide thousands of Mackems with online space to discuss the many ups and downs of our football club and associated issues facing the city, with current topics including club finances, “Match of the Day” tonight and, following a successful Wear-Tyne derby yesterday, “The Mag meltdown” thread.

I heard directly from the administrator of the RTG forum in preparation for this debate. He told me that he came close to shutting the site down when the Online Safety Act came into force and has still not ruled that out completely. He points out that there have been thousands of pages of guidance issued by Ofcom on the implementation of the Act, and that, while tech companies with large compliance teams have the capacity to process that volume of guidance, having volunteers do the same is a huge challenge.

Ofcom has stressed that it will implement the Act in a way that is risk-based and proportionate, and has offered in response a digital toolkit targeted at small services. But even for smaller sites, the rules seem to require, for example, a separate and documented complaints system beyond the usual reporting functionality that small forums have often had in place. The administration of that system has been described to me as time-consuming and liable to being weaponised by trolls.

Some forum hosts feel that the uncertainty regarding the liability they face under the Online Safety Act is too much. The reassurance offered that prosecution is “unlikely” has not given sufficient confidence to some who have been running community sites as volunteers. To some, the risk of liability, personal financial loss or simply getting it wrong has been too great; when the Act came into force, 300 small forums reportedly exited the online space or lost their status as independent forums and migrated to larger platforms such as Facebook.

Iqbal Mohamed (Dewsbury and Batley) (Ind)

The hon. Member is making an extremely passionate and informed speech. While the unintended consequences of the Online Safety Act on the small forums and specialist groups that he highlights are critical, does he agree that a balance needs to be struck, whereby under-age children are protected from harmful content on whatever forum or website they are exposed to?