Online Safety Act 2023: Repeal

Iqbal Mohamed Excerpts
Monday 15th December 2025

Westminster Hall

Lewis Atkinson (Sunderland Central) (Lab)

I beg to move,

That this House has considered e-petition 722903 relating to the Online Safety Act.

It is a pleasure to serve with you in the Chair, Mr Pritchard, and to open this important debate as a member of the Petitions Committee. I start by thanking the 550,138 people who signed the petition for their engagement with the democratic process, and in particular the petition creator, Alex Baynham, whom I had the pleasure of meeting as part of my preparations for this debate; he is in the Public Gallery today. My role is to introduce the petition and the key contours of the issues and considerations it touches on, in the hope of ensuring a productive debate that enhances our understanding.

I believe that at the heart of any balanced discussion on this issue is a recognition of two simultaneous features of the online world’s development over the last 30 years. First, there has been the development of incredible opportunities for people to communicate and form bonds online, going far beyond the previous limitations of geography and hugely multiplying the opportunities for such interactions, from marketplaces to gaming to dating. We should welcome that in a free society.

Secondly, the opportunities for harm, hate and illegality have also hugely increased, in a way that previous legislation and regulation were totally unequipped to deal with. That is what prompted the introduction of the Online Safety Act 2023. As the Minister at the time said:

“The Bill is world-leading, and the legislative framework established by it will lead to the creation of a profoundly safer online environment in this country. It will kickstart change where that is sorely needed, and ensure that our children are better protected against pornography and other content that is harmful to them.” —[Official Report, 12 September 2023; Vol. 737, c. 799.]

Although some aspects of the Online Safety Act have been more prominent than others since its introduction, it is important in this debate to recall that there are multiple parts of the Act, each of which could separately be subject to amendment or indeed repeal by Parliament. First, the Act introduced a framework placing obligations on in-scope services, such as social media platforms, to implement systems and processes to reduce the risk of their services being used for illegal activity, including terrorism offences, child sexual exploitation and abuse, and drugs and weapons offences; those duties have been implemented and enforced since March 2025. Secondly, the Act required services to implement systems and processes to protect under-18s from age-inappropriate content, both content that is passed from user to user and content that is published by the service itself, as on pornography sites.

We should recognise that the Online Safety Act implemented measures to regulate a diverse range of services: social media giants and commercial sites, but also online spaces run by charities, community and voluntary groups, and individuals. As the first substantive attempt at regulating safety online, the Act has brought into regulation many services that had not previously been regulated.

Mr Baynham explained to me that those services lay behind his primary motivation in creating the petition. He was spurred by concerns about the impact of the Online Safety Act on online hobby and community forums of the type he uses. They are online spaces created by ordinary people, unpaid and in their spare time, focused on the discussion of particular shared interests: games, a film or TV series, or football teams. A number of the administrators of such forums have expressed concern that they now face liabilities and obligations under the Online Safety Act that they are not equipped to meet.

I must declare an interest at this stage. For more than a decade, I have regularly used the Ready To Go—RTG—Sunderland AFC fans’ messaging boards. They provide thousands of Mackems with online space to discuss the many ups and downs of our football club and associated issues facing the city, with current topics including club finances, “Match of the Day” tonight and, following a successful Wear-Tyne derby yesterday, “The Mag meltdown” thread.

I heard directly from the administrator of the RTG forum in preparation for this debate. He told me that he came close to shutting the site down when the Online Safety Act came into force and has still not ruled that out completely. He points out that there have been thousands of pages of guidance issued by Ofcom on the implementation of the Act, and that, while tech companies with large compliance teams have the capacity to process that volume of guidance, having volunteers do the same is a huge challenge.

Ofcom has stressed that it will implement the Act in a way that is risk-based and proportionate, and in response has offered a digital toolkit targeted at small services. But even for smaller sites, the rules seem to require, for example, a separate and documented complaints system beyond the usual reporting functionality that small forums have often had in place. The administration of that system has been described to me as time-consuming and liable to being weaponised by trolls.

Some forum hosts feel that the uncertainty regarding the liability they face under the Online Safety Act is too much. The reassurance offered that prosecution is “unlikely” has not given sufficient confidence to some who have been running community sites as volunteers. To some, the risk of liability, personal financial loss or simply getting it wrong has been too great; when the Act came into force, 300 small forums reportedly shut down or gave up their status as independent forums and migrated to larger platforms such as Facebook.

Iqbal Mohamed (Dewsbury and Batley) (Ind)

The hon. Member is making an extremely passionate and informed speech. While the unintended consequences of the Online Safety Act on the small forums and specialist groups that he highlights are critical, does he agree that a balance needs to be struck, whereby under-age children are protected from harmful content on whatever forum or website they are exposed to?

--- Later in debate ---
Iqbal Mohamed (Dewsbury and Batley) (Ind)

It is a pleasure to speak with you in the Chair, Mr Pritchard. I thank the hon. Member for Sunderland Central (Lewis Atkinson) for his powerful and eloquent introduction to this important debate. The scale of this petition should make us reflect: over half a million people have called for the repeal of the Online Safety Act, not because online safety is unpopular, but because they believe that the legislation does not yet strike the right balance.

Let me be clear: the Online Safety Act exists for a reason. I stand in strong support of its intent, aims and objectives, and I am not in favour of its repeal. For too long, online platforms have failed to protect users, particularly children, from serious harm. The statistics are sobering: nearly one in five children aged 10 to 15 have exchanged messages with someone they have never met; over 9,000 reported child sexual abuse offences in 2022-23 involved an online element; and, in recent years, we have seen tragic cases where exposure to harmful online content has contributed to devastating outcomes. Repealing the Act would leave us with very little meaningful protection, so it remains central to regulating online spaces in the UK. We must accept that necessary truth, although it is a hard pill to swallow.

Supporting the Act, however, does not mean ignoring the parts that need important improvements. One of the most significant concerns is age restriction. Age-gating can and should play a role in protecting children from genuinely harmful content, but it is increasingly clear that the boundaries of age restriction are not well defined. There is growing evidence that lawful political content, including news and commentary on conflicts such as Gaza, Ukraine and Sudan, is being placed behind age gates.

Teenagers aged 16 and 17 are finding themselves blocked from accessing political information and current affairs, sometimes more strictly than they are from film and television content regulated by the British Board of Film Classification. That should give us pause, particularly when the House is considering extending the vote to 16-year-olds. If we believe that young people should be active participants in our democracy, we cannot at the same time allow systems that restrict their access to political debate by default, just because these are difficult and sensitive topics. What is or is not age-restricted needs to be far clearer, more consistent and more proportionate.

The second area where clarity is urgently needed is generative AI. As we hold this debate, the Home Secretary is making a statement on violence against women and girls, which she has rightly described as a “national emergency”. The Government’s five-year national strategy acknowledges the growing threat posed by intimate deepfakes, with one survey by the National Society for the Prevention of Cruelty to Children showing that three in five people fear becoming a victim of such abuse. With current laws proving too difficult to apply in complex and rapidly evolving cases, what specific legislative proposals are the Government hoping to develop to address deepfake abuse?

When this legislation was drafted and passed, the pace of AI development was very different. Today, AI tools and chatbots are embedded across social media, search engines and messaging platforms; people rely on ChatGPT, Gemini and Copilot as search engines and as virtual assistants built into almost every online service we use. These tools can generate harmful and misleading content within seconds, including advice relating to self-harm, eating disorders, substance misuse and suicide.

Only last week, I led a debate in Westminster Hall on the need for stronger AI regulation. That debate reinforced a growing concern that many AI-driven services currently sit at the edges of the Online Safety Act. Although Ofcom has acknowledged that gap and issued guidance, guidance alone is not enough. We need clarity on how generative AI is regulated and whether further legislative action is required to keep pace with the technology.

The message of this petition is not a rejection of online safety; it is a call for a system that protects children while safeguarding freedom of expression, political engagement and public trust. The challenge before us is not to repeal but to refine: strengthening definitions, clarifying age restrictions and ensuring that the Online Safety Act evolves alongside emerging technologies. If we get that right, we can protect users online without undermining the democratic values we seek to defend.