Draft Online Safety Super-Complaints (Eligibility and Procedural Matters) Regulations 2025

Monday 7 July 2025

General Committees
The Committee consisted of the following Members:
Chair: Sir John Hayes
† Clark, Feryal (Parliamentary Under-Secretary of State for Science, Innovation and Technology)
† Collins, Victoria (Harpenden and Berkhamsted) (LD)
† Craft, Jen (Thurrock) (Lab)
† Dewhirst, Charlie (Bridlington and The Wolds) (Con)
† Edwards, Sarah (Tamworth) (Lab)
† Fleet, Natalie (Bolsover) (Lab)
† Fortune, Peter (Bromley and Biggin Hill) (Con)
† Gordon, Tom (Harrogate and Knaresborough) (LD)
† Holmes, Paul (Hamble Valley) (Con)
† Joseph, Sojan (Ashford) (Lab)
† Macdonald, Alice (Norwich North) (Lab/Co-op)
† McEvoy, Lola (Darlington) (Lab)
† Patrick, Matthew (Wirral West) (Lab)
† Poynton, Gregor (Livingston) (Lab)
† Timothy, Nick (West Suffolk) (Con)
† Tufnell, Henry (Mid and South Pembrokeshire) (Lab)
† Turley, Anna (Lord Commissioner of His Majesty's Treasury)
Aaron Kulakiewicz, Committee Clerk
† attended the Committee
First Delegated Legislation Committee
Monday 7 July 2025
[Sir John Hayes in the Chair]
Draft Online Safety Super-Complaints (Eligibility and Procedural Matters) Regulations 2025
18:00
The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Feryal Clark)

I beg to move,

That the Committee has considered the draft Online Safety Super-Complaints (Eligibility and Procedural Matters) Regulations 2025.

It is a pleasure to serve under your chairmanship, Sir John. The regulations, which are necessary for the super-complaints regime to take effect, were laid before Parliament on 9 June. They will enable the super-complaints regime by establishing the eligibility criteria that entities must meet to submit a super-complaint, and by setting out procedural matters relating to the assessment of super-complaints.

Super-complaints are an integral part of the complaint handling, reporting and redress mechanisms in the Online Safety Act 2023. They enable eligible entities with expertise in online safety matters, such as civil society groups, to raise systemic issues with Ofcom, which is the Act’s independent regulator. Section 169 of the Act establishes the scope of issues that super-complaints may address, including where the features or conduct of regulated services may be causing significant harm, adversely affecting freedom of expression, or otherwise adversely impacting users, members of the public, or particular groups. We expect super-complaints typically to deal with cross-platform systemic issues, but a complaint may cover a single service if it is particularly important or impacts a large number of users or members of the public.

This statutory instrument sets out several eligibility criteria that an entity must meet to submit a complaint to Ofcom. For example, entities must represent the interests of users of regulated services, members of the public, or a particular group of users or members of the public. They must remain independent from regulated services in terms of funding, but representation from the services in entities’ governance is allowed, provided that appropriate mechanisms are in place to maintain independence. They must contribute as experts to public discussions of online safety matters, and be capable of being relied on to have due regard to any guidance published by Ofcom. These criteria aim to ensure that a wide range of entities are eligible while safeguarding the integrity of the process and reducing the risk of vexatious complaints.

In addition to the eligibility criteria, the SI sets out the process and timeline for the assessment of super-complaints. First, Ofcom must determine, within 30 days, whether an entity is eligible; it must then inform the entity whether it is eligible and explain why. The time for assessing eligibility decreases to 15 days for entities that have been found to be eligible within the past five years. In those circumstances, entities must submit information to show that they are still experts that contribute significantly to the public discussion of online safety.

The draft regulations also state that eligible entities must present current, objective and relevant evidence to support their view that one of the grounds in section 169 of the Act is met. When assessing the admissibility of the complaint, as well as its substance, Ofcom must typically respond within 90 days of the eligibility determination. That means that, as standard, the entire super-complaint process will conclude within 120 days, or 105 days where there is retained eligibility status. Ofcom may stop the clock in certain circumstances, such as if additional information is required and the complaint cannot be progressed without it, but it may stop the clock only for the amount of time that it takes to receive the requested information.

Where Ofcom has determined that an entity is eligible, it must consider the complaint and evaluate the evidence presented to it. In addition, Ofcom can request further input from the complainant or third parties, as required. At the end of the process, it must publish a response including its determination on the matter, which may include what further action, if any, is anticipated.

In developing these regulations, the Government have consulted Ofcom and conducted a public consultation. We have listened closely to the views of the stakeholders and, where possible, made changes to the policy, which was consulted on under the previous Government. The changes are set out in further detail in the Government’s policy response, which was published in June. Alongside the consultation, the Government held roundtables with civil society organisations, and their views on the policy have been taken into consideration. The changes include lowering the bar for eligibility to enable new expert organisations to make complaints, and removing the requirement to notify Ofcom ahead of submitting a complaint.

The online world is complicated and dynamic, with new harms emerging every day. These regulations have been drafted to ensure that the Act remains agile in addressing emerging technologies and market operators, and the harms that flow from them. They mark an important step towards a fully realised online safety regime.

18:07
Paul Holmes (Hamble Valley) (Con)

It is a pleasure to see you in the Chair, Sir John. I thank the Minister for that thorough explanation of the draft regulations. As she outlined, they establish the eligibility criteria and procedural framework for the super-complaints mechanism under the Online Safety Act 2023—a long-awaited and essential step forward in the effort to protect users, particularly children, in the digital space.

Protecting children was one of the top priorities of the last Government, which is why we introduced and passed the Act. Although we welcome the implementation of this mechanism, we must acknowledge that there is still a risk that we will fail our children. As Ian Russell has powerfully and heartbreakingly stated, the Government are “going backwards”. The super-complaints process that we are discussing is not about minor grievances; it is about systemic failures across services, or, in exceptional circumstances, failures within individual services, that put users, especially the most vulnerable, at real risk. These are serious matters, and they deserve to be handled with the utmost care, scrutiny and understanding.

Progress is being made. We welcome the clarity on the eligibility criteria, which require that organisations bringing forward complaints must be independent from the services regulated under the Act. That is a sensible and important safeguard. Similarly, the decision to allow a shorter application process for organisations already deemed eligible is a pragmatic move that will avoid unnecessary bureaucracy. None the less, we must also ask hard questions.

Ofcom is already under considerable pressure to deliver its duties under the Online Safety Act, so it is right to ask whether it will be given the support it needs to manage the new mechanism effectively. Expanding the scope of complainants, as the Government appear to be doing, risks compounding the burden without necessarily improving the quality of complaints. That is a key point of divergence from the previous Government’s approach. We believed, and continue to believe, that the super-complaints process should focus on quality, not quantity. We must prioritise high-quality, evidence-based complaints from trusted organisations. Can the Minister assure the Committee that the £72.6 million allocated to online safety in 2025-26 is not just headline money, but is actually proportionate and sufficient to deal with the likely increase in the volume of super-complaints?

The framework around super-complaints matters. It is not just about the process; it is a test of whether we are serious about holding online services to account for systemic harms. Let us not fall into the trap of creating a mechanism that looks good on paper but fails in practice due to a lack of focused resources or political will. We owe it to families who have suffered, as the Russell family has, to get this right, and we owe it to the children whom we promised to protect. I look forward to the Minister’s answers to those questions.

18:10
Feryal Clark

I thank the shadow Minister for his remarks. This instrument will bring us one step closer to a fully implemented online safety regime, with a functioning super-complaints mechanism that allows relevant organisations to help the regulator to stay abreast of new and emerging harms. As well as the civil society organisations I mentioned, the Molly Rose Foundation was consulted during the creation of the super-complaints regime under the previous Government. The changes that have been made will reduce the burden on Ofcom and improve its ability to manage the complaints. They will also make the process easier by providing a smoother path for the relevant eligible organisations to bring forward complaints.

The shadow Minister asked about the resources and expertise of the regulator. The Government have ensured that Ofcom has the funding it needs to deliver online safety regulation effectively, with £72.6 million allocated for online safety spending in 2025-26, which is an increase on previous years. That decision followed a business case process that included Ofcom submitting its requirements, including on delivering the super-complaints function.

The shadow Minister will be happy to hear that later this month the child safety duties will be in force. We expect that children will see a positive change to their online experience. Services likely to be accessed by children will be required to take measures to prevent children from seeing pornography and content that promotes, encourages or provides instructions for suicide, self-harm and eating disorders. The services will also have to protect children from other types of harmful content, including violent, abusive and bullying content. Under Ofcom’s finalised child safety codes, services will need to take certain steps, including introducing robust age checks, such as photo ID matching or facial age estimation, filtering out harmful content from algorithms, and ensuring that reporting mechanisms are accessible and easier to use.

As I said earlier, this is a dynamic, fast-moving and developing world. For that reason, we will continue to keep the harms under review and constantly evaluate evidence of new and emerging harms. As the Secretary of State has said, we will not shy away from introducing further legislation to ensure that our children are safe online. In conclusion, I hope that all Members agree on the importance of these regulations, and I commend them to the Committee.

Question put and agreed to.

18:14
Committee rose.