Draft Online Safety Super-Complaints (Eligibility and Procedural Matters) Regulations 2025 Debate

Department: Department for Science, Innovation & Technology


Monday 7th July 2025


General Committees
The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Feryal Clark)

I beg to move,

That the Committee has considered the draft Online Safety Super-Complaints (Eligibility and Procedural Matters) Regulations 2025.

It is a pleasure to serve under your chairmanship, Sir John. The regulations, which are necessary for the super-complaints regime to take effect, were laid before Parliament on 9 June. They will enable the super-complaints regime by establishing the eligibility criteria that entities must meet to submit a super-complaint, and by setting out procedural matters relating to the assessment of super-complaints.

Super-complaints are an integral part of the complaint handling, reporting and redress mechanisms in the Online Safety Act 2023. They enable eligible entities with expertise in online safety matters, such as civil society groups, to raise systemic issues with Ofcom, which is the Act’s independent regulator. Section 169 of the Act establishes the scope of issues that super-complaints may address, including where the features and/or conduct of regulated services may be causing significant harm, adversely affecting freedom of expression, or otherwise adversely impacting users, members of the public, or particular groups. We expect super-complaints typically to deal with cross-platform systemic issues, but a complaint may cover a single service if the complaint is particularly important, or it impacts a large number of users or members of the public.

This statutory instrument sets out several eligibility criteria that an entity must meet to submit a complaint to Ofcom. For example, entities must represent the interests of users of regulated services, members of the public, or a particular group of users or members of the public. They must remain independent from regulated services in terms of funding; representation from the services in an entity’s governance is allowed, provided that appropriate mechanisms are in place to maintain independence. They must contribute as experts to public discussions of online safety matters, and be capable of being relied on to have due regard to any guidance published by Ofcom. These criteria aim to ensure that a wide range of entities are eligible, while safeguarding the integrity of the process and reducing the risk of vexatious complaints.

In addition to the eligibility criteria, the SI sets out the process and timeline for the assessment of super-complaints. First, Ofcom must determine whether an entity is eligible within 30 days; it must then inform the entity whether it is eligible and explain why. The time for assessing eligibility decreases to 15 days for entities that have been found to be eligible within the past five years. In those circumstances, entities must submit information to show that they are still experts that contribute significantly to the public discussion of online safety.

The draft regulations also state that eligible entities must present current, objective and relevant evidence to support their view that one of the grounds in section 169 of the Act is met. When assessing the admissibility of the complaint, as well as its substance, Ofcom must typically respond within 90 days of the eligibility determination. That means that, as standard, the entire super-complaint process will conclude within 120 days, or 105 days where the entity has retained eligibility status. Ofcom may stop the clock in certain circumstances, such as where additional information is required and the complaint cannot be progressed without it, but only for the time it takes to receive the requested information.

Where Ofcom has determined that an entity is eligible, it must consider the complaint and evaluate the evidence presented to it. In addition, Ofcom can request further input from the complainant or third parties, as required. At the end of the process, it must publish a response including its determination on the matter, which may include what further action, if any, is anticipated.

In developing these regulations, the Government have consulted Ofcom and conducted a public consultation. We have listened closely to the views of the stakeholders and, where possible, made changes to the policy, which was consulted on under the previous Government. The changes are set out in further detail in the Government’s policy response, which was published in June. Alongside the consultation, the Government held roundtables with civil society organisations, and their views on the policy have been taken into consideration. The changes include lowering the bar for eligibility to enable new expert organisations to make complaints, and removing the requirement to notify Ofcom ahead of submitting a complaint.

The online world is complicated and dynamic, with new harms emerging every day. These regulations have been drafted to ensure that the Act remains agile in addressing emerging technologies, market operators and consequent harms. They mark an important step towards a fully realised online safety regime.

--- Later in debate ---
Feryal Clark

I thank the shadow Minister for his remarks. This instrument will bring us one step closer to a fully implemented online safety regime, with a functioning super-complaints mechanism that allows relevant organisations to help the regulator to stay abreast of new and emerging harms. As well as the civil society organisations I mentioned, the Molly Rose Foundation was consulted during the creation of the super-complaints regime under the previous Government. The changes that have been made will reduce the burden on Ofcom and improve its ability to manage the complaints. They will also make the process easier by providing a smoother path for the relevant eligible organisations to bring forward complaints.

The shadow Minister asked about the resources and expertise of the regulator. The Government have ensured that Ofcom has the funding it needs to deliver the online safety regulation effectively, with £72.6 million allocated for online safety spending in 2025-26, an increase on previous years. That decision followed a business case process in which Ofcom submitted its requirements, including on delivering the super-complaints function.

The shadow Minister will be happy to hear that later this month the child safety duties will be in force. We expect that children will see a positive change to their online experience. Services likely to be accessed by children will be required to take measures to prevent children from seeing pornography and content that promotes, encourages or provides instructions for suicide, self-harm and eating disorders. The services will also have to protect children from other types of harmful content, including violent, abusive and bullying content. Under Ofcom’s finalised child safety codes, services will need to take certain steps, including introducing robust age checks, such as photo ID matching or facial age estimation, filtering out harmful content from algorithms, and ensuring that reporting mechanisms are accessible and easier to use.

As I said earlier, this is a dynamic, fast-moving and developing world. For that reason, we will continue to keep the harms under review, and constantly evaluate and review evidence of new and emerging harms. As the Secretary of State has said, we will not shy away from introducing further legislation to ensure that our children are safe online. In conclusion, I hope all Members agree on the importance of these regulations, and I commend them to the Committee.

Question put and agreed to.