Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2025

Debate between Lord Addington and Baroness Lloyd of Effra
Thursday 4th December 2025


Grand Committee
The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Innovation and Technology (Baroness Lloyd of Effra) (Lab)

My Lords, these regulations were laid before the House on 21 October this year. Before I proceed further, I draw the Committee’s attention to a correction slip issued for these regulations in October for minor drafting changes related to the date of the Sexual Offences Act 2003 in the Explanatory Notes and the order of words for the title of an offence inserted by paragraph 2 of the regulations.

The Government remain firmly committed to tackling the most serious and harmful online behaviours. This statutory instrument strengthens the Online Safety Act by designating new priority offences aimed at addressing cyber flashing and content that encourages self-harm. By doing so, we are ensuring that platforms take more proactive steps to protect users from these damaging harms.

Evidence shows that cyber flashing and material promoting self-harm are widespread and cause significant harm, particularly among younger age groups. In 2025, 9% of 18 to 24 year-olds reported experiencing cyber flashing and 7% encountered content encouraging self-harm in a four-week period. That equates to around 530,000 young adults exposed to cyber flashing and 450,000 to self-harm content. This is unacceptable.

Further, 27% of UK users exposed to cyber flashing reported significant emotional discomfort. There is also compelling evidence that exposure to self-harm content worsens mental health outcomes. A 2019 study found that 64% of Instagram users in the US who saw such content were emotionally disturbed by it. Another study in 2018 revealed that 8% of adults and 26% of children hospitalised after self-harming had encountered related content online. These figures underline that these are not marginal issues—they are widespread and deeply harmful.

As noble Lords will know, the Online Safety Act, which received Royal Assent on 26 October 2023, imposes strong duties on platforms and search services to protect users. Providers must assess the likelihood that their services expose users to illegal content or facilitate priority offences, and then take steps to mitigate those risks; these include safety by design measures and robust content moderation systems.

The Act sets out a list of priority offences for the purposes of illegal content duties. These represent the most serious and prevalent forms of online illegal activity. Platforms must take additional steps to address these offences under their statutory duties. This statutory instrument adds cyber flashing and content encouraging self-harm to the list of priority offences. Currently, these offences fall under the general illegal content duties. Without priority status, platforms are not required to conduct specific risk assessments or implement specific measures to prevent exposure to these harms; that is why we are adding them as priority offences.

Stakeholders have strongly supported these changes. Organisations such as the Molly Rose Foundation and Samaritans have long called for greater protection for vulnerable users. These changes will come into force 21 days after the regulations are made, following approval by both Houses. Ofcom will then set out in its codes of practice the measures that providers should adopt to meet their duties. The updated safety duties will take full effect once Ofcom has set out the measures that providers can take to fulfil them.

We expect Ofcom to recommend actions such as enhanced content moderation; improved reporting and complaints systems; and safety by design measures—for example, testing algorithms to ensure that illegal content is not being promoted. If providers fail to meet their obligations and fail to take proportionate steps to stop this vile material being shared on their services, Ofcom has strong powers to enforce compliance. These include powers to issue fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher.

This statutory instrument upgrades cyber flashing and self-harm content to priority status, reinforcing the Online Safety Act’s protections. Service providers will be required to take more proactive and robust action to detect, remove and limit exposure to these harmful forms of illegal content. This will help ensure that platforms take stronger steps to protect users, reduce the prevalence of these behaviours online and make the internet safer for all. I beg to move.

Lord Addington (LD)

My Lords, I hope this is one of those occasions when we agree that what is coming here is a good thing—something that is designed to deal with an evil and thus is necessary. I want just to add a bit of flesh to the bones.

If we have regulation, we must make sure—as we are doing now—that it is enforced. I congratulate the Government on the age-verification activities that were reported on this morning, but can we get a little more about the tone, let us say, with which we are going to look at future problems? The ones we have here—cyber flashing and self-harm—are pretty obviously things that are not good for you, especially for younger people and the vulnerable.

I have in front of me the same figures on those who have experienced disturbing reactions to seeing these things, especially if they did not want to see them. Self-harm is one of those things; it makes me wince even to think about it. Can we make sure that not only those in the industry but those outside it know that action will be taken? How can we encourage more reporting? If we do not have a degree of awareness, reporting and everything else gets a bit slower. How do we make sure that everybody who becomes a victim of this activity knows what is going on?

It is quite clear that the platforms are responsible; everybody knows that. It is about knowing that something is going on and being prepared to take action. That is where we will start to ensure not only that this is unacceptable and that action will be taken but that everybody knows, gets in on the act and reports what they see.

I could go on for a considerable length of time, and I have enough briefing to do so, but I have decided that the Grand Committee has not annoyed me enough to indulge in that today. I congratulate the Minister, but a little more flesh about the action and its tone, and what we expect the wider community to do to make sure this can be enacted, would be very helpful here. Other than that, I totally welcome these actions. Unpleasant as it is that they are necessary, I welcome them and hope that the Government will continue to do this. We are always going to be playing a little bit of catch-up on what happens, but let us make sure that we are running fast and that what is in front of us does not get too far away.