Thursday 4th December 2025

Grand Committee
Considered in Grand Committee
13:17
Moved by
Baroness Lloyd of Effra

That the Grand Committee do consider the Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2025.

Relevant document: 40th Report from the Secondary Legislation Scrutiny Committee

The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Innovation and Technology (Baroness Lloyd of Effra) (Lab)

My Lords, these regulations were laid before the House on 21 October this year. Before I proceed further, I draw the Committee’s attention to a correction slip issued for these regulations in October for minor drafting changes related to the date of the Sexual Offences Act 2003 in the Explanatory Notes and the order of words for the title of an offence inserted by paragraph 2 of the regulations.

The Government remain firmly committed to tackling the most serious and harmful online behaviours. This statutory instrument strengthens the Online Safety Act by designating cyber flashing and content that encourages self-harm as new priority offences. By doing so, we are ensuring that platforms take more proactive steps to protect users from these serious harms.

Evidence shows that cyber flashing and material promoting self-harm are widespread and cause significant harm, particularly among younger age groups. In 2025, 9% of 18 to 24-year-olds reported experiencing cyber flashing and 7% encountered content encouraging self-harm in a four-week period. That equates to around 530,000 young adults exposed to cyber flashing and 450,000 to self-harm content. This is unacceptable.

Further, 27% of UK users exposed to cyber flashing reported significant emotional discomfort. There is also compelling evidence that exposure to self-harm content worsens mental health outcomes. A 2019 study found that 64% of Instagram users in the US who saw such content were emotionally disturbed by it. Another study in 2018 revealed that 8% of adults and 26% of children hospitalised after self-harming had encountered related content online. These figures underline that these are not marginal issues—they are widespread and deeply harmful.

As noble Lords will know, the Online Safety Act, which received Royal Assent on 26 October 2023, imposes strong duties on platforms and search services to protect users. Providers must assess the likelihood that their services expose users to illegal content or facilitate priority offences, and then take steps to mitigate those risks; these include safety by design measures and robust content moderation systems.

The Act sets out a list of priority offences for the purposes of illegal content duties. These represent the most serious and prevalent forms of online illegal activity. Platforms must take additional steps to address these offences under their statutory duties. This statutory instrument adds cyber flashing and content encouraging self-harm to the list of priority offences. Currently, these offences fall under the general illegal content duties. Without priority status, platforms are not required to conduct specific risk assessments or implement specific measures to prevent exposure to these harms; that is why we are adding them as priority offences.

Stakeholders have strongly supported these changes. Organisations such as the Molly Rose Foundation and Samaritans have long called for greater protection for vulnerable users. These changes will come into force 21 days after the regulations are made, following approval by both Houses. Ofcom will then set out in its codes of practice the measures that providers should adopt to meet their duties. The strengthened safety duties will take full effect once Ofcom has updated its codes to reflect the measures that can be taken to fulfil them.

We expect Ofcom to recommend actions such as enhanced content moderation; improved reporting and complaints systems; and safety by design measures, for example testing algorithms to ensure that illegal content is not being promoted. If providers fail to meet their obligations and fail to take proportionate steps to stop this vile material being shared on their services, Ofcom has strong powers to enforce compliance, including the power to issue fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher.

This statutory instrument upgrades cyber flashing and self-harm content to priority status, reinforcing the Online Safety Act’s protections. Service providers will be required to take more proactive and robust action to detect, remove and limit exposure to these harmful forms of illegal content. This will help ensure that platforms take stronger steps to protect users, reduce the prevalence of these behaviours online and make the internet safer for all. I beg to move.

Lord Addington (LD)

My Lords, I hope this is one of those occasions when we agree that what is coming here is a good thing—something that is designed to deal with an evil and thus is necessary. I want just to add a bit of flesh to the bones.

If we have regulation, we must make sure—as we are doing now—that it is enforced. I congratulate the Government on the age-verification activities that were reported on this morning, but can we get a little more about the tone, let us say, with which we are going to look at future problems? The ones we have here—cyber flashing and self-harm—are pretty obviously things that are not good for you, especially for younger people and the vulnerable.

I have in front of me the same figures on those who have experienced disturbing reactions to seeing these things, especially when they did not want to see them. Self-harm is one of those things; it makes me wince even to think about it. Can we make sure that not only those in the industry but those outside it know that action will be taken? How can we encourage more reporting? If we do not have a degree of awareness, reporting and everything else gets a bit slower. How do we make sure that everybody who becomes a victim of this activity knows what is being done about it?

It is quite clear that the platforms are responsible; everybody knows that. It is about knowing that something is going on and being prepared to take action. That is how we will make sure not only that this behaviour is unacceptable and that action will be taken but that everybody knows it, gets in on the act and reports it.

I could go on for a considerable length of time, and I have enough briefing to do so, but I have decided that the Grand Committee has not annoyed me enough to indulge in that today. I congratulate the Minister, but a little more flesh about the action and its tone, and what we expect the wider community to do to make sure this can be enacted, would be very helpful here. Other than that, I totally welcome these actions. Unpleasant as it is that they are necessary, I welcome them and hope that the Government will continue to do this. We are always going to be playing a little bit of catch-up on what happens, but let us make sure that we are running fast and that what is in front of us does not get too far away.

Viscount Camrose (Con)

My Lords, as we have heard, this instrument amends Schedule 7 to the Online Safety Act 2023 to add cyber flashing and content encouraging self-harm to the list of priority offences. I thank the Minister for setting out some of the most alarming facts and figures associated with those offences.

As well as passing the Online Safety Act, which placed duties on social media sites and internet services to tackle illegal content, the previous Government outlawed cyber flashing and sharing or threatening to share intimate images without consent by amending the Sexual Offences Act 2003. We welcome the draft regulations, which we agree are in line with the Act’s overarching purpose to tackle harmful content online. As has been highlighted, young people are especially vulnerable to cyber flashing and content encouraging self-harm, and we must be proactive in tracking the trends of illegal activity, especially online, and its impact on UK users, to ensure that the law continues to be proportionate and effective.

We therefore support the move to categorise cyber flashing and content encouraging self-harm as priority offences under the Act rather than as relevant offences. We share the Government’s view that this will oblige services to remove such material as soon as they are made aware of it, as well as to prevent it appearing in the first place through risk assessments and specialised measures. However, I feel there are some broader issues that we should take into account, and I would be grateful if the Minister could comment on these.

First, on the use of VPNs, or virtual private networks, to override protections, my belief—I would welcome the Minister’s view on this—is that the Online Safety Act creates an obligation on platforms to prevent users gaining access to content that is inappropriate for them, regardless of any technical workarounds they may be using. In other words, it is not a defence for a platform to claim that the user had deployed a VPN. Can the Minister confirm this? Needless to say, I am seeking not to downplay the VPN issue but merely to establish clearly where responsibility lies for addressing it.

Secondly, on the use of AI in ways that drive self-harm, obviously AI that assists in suicidal ideation or less extreme forms of self-harm is subject to these controls. But where an AI that is not initially designed for a harmful purpose gradually takes on the role of, say, a psychotherapist or—I am told—in some cases a deity, the conditions become highly propitious for self-harm. Can the Minister comment on how the Act’s protections cover these emergent rather than designed properties? The noble Lord, Lord Addington, put this very well in his question too, and I look forward to hearing the Minister’s views on that.

Thirdly, and more generally, online harms are, of course, created faster than the rules that ban them, and a key part of Ofcom’s role is to monitor for gaps in the legislation as they emerge so that rules can adapt as needed. As far as the Government are aware now, what gaps has Ofcom identified so far in the existing legislation, if any?

We therefore support these regulations to strengthen the Online Safety Act and better protect UK users from cyber flashing and content encouraging self-harm. We count on the Government to be proactive in ensuring that legislation is kept updated to tackle the changing ways in which unlawful content spreads, and to be transparent about how the Government and regulators balance the broader considerations mentioned. I look forward to the Minister’s response.

13:30
Baroness Lloyd of Effra (Lab)

My Lords, I thank noble Lords for their broad support for adding these offences to the priority offences list. This is an important step in improving the online safety regime and improving the environment in which we all use the internet, particularly children and vulnerable people. This will help fulfil the Government’s commitment to improving online safety and strengthening protections for women and girls.

On the points made by the noble Lord, Lord Addington, about tone and proactivity, it is really important that we communicate what we are doing, both in the online world and on violence against women and girls in the physical world. We know that we must all do more to tackle misogynistic abuse, pile-ons, harassment and stalking. The Government’s whole approach to tackling violence against women and girls is an active one, with real, serious goals, and we welcome everyone supporting that move forward. For example, Ofcom’s guidance, A Safer Life Online for Women and Girls, sets out the steps that services can take to create safer online spaces, and the Government will set out our strategy for tackling violence against women and girls in due course as part of that. The publication of Ofcom’s report this morning, which sets out the activity that it has taken and will take, will, as the noble Lord says, help raise the profile of what is expected of services and of the urgency and rigour with which these changes must be made.

On the question of VPNs, which we talked about a little earlier, we do not have a huge amount of information or research about their use, particularly their use by young people to circumvent age assurance. We know that there are legitimate reasons to use VPNs, such as security and privacy, but the evidence on their use by young people, whether very young people or older teenagers, is limited. Ofcom and the Government are committed to increasing the research and evidence on how VPNs are being used and on whether age assurance is indeed being circumvented or VPNs are being used for legitimate reasons. That is an important piece of the evidence puzzle in knowing exactly what measures to take subsequently.

Viscount Camrose (Con)

I am particularly interested in whether it is a legitimate defence for a platform to say, “We could not have prevented this access because a VPN was in use”, and therefore whether it falls to the platforms themselves to figure out how to prevent abuse via VPNs.

Baroness Lloyd of Effra (Lab)

I think we may need to have this conversation together with Ofcom. My understanding is that putting these offences on the priority list increases the level of risk assessment that must be done. When a platform carries out its risk assessment, it will have to take into account the ways in which children, young people and other users access its service. If the use of VPNs is a relevant factor, the platform will have to reflect that in the controls it puts in place, on the basis of its knowledge of its user base, and it would perhaps go to the more conservative rather than the more permissive end of controls. That is my understanding; if it is not correct, I will correct it.

Likewise, the noble Lord, Lord Addington, made points about emerging technology and about making sure that this measure is fit for purpose and that we keep all Online Safety Act duties, defences and coverage up to speed with what people are experiencing in their daily lives. Many others have raised the issue of AI chatbots, including what is and is not currently covered under the Act. The Secretary of State has commissioned work on AI chatbot activity to make sure both that there is no gap in coverage and that we are keeping up to speed with the emerging technology.

That is an example of how we want to approach emerging technology: getting the best research and information and, where there are gaps in any area, plugging them. This is the approach we have taken so far, and we are committed to continuing it. Whether through advice, guidance, codes or additional offences, all of those routes are open to us, whatever the technology shows.

Those were the main questions asked. On enforcement, the point is absolutely well made: enforcement works only if platforms and services know that these duties will be enforced. We have been very clear that Ofcom has our backing to carry out enforcement activity, and we have funded the online safety part of Ofcom year on year to ensure that it has the capacity and resources to enforce in this area. We are very committed to continuing to protect children online. We welcome Ofcom’s recent consultation on additional measures to build on its safety codes, including additional protections on live streaming, which many have called for; children should not have harmful content pushed on them and should have age-appropriate experiences.

We remain committed to keeping young people safe online, and we will continue to work closely with campaigners, charities, industry, civil society and Ofcom to achieve this goal. The Secretary of State has also announced that DSIT will support an NSPCC summit at Wilton Park next year to bring together experts and young people to discuss the impact of AI on childhood.

Turning back to the SI under discussion, today’s update is another step towards a safer digital environment—one that protects the most vulnerable and addresses emerging risks. I thank the Committee and, on that basis, I commend these regulations to the Committee.

Motion agreed.