Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2025 Debate
Grand Committee
Moved by
Baroness Lloyd of Effra
That the Grand Committee do consider the Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2025.
Relevant document: 40th Report from the Secondary Legislation Scrutiny Committee
The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Innovation and Technology (Baroness Lloyd of Effra) (Lab)
My Lords, these regulations were laid before the House on 21 October this year. Before I proceed further, I draw the Committee’s attention to a correction slip issued for these regulations in October for minor drafting changes related to the date of the Sexual Offences Act 2003 in the Explanatory Notes and the order of words for the title of an offence inserted by paragraph 2 of the regulations.
The Government remain firmly committed to tackling the most serious and harmful online behaviours. This statutory instrument strengthens the Online Safety Act by designating new priority offences to address cyber flashing and content that encourages self-harm. By doing so, we are ensuring that platforms take more proactive steps to protect users from these serious harms.
Evidence shows that cyber flashing and material promoting self-harm are widespread and cause significant harm, particularly among younger age groups. In 2025, over a four-week period, 9% of 18 to 24 year-olds reported experiencing cyber flashing and 7% encountered content encouraging self-harm. That equates to around 530,000 young adults exposed to cyber flashing and 450,000 to self-harm content. This is unacceptable.
Further, 27% of UK users exposed to cyber flashing reported significant emotional discomfort. There is also compelling evidence that exposure to self-harm content worsens mental health outcomes. A 2019 study found that 64% of Instagram users in the US who saw such content were emotionally disturbed by it. Another study in 2018 revealed that 8% of adults and 26% of children hospitalised after self-harming had encountered related content online. These figures underline that these are not marginal issues—they are widespread and deeply harmful.
As noble Lords will know, the Online Safety Act, which received Royal Assent on 26 October 2023, imposes strong duties on platforms and search services to protect users. Providers must assess the likelihood that their services expose users to illegal content or facilitate priority offences, and then take steps to mitigate those risks; these include safety by design measures and robust content moderation systems.
The Act sets out a list of priority offences for the purposes of illegal content duties. These represent the most serious and prevalent forms of online illegal activity. Platforms must take additional steps to address these offences under their statutory duties. This statutory instrument adds cyber flashing and content encouraging self-harm to the list of priority offences. Currently, these offences fall under the general illegal content duties. Without priority status, platforms are not required to conduct specific risk assessments or implement specific measures to prevent exposure to these harms; that is why we are adding them as priority offences.
Stakeholders have strongly supported these changes. Organisations such as the Molly Rose Foundation and Samaritans have long called for greater protection for vulnerable users. These changes will come into force 21 days after the regulations are made, following approval by both Houses. Ofcom will then set out in its codes of practice the measures that providers should adopt to meet their duties. The strengthened safety duties will take full effect once Ofcom has updated those codes to reflect the new priority offences.
We expect Ofcom to recommend actions such as enhanced content moderation; improved reporting and complaints systems; and safety by design measures—for example, testing algorithms to ensure that illegal content is not being promoted. If providers fail to meet their obligations and fail to take proportionate steps to stop this vile material being shared on their services, Ofcom has strong powers to enforce compliance, including the power to issue fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher.
This statutory instrument upgrades cyber flashing and self-harm content to priority status, reinforcing the Online Safety Act’s protections. Service providers will be required to take more proactive and robust action to detect, remove and limit exposure to these harmful forms of illegal content. This will help ensure that platforms take stronger steps to protect users, reduce the prevalence of these behaviours online and make the internet safer for all. I beg to move.
Lord Addington (LD)
My Lords, I hope this is one of those occasions when we agree that what is coming here is a good thing—something that is designed to deal with an evil and thus is necessary. I want just to add a bit of flesh to the bones.
If we have regulation, we must make sure—as we are doing now—that it is enforced. I congratulate the Government on the age-verification activity that was reported on this morning, but can we hear a little more about the tone, let us say, with which we will approach future problems? The ones we have here—cyber flashing and self-harm—are pretty obviously things that are not good for you, especially for younger people and the vulnerable.
I have in front of me the same figures on those who have experienced disturbing reactions to seeing these things, especially when they did not want to see them. Self-harm is one of those things; it makes me wince even to think about it. Can we make sure that not only those in the industry but those outside it know that action will be taken? How can we make reporting more widespread? Without a degree of awareness, reporting and everything else gets slower. How do we make sure that everybody who becomes a victim of this activity knows that it is going on?
It is quite clear that the platforms are responsible; everybody knows that. It is about knowing that something is going on and being prepared to take action. That is how we start to make sure not only that this behaviour is unacceptable and that action will be taken but that everybody knows it, gets involved and reports what they see.
I could go on for a considerable length of time, and I have enough briefing to do so, but I have decided that the Grand Committee has not annoyed me enough to indulge in that today. I congratulate the Minister, but a little more flesh on the bones of the action and its tone, and of what we expect the wider community to do to make sure this can be enacted, would be very helpful. Other than that, I totally welcome these actions. Unpleasant as it is that they are necessary, I welcome them and hope that the Government will continue in this vein. We will always be playing a little catch-up on what happens, but let us make sure that we are running fast and that what is in front of us does not get too far away.
Baroness Lloyd of Effra (Lab)
My Lords, I thank noble Lords for their broad support for adding these offences to the priority offences list. This is an important step in improving the online safety regime and the environment in which we all use the internet, particularly for children and vulnerable people. It will help fulfil the Government’s commitment to improving online safety and strengthening protections for women and girls.
On the points made by the noble Lord, Lord Addington, about tone and proactivity, it is really important that we communicate what we are doing, both in the online world and on violence against women and girls in the physical world. We know that we must all do more to tackle misogynistic abuse, pile-ons, harassment and stalking, and the Government’s whole approach to tackling violence against women and girls is an active one, with real, serious goals. We welcome everyone’s support in moving that forward. For example, the publication of Ofcom’s guidance, A Safer Life Online for Women and Girls, sets out the steps that services can take to create safer online spaces, and the Government will set out our strategy for tackling violence against women and girls in due course as part of that. I think that the publication of Ofcom’s report this morning, which sets out the activity that it has taken and will take, will help raise the profile, as the noble Lord says, of what is expected of services and of the urgency and rigour with which these changes must be made.
On the question of VPNs, which we talked about a little earlier, we do not have a huge amount of information or research about their use, particularly by young people to circumvent age assurance. We know that there are legitimate reasons to use VPNs, such as security or privacy, but we lack evidence on how far young people, whether very young or older teenagers, are using them. Ofcom and the Government are committed to increasing the research and evidence on how VPNs are being used and on whether they are indeed a way of circumventing age assurance or are being used for legitimate reasons. That is an important piece of the evidence puzzle in knowing exactly what measures to take subsequently.
I am particularly interested in whether it is a legitimate defence for a platform to say, “We could not have prevented this access because a VPN was in use”, and therefore whether it falls to the platforms themselves to figure out how to prevent abuse via VPNs.
Baroness Lloyd of Effra (Lab)
I think we may need to have this conversation together with Ofcom. My understanding is that putting these offences on the priority list increases the level of risk assessment that must be done. When a platform is doing its risk assessment, it will have to take into account the ways in which children, young people and other users access the service. Depending on what the service provides and how people access it, if VPN use is a factor that needs taking into account, the platform would have to reflect that in the controls it puts in place, on the basis of its knowledge of its user base. It would therefore, perhaps, go to the more conservative rather than the more permissive end of controls. That is my understanding, and if it is not correct, I will correct it.
Likewise, the noble Lord, Lord Addington, made points about emerging technology and about making sure that this measure is fit for purpose and that we keep all Online Safety Act duties, defences and coverage up to speed with what people experience in their daily lives. Many others have raised the issue of AI chatbots, including what is and is not currently covered under the Act. The Secretary of State has commissioned work on AI chatbot activity to make sure both that there is no gap in coverage and that we are keeping up to speed with the emerging technology.
That is an example of how we want to approach emerging technology: making sure that we are getting the best research and information and, if there are gaps in any areas, plugging them. This is the approach we have taken so far, and it is one we are committed to continuing. Whether through advice, guidance, codes or additional offences, all of those routes are open to us, whatever the technology shows.
Those were the main questions asked. On enforcement, the point is absolutely well made: regulation works only if platforms and services know that it will be enforced. We have been very clear that Ofcom has our backing to carry out enforcement activity. We have funded the online safety part of Ofcom year on year to ensure that it has the capacity and resources it needs for enforcement in this area. We are very committed to continuing to protect children online. We welcome Ofcom’s recent consultation on additional measures to build on its safety codes, including additional protections on live streaming, which many have called for, as children should not have harmful content pushed at them and should have age-appropriate experiences.
We remain committed to keeping young people safe online, and we will continue to work closely with campaigners, charities, industry and Ofcom to achieve this goal, as well as with civil society more widely and all those with an interest. The Secretary of State has also announced that DSIT will support an NSPCC summit at Wilton Park next year to bring together experts and young people to discuss the impact of AI on childhood.
Turning back to the SI under discussion, today’s update is another step towards a safer digital environment—one that protects the most vulnerable and addresses emerging risks. I thank noble Lords for their contributions and, on that basis, I commend these regulations to the Committee.