Draft Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2025 Debate
It is a pleasure to serve under your chairmanship, Mr Vickers.
This statutory instrument represents an important development in the obligations on platforms regulated under the Online Safety Act to protect people from encountering illegal content online. The OSA was enacted by the last Government with the primary aim of safeguarding children and removing serious illegal material from the internet. Tackling the most harmful content, such as that which is the subject of today’s discussion, goes to the heart of the Online Safety Act’s aims. His Majesty’s Opposition therefore welcome and support the draft regulations.
The experiences and opportunities offered by the online world change rapidly. It is right that legislators are responsive when new risks emerge or when certain types of unlawful content proliferate on the internet. Under the last Government, the OSA amended the Sexual Offences Act 2003 to criminalise several forms of sexual misconduct and abusive behaviour online. The new offences included cyber-flashing and the sharing of or threatening to share intimate images without consent. The amendments were made to keep pace with novel threats and forms of abuse, the victims of which are too often women and girls.
Baroness Bertin’s independent review of pornography, which was published in February this year, highlighted the damaging impact on victims of intimate image abuse, ranging from physical illness to mental health effects such as anxiety, depression, post-traumatic stress disorder and suicidal thoughts. The effects of cyber-flashing and intimate image abuse on victims are severe. It is therefore right that this statutory instrument brings cyber-flashing within the scope of the priority offences in schedule 7 to the Online Safety Act, while retaining as a priority offence the sharing of or threatening to share intimate images.
We also strongly support the addition as a priority offence of encouraging or assisting serious self-harm, which is the other important component of this statutory instrument. Desperate people who contemplate self-harm need early intervention and support, not encouragement. Under this SI, regulated services will be obliged to proactively remove such material when they become aware of it on their platforms and to take measures to prevent it from appearing in the first place. One can only wonder why it has taken so long to get to this position. I am sure there will be a unanimous view, not only in the House but in society, on the importance of removing such material.
The regulations will work only if they are adopted by the industry and subject to rigorous oversight, coupled with enforcement when platforms fail in their obligations. That is a necessity, and it is why we had to introduce the Online Safety Act in the first place. It is right that Government regulators should look to identify obstacles to the implementation of the OSA and take action where necessary. Since the introduction of Ofcom’s protection of children codes in the summer, important questions have arisen around the use of virtual private networks to circumvent age verification, as well as data security and privacy in the age-verification process.
Peter Fortune (Bromley and Biggin Hill) (Con)
On that point, does my hon. Friend the shadow Minister agree that we need to give some thought to the rise of chatbots and their nefarious activity, especially where they encourage self-harm or encourage children to do worse?
I thank my hon. Friend for his question on a very important point, which was raised just last week in Department for Science, Innovation and Technology questions by my hon. Friend the Member for Harrow East (Bob Blackman) and others. The Lib Dem spokesperson, the hon. Member for Harpenden and Berkhamsted, also raised questions about the scope of the regulations as they apply to chatbots.
The Government seem all over the place as to whether content generated by large language models, as we understand them, comes within the scope of regulation. Given the response we received last week, it would be helpful to have some clarity from the Minister. Does he believe that LLMs are covered by the OSA when it comes to material encouraging self-harm? If there is a gap, what is he going to do about it? I recognise that he is commissioning Ofcom to look at the issue, but in his view, right now, is there a gap that will need to be fixed? What are his reflections on that? This is increasingly becoming a priority area that we need to resolve. If there is a gap in the legislation, we need to get on and sort it.