Lord Davies of Gower
Lords Chamber

My Lords, I welcome the Government’s technical amendments. We spent some time in Committee debating the definition of a “thing” used to generate horrific CSA images. I am pleased that the Government have tabled Amendment 201 to clarify that a “thing” explicitly includes a service.
Modern AI is not just a program sitting on a hard drive but an ephemeral, cloud-based service. By adopting this broader language, we ensure that those who provide the underlying infrastructure for CSA image generation cannot evade responsibility through technical loopholes. These may appear to be technical drafting changes, but they provide the necessary teeth for the primary offences in Clauses 65 to 67.
My Lords, the government amendments in this group are largely consequential and minor drafting changes. They relate to the important topic of child sexual abuse image generators. I have little to say to this group other than that the topic which they address is one of serious and urgent concern.
The rapid emergence of generative AI has presented new and troubling challenges. The recent Grok AI scandal, in which an AI model generated harmful sexual content publicly, some of which involved children, highlighted the potential for mainstream tools to be misused in ways that normalise or distribute abusive material. That episode underlines why robust legal safeguards are essential as technology evolves.
The Government have continued to delay passing legislation regarding AI regulation, which was alluded to as far back as 2024. I thank the Minister for his assurances that the Government will continue to monitor developments in this area and work with industry to protect children from abuse and exploitation.
My Lords, from these Benches, I strongly support Amendment 209, which was so convincingly spoken to by the noble Baroness, Lady Kidron. I was very pleased to have signed it, alongside the noble Lord, Lord Russell of Liverpool, and the noble Baroness, Lady Morgan of Cotes.
This amendment is a vital safeguard against the “innovation first, safety later” culture of big tech. Although the Bill will rightly prohibit the creation of models specifically designed to generate CSA images, it remains silent on general-purpose models that can be easily manipulated or jailbroken to produce the same horrific results. As the unacceptable use of tools such as Grok—referred to by my noble friend Lady Benjamin in her powerful speech—has recently illustrated, we cannot leave the safety of our children to chance. We face a technological and moral emergency. The Internet Watch Foundation, represented at the meeting today which the noble Lord, Lord Russell, and my noble friend mentioned, has warned of a staggering 380% increase in confirmed cases of AI-generated child exploitation imagery. The noble Lord, Lord Russell, is right that the extent of this abuse is sickening beyond imagination.
The amendment would mandate a safety-by-design intervention, requiring providers to proactively risk-assess their services and report identified risks to Ofcom within 48 hours. In Committee, the Minister, the noble Lord, Lord Hanson, pushed back against this proposal, arguing that it
“would place unmanageable and unnecessary operational burdens on … the National Crime Agency and Ofcom”.—[Official Report, 27/11/25; col. 1533.]
He further claimed that these measures risk creating “legal uncertainty” by “duplicating” the Online Safety Act. Both assertions need rebutting. First, protecting children from an industrial-scale explosion of AI-generated abuse is not an unnecessary burden; it is the primary duty of our law enforcement and regulatory bodies. Secondly, we cannot rely on the theoretical protections of an Online Safety Act designed for a world before generative AI. Ofcom itself has maintained what might be called a tactical ambiguity about how the Act applies to stand-alone AI chatbots and large language models.
Alongside the noble Baroness, Lady Kidron, whom we will support if she puts the amendment to a vote, we ask for an ex ante duty: providers must check whether their models can be used to generate CSAM before they are released to the public. Voluntary commitments and retrospective enforcement are simply not enough. The Government have already committed to this principle; it is time to put that commitment into statute. I urge the Minister to accept Amendment 209 and ensure that we move away from ex post measures that address harm only after a child has been victimised.
The current definitions of “search” and “user-to-user” services do not neatly or comprehensively capture these new generative technologies. We cannot allow a situation where tech developers release highly capable models to the public without first explicitly checking whether they can be used to generate CSAM. That is why this explicit statutory duty belongs in the Bill today.
My Lords, Amendment 209, in the name of the noble Baroness, Lady Kidron, would require providers of relevant online services to assess and address the risks that their platforms may be used for the creation, sharing or facilitation of child sexual abuse material, placing a strengthened duty on them to take preventive action. More than anyone in this Chamber, I fully recognise the intention behind strengthening preventive mechanisms and ensuring that providers properly assess and mitigate risks to children. Requiring companies to examine how their services may facilitate abuse is, in principle, entirely sensible. The scale and evolving nature of online exploitation means that proactive duties are essential.
However, I have some concerns about the proposed mechanism, on which I hope the Minister may also be able to provide some input. The amendment appears to rely on providers conducting their own risk assessments. That immediately raises several practical questions, such as what objective standard those assessments would be measured against, whether there would be statutory guidance setting out minimum criteria, and how consistency would be ensured across companies of vastly different sizes and capabilities. There also remains the crucial question of what enforcement mechanisms would apply if an assessment was superficial or inadequate. Without clear parameters and oversight, there is a danger that such a system could become uneven in practice.
I would welcome reassurance from the Minister as to how the Government intend to ensure that risk-based duties in this space are transparent and robust for the purposes of child protection. The question is not whether we act, but how. We all share the same objective of reducing the prevalence of child sexual abuse material and protecting children from exploitation. The challenge is ensuring that the mechanisms we legislate for are clear and enforceable in practice. I look forward to the Minister’s response.
I am grateful to the noble Baroness, Lady Kidron, for tabling Amendment 209 and for her commitment to doing all we can to prevent online harms. I was struck strongly by the contributions from the noble Baronesses, Lady Benjamin and Lady Bertin, the noble Lords, Lord Pannick and Lord Russell of Liverpool, my noble friend Lord Stevenson of Balmacara and the noble Earl, Lord Erroll.
This is a really serious issue. The Government are committed to making sure that we have constructive engagement with the noble Baroness, as I have tried to do, including one formal and one informal meeting this very day, to ensure that we can make this work in the interests of what everybody in this House wants to do: to ensure, particularly given the rapid development of technology, that the public, and especially children, are safeguarded from harm. This Government are committed to tackling sexual exploitation and abuse and ensuring that new technologies are developed and deployed responsibly. I know that that matters; I know that it is important, and I know that this Government want to make sure that we deal with it.
A few weeks ago, the Grok AI chatbot was used to create and share vile, degrading and non-consensual intimate deepfakes. This House should ensure that no one lives in fear of having their image sexually manipulated by technology. From the Prime Minister to the DSIT Secretary, the Government said at the time that we would act to stamp out this demeaning and illegal image production.
My Lords, it is a pleasure to follow the wise words of the noble Lord, Lord Stevenson. Let me say from the outset that, in principle, on these Benches we conditionally support Amendment 239A, which has been spoken to so powerfully by the noble Lord, Lord Nash.
The noble Lord very clearly set out the urgent issues involved, as did my noble friend Lady Benjamin and the noble Lord, Lord Russell, and all of us who were there in the same meeting which we have referred to before. We are at a technological and moral crisis point, as we have debated in a previous group regarding child sexual abuse material online. We face a children’s mental health catastrophe, and the ubiquity of child sexual abuse material is a central driver of that catastrophe.
The noble Lord, Lord Nash, has explained that his amendment would mandate that manufacturers and importers of smartphones and tablets ensure their devices satisfy a CSAM requirement to prevent the creation, viewing, and sharing of such material.
The question, however, clearly arises as to whether this would undermine encryption or privacy. We recognise that the noble Lord, Lord Nash, in his revised Amendment 239A, does indeed include a duty of privacy in his regulations. In my view, the thing to avoid is the chance that a technological fix of this kind could involve some degree of surveillance. I do agree with the noble Lord, Lord Russell, that, at first sight, the technology looks extremely promising, as the noble Lord, Lord Stevenson, mentioned, but, before taking this further, we need to be absolutely sure about the robustness of this technology and its impact on privacy.
By requiring software to be preloaded at the system level, we would move away from the model of parental controls and platform responsibility, and we would place the duty on the manufacturers who profit from these devices. Quite apart from that, we do, of course, also need to ensure that the platforms take action.
The Minister may promise further consultation, but we do not need much more consultation to know that the status quo is failing; we need to find a solution now rather than playing an endless game of digital catch-up. As others have urged, I hope that the Government will take a look at this proposal urgently, closely and seriously.
My Lords, this group of amendments addresses one of the gravest and most distressing areas of criminality: the sexual exploitation of children and the creation and circulation of child sexual abuse material. There will be no disagreement among noble Lords about the objective behind these amendments. The scale of this crime is deeply alarming and becoming increasingly technologically sophisticated. The question before us is not whether we act but how.
I turn to the amendments in the name of my noble friend Lord Nash. Once again, I entirely understand and support the underlying aim. The goal of ensuring that devices supplied in the UK have highly effective, tamper-proof system software capable of preventing the transmission or viewing of CSAM is a commendable one. Preventing abuse at source is always preferable to prosecuting it after the harm has occurred.
I recognise that Amendment 239A includes express provisions intended to safeguard user privacy, requiring that any such software must operate in a way that does not collect, retain, copy or transmit data outside the device, nor determine the identity of the user. It also provides for affirmative parliamentary approval of the regulations.
However, it is still hard to overlook the practical challenges that may arise from this amendment. Determined offenders frequently exploit encrypted platforms and modify operating systems, often using overseas-hosted services. A requirement limited to devices supplied for use in the UK could be circumvented by overseas purchases or software alterations. Even with privacy safeguards written into the regulation-making power, this amendment may still raise complex issues relating to encryption, cyber security, technical feasibility and enforcement. Mandating tamper-proof software across all relevant devices would represent a significant expansion of the regulatory framework established under the Product Security and Telecommunications Infrastructure Act 2022.
While I strongly support the objective of forestalling child sexual exploitation and disrupting the circulation of abuse material, I am not yet persuaded that this amendment provides a workable legislative solution. I look forward to hearing from the Minister how the Government are strengthening preventative technology and ensuring that industry plays a meaningful role in protecting children, while maintaining a framework that is technically feasible and legally robust.
I am grateful to the noble Lord, Lord Nash, for setting out his amendments. I know that he met last week with the Minister, my noble friend Lady Lloyd, and I hope that was a productive discussion. I was pleased to meet with him as well—I have lost track of the date, but it was some time in the last few months—when he graciously brought along representatives of companies that are developing the technology he talked about today. I found that meeting useful.
I acknowledge the noble Lord’s intention to protect children through this amendment, and I want to be clear, as I was on the previous amendment, that the Government share the ambition to protect children from nude imagery and prevent the spread of CSAM online. I hope that my response to the noble Baroness, Lady Kidron, showed that this is a matter the Government are taking seriously. That is why, in the violence against women and girls strategy, we have made it clear that we want to make it impossible for children in the UK to take, share or view nude images. We strongly agree that nudity detection on a device is an effective way in which this could be achieved.
My Lords, I thank all noble Lords for their contributions to what has been an important and, at times, deeply sobering debate. I place on record my sincere thanks to my noble friend Lady Owen, who has been tireless in campaigning on these issues inside and outside this House. In Committee, noble Lords from across the House recognised not only the seriousness of the harm caused by non-consensual intimate images but the persistence and expertise she has brought to improving the law in this area. That work has already borne fruit in previous legislation, and it continues to shape the debate constructively here.
It is also pleasing to hear the Government agreeing with much of what my noble friend Lady Owen has said. The Prime Minister made absolutely no mention of her work when he announced the 48-hour takedown policy, and we all know that that success lies with her, so I am pleased the Minister has rectified that today. My noble friend has also highlighted an inconsistency in the Government’s position. If they are to enact the 48-hour takedown policy, they will need to establish a central hash register, given the gap between what Ofcom is able to do and what would be required to enact the Prime Minister’s announcement.
These proposals relating to hashing and the establishment of a statutory non-consensual intimate image register build on existing voluntary initiatives, including work undertaken by the Revenge Porn Helpline. In Committee, there was recognition across the House that hashing technology has already proven effective in tackling child sexual abuse material and that extending similar mechanisms to adult victims of intimate image abuse merits serious consideration. But, more than that, they are essential to enacting the Government’s own recently announced policy.
The proposal to require deprivation and deletion orders following conviction is, surely, the logical conclusion of the existence of the offence. If it is an offence for these images to be made and shared, then a court should require their deletion.
The amendments concerning screenshotting, copying of temporarily shared images, and the creation or distribution of degrading material are also rooted in the lived experience of many individuals, particularly young women and girls. Technology has outpaced the assumptions underpinning older offences. As my noble friend has argued, consent given for a time-limited viewing is not consent to permanent capture, nor should the law allow perpetrators to evade liability through technical loopholes.
Finally, on Amendment 277, we are supportive of the proposed expansion of the voyeurism offence to include where a person records non-consensual images of a person with the intent of obtaining sexual gratification. It is appalling that people can film others without their knowledge and consent and use those images for their own nefarious purposes.
I also thank the Government for their welcome engagement with my noble friend on these matters. It has been clear, both in Committee and since the Ministers met with my noble friend and other stakeholders, that there has been constructive cross-party dialogue. This is reflected in the numerous amendments they have tabled in this group to similar effect. That spirit of collaboration is to be commended. These issues, which concern dignity, privacy, exploitation, and protection from abuse, should never be partisan. I am therefore grateful for what has been achieved up to this point.