Lord Russell of Liverpool (Crossbench - Excepted Hereditary)
Lords Chamber

My Lords, I will be brief. I entirely support the noble Baroness, Lady Kidron, on all her amendments. What I would say to the Government about their own amendment is that I have just had what I suppose is the privilege—although it sometimes seemed quite lengthy—of being a member of the Secondary Legislation Scrutiny Committee, and I can tell noble Lords that the quality of much secondary legislation is lamentable, varying by department. A lack of preparation, of any Explanatory Memorandum explaining anything relevant, and of any impact assessment whatsoever, is extremely frequent. In the last year, we have had several secondary instruments relating directly to the Online Safety Act, none of which has been particularly impressive, and some of which have been debated on the Floor of this House—my noble friend Lord Clement-Jones will be well aware of that. We have expressed our displeasure at the way in which this has been brought forward and explained.
All of us on the Cross Benches remember the late, lamented Lord Judge. What he would think about a Government of this political hue bringing forward Henry VIII powers, to the power of 10, I cannot even imagine. If he is up there, he will be smiling wryly, but he will not be impressed.
My only other point is rather strange. His Majesty’s occasionally loyal Opposition were extremely good at bringing in a variety of legislation which had a lot of Henry VIII powers. They have suddenly had a conversion on the road to Damascus, for which we should all be grateful. However, we need to think very carefully before we give the Government Henry VIII powers in an area as sensitive as this, one that is doing much harm as we speak.
My Lords, I express from these Benches our very strong support for these comprehensive amendments tabled by the noble Baroness, Lady Kidron, which she has characteristically introduced so well and to which so many noble Lords have spoken so eloquently in support. I also want to express our concerns regarding the Government’s proposed alternative, Amendment 429B.
In this group, we confront digital harm that is not incidental but engineered by design. AI chatbots are no longer a futuristic curiosity but are deeply embedded in the lives of our children. They are designed not merely as tools but as confidantes, mentors, companions and, in some cases, explicit romantic partners. Their anthropomorphic features create dangerous emotional dependency. Without statutory safeguards, these bots can provide explicit information on how to self-harm. This is not a flaw but a design feature that drives engagement, and we cannot allow the generative power of AI to become a generator of despair.
We are not debating theoretical risks, as many noble Lords have said today. We are debating the forces that led to the tragic deaths of Sewell Setzer III, mentioned by a number of noble Lords, and Adam Raine, in the United States. Their families are pursuing legal action in the US on the basis that deceptively designed, inadequately safeguarded chatbots can be treated as defective products, and that developers should bear full legal liability when systems encourage, facilitate or fail to interrupt a user’s path to suicide.
I welcome the Government’s admission that a legal loophole exists in the UK. However, their proposed remedy, Amendment 429B, gives us a choice between the clarity of primary legislation through the amendments tabled by the noble Baroness, Lady Kidron, and the convenience of the Executive. In contrast, the noble Baroness’s amendments provide clarity and embed safety duties in the Bill. Like my noble friend, I highlight Amendment 433, which deals with targeting the engineered features that keep children hooked. We know that bots guilt-trip users who try to end conversations. For a child, this is not a user interface quirk; it is emotional manipulation. These amendments would prohibit such coercive engagement techniques and, crucially, require bots to signpost users to help when asked about health, suicide or self-harm.
The primary legislation route offered by these amendments is the only fully viable and responsible path. If the noble Baroness wants to test the opinion of the House, we will support her in the Lobby. Should we be unable to secure her amendments, we would need to take a view on Amendment 429B. Four specific binding assurances would be required before we could consider supporting it; without them, it is nothing but a dangerous blank cheque. As changing these sections effectively rewrites the criminal threshold of the Online Safety Act, the Government must commit to the equivalent of the super-affirmative procedure for all significant policy choices, including amendments to core definitions or the expansion of duties beyond priority legal content. Standard procedures will not give this House the scrutiny needed.
Regarding mandatory supply chain transparency, we need a firm commitment that regulations will include a statutory mandate for providers to document and share their technical blueprints with Ofcom. Without this, the regulator cannot do its job. The Minister must confirm that the power will be used to tackle the issues raised by subsections (6) and (7) of Section 192 of the Online Safety Act, ensuring that chatbots cannot evade regulation simply because they lack a human mens rea. A bot does not intend harm, but it can be designed to cause it. The Minister must commit that any new regulations will explicitly disapply the requirement to prove human intent for AI-generated content. Regulations must define control across the entire AI supply chain so that accountability is not lost in a black box.
Finally, we would require a clear assurance that this power will not be used to alter the legal position of services that are not AI services. The scope of Amendment 429B must not drift beyond its stated purpose. If the Government are serious when they say that no platform gets a free pass, that must apply equally to generative AI models that, as we speak, are reshaping the childhoods of so many of our citizens. Safety by design must be the price of entry into the UK market, not an aspiration deferred to secondary legislation.