Terminally Ill Adults (End of Life) Bill Debate
Lord Deben (Conservative - Life peer)
Lords Chamber

My Lords, I am grateful to my noble friend for laying such a broad amendment, and obviously I agree with much of what the right reverend Prelate said. It is interesting that this is coming straight after the debate on face-to-face conversations. We are all used to ticking the “I am not a robot” box, but AI now has the ability to create persons, and, if you are not face to face, it is often very difficult to judge whether the person on screen is actually a person. I cannot believe we have got there quite so quickly.
However, it is also important to consider public confidence and understanding at the moment. This is, as we keep saying, such an important life-or-death decision. There is a lack of understanding, and people are potentially worried about the implications, often with regard to employment but also for other purposes. For instance, as I was preparing this, I reflected, as the noble Baroness, Lady Gerada, said, on how your GP uses AI. When Patchs told me recently that the NHS guidance was that I should not take an over-the-counter drug for more than two weeks, I queried it.
However, only yesterday, I thought: was that answer actually from my GP, or was it from an AI tool sitting behind the system? We really need to be careful about the level of public understanding and awareness of its use. This use of AI is also one step on from, and connected to, Clause 42, which relates to advertising. I am grateful that the noble and learned Lord is going to bring forward some amendments on that clause. I hope that the connection with AI, as well as the Online Safety Act 2023, has been considered. If I have understood the noble and learned Lord correctly, I am disappointed that we have had no assurance that those amendments will be with us by the end of Committee, given that, when he gave evidence on 22 October last year, he accepted that there was additional work to be done on Clause 42.
I said at Second Reading that the Bill is currently drafted for an analogue age. I am not seeking to take us back to some kind of quill-pen, no-use-of-AI situation. Obviously, as other noble Lords have said, the Bill does not deal with pressure or coercion that does not come from a human being. Nor does it consider that coercion can now be more hidden with the use of AI. The Bill does not deal with people being able to learn how to answer certain tools by watching YouTube. Therefore, we could be in a situation where someone who would not qualify under a face-to-face, non-AI system could learn those answers and qualify.
There are also good studies suggesting that its use in GP practices has produced some inaccuracies. In many circumstances, there is a lack of transparency and accountability in tracing where a decision has come from. We do not even understand the algorithms that send us advertisements for different shops, let alone how they could be connected to a decision such as this.
Finally, my biggest concern is that there will be a limited number of practitioners who will want to participate in this process. That has been accepted on numerous occasions in your Lordships’ House. I will quote from a public letter written on 12 June last year, to which all of Plymouth’s senior palliative medicine doctors were signatories, warning us of the risks of the Bill and saying that the
“changes would significantly worsen the delivery of our current health services in Plymouth through the complexity of the conversations required when patients ask us about the option of assistance to die”.
That is relevant for two reasons. First, if we have a shortage of practitioners in parts of the country such as the south-west, because those doctors’ opposition to the Bill translates into their not being involved, there may be an increased temptation to resort to greater use of AI. I hope that the noble and learned Lord or the Minister can help on this point.
Secondly, many of these systems—I am speaking as a layperson here—rely on data groups and information within the system: the learning is created from that. If you have a very small pool of practitioners and some form of AI being used, does that not affect the creation of the AI tool itself? I hope that I have explained that correctly: with such a small group doing it, will that not affect the technology itself?
I come to this amendment with a good deal of suspicion. I am always worried when the House of Lords decides that it is getting worried about some new thing that is coming along, so we had better do something about it. The noble Baroness, Lady Coffey, explained that this was a broad demand in order that we should concentrate on the important bit. I recommend that those in the House who were not here for last night’s debate on super-clever AI read it, because it explains why we should be concerned about this. If it will not embarrass him, I shall say that I hope the House will read with care the speech by the right reverend Prelate the Bishop of Hereford, which brought his scientific knowledge and moral concern together in a most interesting and perceptive way. If his quoting of Saint Thomas Aquinas interests people, there is a remarkable book called Why Aquinas Matters Now, which is well worth reading in the context of this particular Bill.
On the first point, as I said, the review has to report in the first reporting period required under Clause 50. That means that it reports probably three years before the Bill comes into force, so there will be no cases. It is not doing what the noble Lord, Lord Sandhurst, was saying. On the second point about wriggling out, what the noble Baroness was describing would also be a Fatal Accidents Act case, so it would be covered, one hopes, by what the review deals with.
My Lords, I come back to the idea of having an inquiry and a report. I do not quite understand why the noble and learned Lord does not feel that it would be much more sensible for us to have it in the Bill. After all, otherwise you are in a sense dictating what the inquiry shall come up with. The only inquiry that you would want to have is one that found an answer to the problem, but you do not know that you will get that if you simply set up an inquiry. I would rather have the answer in the Act to start with, so that we know that those people are protected.
The reason, from discussing and thinking about this issue, is that the Government see the most convenient way of doing it as being to have a review that can make sure that every single aspect is covered. That is the argument for the review.