Terminally Ill Adults (End of Life) Bill

Lord McCrea of Magherafelt and Cookstown Excerpts
Friday 30th January 2026


Lords Chamber
Lord Polak (Con)

My Lords, I support the amendments in this group, especially the one from my noble friend Lord Evans. I was not going to speak but I was moved by what the noble Baroness, Lady Smith, said about her father.

I am not a Luddite. My mother passed away in July 2023 from brain cancer, and this debate has reminded me of the Zoom call we had to look at the next stage of her treatment. I was here in London; my sister was with my mother in Liverpool, where she was lying in bed unable to speak. The nurse who was looking at the next stage of treatment for her was in Margate, had never met my mother, and was asking questions for over an hour to which my mother could not reply. I have listened to this whole debate, and if we cannot put face-to-face consultation in the Bill, we are doing a great injustice to many people.

Lord McCrea of Magherafelt and Cookstown (DUP)

My Lords, earlier in the debate, the noble Baroness, Lady Jay of Paddington, intervened to say that she could not understand why, having talked so much, we had not actually talked about terminal illness. If the noble Baroness remains in her place, she will be here for the fifth group of amendments, on terminal illness, and there will certainly be a lot of discussion then of that issue. In fact, if we were speaking to that group of amendments now, we would be told by the Whip to address instead the amendments before us.

In the light of my experience as a minister, dealing with the general public from another side, I gently say to the noble Baroness, who was advocating for online assessments, that, if they are so perfect, why are so many mistakes made? Should we just dismiss those mistakes?

--- Later in debate ---
Baroness Hayman (CB)

My Lords, I will briefly follow on from the noble Lord on the issue of overengineering. I had great sympathy with the words of the noble Baroness, Lady Blackstone, and I suspect that there is widespread support in the Committee that face-to-face consultations should, in general practice and in the norm, be what happens in these circumstances. We get into great difficulty when we micro-legislate to cover every single circumstance that might occur. A code of practice is a more reasonable and flexible document to deal with this. The noble Lord shakes his head, but he just spoke about the dangers of having anyone else in the room in a consultation because of the possibility of coercion, yet the noble Baroness, Lady Smith, spoke potently about how important it was for there to be a family member, or support, or someone who could hear.

Lord McCrea of Magherafelt and Cookstown (DUP)

I was not speaking against someone being in the room. I am speaking about someone being in the room when the consultation is on Zoom or on camera and not in person, because you do not know whether the person in the room is privately and secretly coercing that person.

Baroness Hayman (CB)

I understand that the noble Lord was talking about a subset of consultations, but this is my point: I think he accepted that there might, in any process, be exceptional circumstances where a consultation was not in person. I am just saying that, even in that narrow subset, there might be a reason for another person to be in the room. I am not talking about that specific point; I am trying, in general, to suggest that we should try to lay down some principles but not try to overengineer and cover every possible circumstance.

--- Later in debate ---
Lord Hamilton of Epsom (Con)

My Lords, I have supported AI for as long as I can remember, and I think it is the future for this country. If we are looking for improvements in productivity, there is no doubt that we should look to the National Health Service and the public sector, where we can see AI having its greatest effect and improving the health of the economy of this country.

However, we are in the early days of AI, although it has been with us for some time, and we must be very careful not to rely on it for too many things which should be done by human beings. The noble Lord, Lord Stevens, has already referred to the appalling rate of misdiagnosis. We can look at these statistics and say, “Well, it is only a small number who are misdiagnosed”. Yes, but my noble friend Lord Polak was misdiagnosed as having only six months to live, and he is still with us 32 years later. We must think about this, because if we get misdiagnosis badly wrong, it undermines the basis of this Bill. Therefore, we must be very careful that AI does not contribute to that as well.

I pay tribute to the right reverend Prelate. AI is having a tremendous effect in the health service and helping a large number of people to get better, and it may well be that AI introduces cures for people who are being written off by their doctors—perhaps wrongly. We must not dismiss AI, but we must be very wary about where it leads us. There will be an awful lot of bumps in the road before AI is something in which we can all have complete confidence and believe will deliver better outcomes than human beings.

Lord McCrea of Magherafelt and Cookstown (DUP)

My Lords, there are just a few remarks I would like to make. We live in an age when it is hard to find a human to interact with any more. We lift the phone and speak to a voice that says, if you want one thing, press 1, and if you want something else, press 2. I fear that this is what we are heading for: if you want death, just press a button.

I have no doubt that if this legislation is passed as it is, in the near future we will be heading towards AI assessment procedures. My concern is not where we start in this process, but where it leads to and where it ends.

I am informed that, in the Netherlands, it has been proposed to use AI to kill patients in cases where doctors are unwilling to participate. Indeed, it is suggested that AI could be less prone to human error. Surely, in crucial assessments and decision-making processes for a person seeking assisted suicide, AI could not identify subtle coercion and assess nuanced capacity, bearing in mind the irreversible nature of the outcome. There are concerns about the risk of coercion or encouragement by AI. It should be noted that, with the newly developed AI functions and chatbots, there are already cases globally of individuals being coerced into all sorts of different behaviours, practices and decision-making.

Clause 5 allows doctors to direct the person

“to where they can obtain information and have the preliminary discussion”.

That source of information could be AI or a chatbot. Is there anything in the Bill that will prevent this?

AI undermines accountability. If the technology fails, who bears responsibility? Traditionally in the health service, the doctor bears responsibility. If AI is used, who bears responsibility?

Baroness Lawlor (Con)

My Lords, to add to what has been said, AI is based on large language models, which are trained on big datasets. I ask your Lordships to consider whether such large datasets, used to assess a snippet of data to assist diagnosis, are a good way of assessing individual patients. They were not designed to assess individual patients. Every doctor will tell you that each individual case is different and that diagnoses can vary. I am very grateful to the noble Lord, Lord Stevens, for sharing the results of the 98,000 cases that were assessed for accuracy. Given those results, I am not sure that AI is a suitable tool to assess and diagnose individual cases.