Terminally Ill Adults (End of Life) Bill Debate

Department: Department of Health and Social Care

Friday 30th January 2026

Lords Chamber
Baroness Berger (Lab)

My Lords, very briefly, I support the amendment that is calling for face-to-face consultations to take place, rather than only in exceptional cases. I want to reflect on why this matters. We know from other jurisdictions that many of these assessments are being done online. It is a really important question for us to consider whether we would want that in our country—and if not, it should be in the Bill. In addition, my comments are informed by the evidence that we were presented with in the Select Committee and drawn from my experience of meeting a number of elderly constituents over the course of nearly a decade as a Member of Parliament.

I reflect particularly on the women I met in their 70s, 80s and 90s who shared their experiences of domestic abuse. This conversation and these amendments matter because this legislation does not happen in a vacuum. The Labour Government today are rightly concerned with addressing the public health emergency of violence against women and girls in our country and have an important landmark mission and goal of halving violence against women and girls over the next decade. The NHS is playing its part and enhancing its efforts in tackling violence against women and girls, focusing particularly on early identification. There is a lot of other very important work going on via training and investment, and I commend the work of many colleagues who are dealing with this on a daily basis. It was the experts who told us that, to identify coercion, undue influence and pressure, doctors and other professionals need to look at someone’s body language. It is not just the words we say, how we say them, the volume or the tone—it is our non-verbal cues and what our body says. It is what we do not say that often shares an important message.

I listened very carefully to the counterchallenges of noble Lords so far. I do not think there is anything to stop the Bill from stipulating that, in exceptional circumstances, the doctors, independent advocates or panel members can visit an individual. But I would much rather that we had legislation that supports the Government’s important aim to reduce violence against women and girls than something that will exacerbate the very serious problem that we know too many women in our country face, particularly at their most vulnerable moments, which include the end of life.

Lord Hamilton of Epsom (Con)

Can the noble Baroness recall that last week she told the House that 23% of six-months-to-live diagnoses turned out to be wrong and that people lived longer? Does that not make the whole position of face-to-face diagnosis much more important when doctors so often get it wrong?

Baroness Grey-Thompson (CB)

My Lords, technology has gone a long way to helping disabled people to lead inclusive and integrated lives in British society, and I generally support the use of it. But for many of us who worked on the coronavirus legislation, where we had to make very quick decisions, the speed with which we went online made it seem as if, as a society, we had moved decades forward from having to meet only in person. Even your Lordships’ Chamber managed to meet and vote online. But that comes with a set of challenges.

We have to look at what happened during Covid and the huge increase in domestic abuse. Part of it was that we could not ask people to turn their cameras on, as it was deemed that would be upsetting, so we could not see if somebody had been domestically abused. Part of the increase in recorded domestic abuse was also because more of it was being reported. Even the technology that we have in your Lordships’ Chamber is not foolproof. I was on a call yesterday in my office. The system crashed twice, and the people I was speaking to on Teams did not even realise and carried on talking. We have to think very carefully about how we would use technology.

Age UK said that about 2.4 million older people in this country do not have access to technology and just under 2 million do not have a mobile phone, let alone a smartphone, so, if we are going to do this, we need to think carefully about what other provisions will be in place. I agree with the noble Baroness, Lady Berger; why can the panel not go and visit the individual? I think there is something about being in their own home. The noble Baroness, Lady Pidgeon, raised rural areas. What if people do not have the technology? What will be put in place to ensure that there is a suitable alternative to an online option?

--- Later in debate ---
Baroness Fox of Buckley (Non-Afl)

My Lords, the Government’s 10-year health plan for England seeks to

“make the NHS the most AI-enabled health system in the world”.

Like others, I think that is an incredibly exciting prospect, and I do not want it to be dystopian. The right reverend Prelate the Bishop of Hereford makes an important point in warning us against going completely over the top. It is important that this amendment has been tabled, because it makes us think about the possible problems, which have been well expressed by others. Despite my excitement about what AI might do, even in terms of treatments—there are wonderful possibilities in helping people to walk, in what is happening with the brain, and so on—we do not want to be naive.

The question for the noble and learned Lord, Lord Falconer, is this: as the NHS digitises and doctors become increasingly reliant on AI for notes and diagnostics, given that the diagnosis is so important in a life-or-death situation such as this, how can we ensure that a time-poor doctor does not use AI as an assessment tool or a shortcut? We would be naive to imagine that this does not happen elsewhere; we need only think of politics. People now use AI to avoid doing research in a wide range of instances, and I do not want that translated over.

As for the patients, algorithms are supremely impressive: they can take things that have happened on Facebook or TikTok, or from when you have been on a Teams or Zoom meeting—all sorts of indications—and detect conversations about chronic illness. The algorithms can then push pro-assisted dying content, such as the Switzerland adverts or positive end-of-life options. Interestingly, when we discussed banning social media for under-16s, which I completely disapprove of, or bringing in the Online Safety Act, which I argued against, everybody kept saying, “Algorithms, oh my goodness, they can do all these things”. The concern is not that chatbots are malevolent but that AI tends to agree with people via the algorithms; to quote the title of a piece in Psychology Today, “When Everyone Has a Yes-Man in Their Pocket”. If you say that you are interested in something, they will just say, “Yes, here are your options”. That is something to be concerned about, and it will come up when we discuss advertising.

I finish with the BBC story from August of a Californian couple suing OpenAI over the death of their teenage son. They allege that ChatGPT encouraged him to take his own life, and they have produced the chat logs between Adam, who died last April, and ChatGPT, which show him explaining his suicidal thoughts. They argue that the program validated his most harmful and self-destructive thoughts. AI is a wonderful, man-made solution to many problems, but if we passed a Bill such as this without considering the potential negative outcomes, we would be irresponsible.

Lord Hamilton of Epsom (Con)

My Lords, I have supported AI for as long as I can remember, and I think it is the future for this country. If we are looking for improvements in productivity, there is no doubt that we should look to the National Health Service and the public sector, where we can see AI having its greatest effect and improving the health of the economy of this country.

However, these are early days for AI, although it has been with us for some time. We must be very careful not to rely on it for too many things that should be done by human beings. The noble Lord, Lord Stevens, has already referred to the appalling rate of misdiagnosis. We can look at these statistics and say, “Well, it is only a small number who are misdiagnosed”. Yes, but my noble friend Lord Polack was misdiagnosed as having only six months to live, and he is still with us 32 years later. We must think about this, because if we get the situation with misdiagnosis badly wrong, it undermines the basis of this Bill. Therefore, we must be very careful that AI does not contribute to that as well.

I pay tribute to the right reverend Prelate. AI is having a tremendous effect in the health service and helping a large number of people to get better, and it may well be that AI introduces cures for people who are being written off by their doctors—perhaps wrongly. We must not dismiss AI, but we must be very wary about where it leads us. There will be an awful lot of bumps in the road before AI is something in which we can all have complete confidence and believe will deliver better outcomes than human beings.

Lord McCrea of Magherafelt and Cookstown (DUP)

My Lords, there are just a few remarks I would like to make. We live in an age in which it is hard to find a human to interact with any more. We lift the phone and hear a voice telling us that, if you want one thing, press 1, and if you want something else, press 2. I fear that this is where we are heading: if you want death, just press a button.

I have no doubt that if this legislation is passed as it is, in the near future we will be heading towards AI assessment procedures. My concern is not where we start in this process, but where it leads to and where it ends.

I am informed that, in the Netherlands, it has been proposed to use AI to kill patients in cases where doctors are unwilling to participate. Indeed, it is suggested that AI could be less prone to human error. Surely, in crucial assessments and decision-making processes for a person seeking assisted suicide, AI could not identify subtle coercion and assess nuanced capacity, bearing in mind the irreversible nature of the outcome. There are concerns about the risk of coercion or encouragement by AI. It should be noted that, with the newly developed AI functions and chatbots, there are already cases globally of individuals being coerced into all sorts of different behaviours, practices and decision-making.

Clause 5 allows doctors to direct the person

“to where they can obtain information and have the preliminary discussion”.

That source of information could be AI or a chatbot. Is there anything in the Bill that will prevent this?

AI undermines accountability. If the technology fails, who bears responsibility? Traditionally in the health service, the doctor bears responsibility. If AI is used, who does?