Terminally Ill Adults (End of Life) Bill Debate
Baroness Coffey (Conservative, Life peer)
Lords Chamber
I am grateful to the noble Lords, Lord Harper and Lord Empey, for signing this amendment. I will also speak to my Amendment 65 and consider an amendment put forward by the shadow Front Bench.
It is disappointing that the Justice Minister is not in her place on the Front Bench today, because in discussing these amendments I want to consider the important issue of assessing whether someone has capacity. The MoJ is responsible for that, and for several of the other matters I wish to speak on. We are only at Clause 1(3), but this is a key element to consider carefully: where do all these steps have to happen, and do they have to happen face to face?
As currently drafted, the Bill suggests that only the initial request for assistance, the first declaration, the first and second doctors’ assessments and the second declaration have to happen while the person making the requests is in the country. Nothing else in the Bill—including the preliminary discussions and the act itself—has to happen in this country; the person does not have to be here.
I think I have made clear, in the series of amendments I have brought to the Committee, that my concern is that this should not become something decided only on paper. There should be real interaction, and I am trying to understand how the Bill will work in practice. That is why I have asked a series of questions on whether the terminally ill person making the request has to be in the country. We should also get into other aspects, such as whether the panel has to be here.
We had a debate earlier in Committee, during which I made a clumsy attempt to make sure people had to be in this country. As I said, you can be ordinarily resident in more than one country at the same time. I want to continue to focus on this being a person-based process—I do not like using the term “patient-based process”, as I do not consider this to be a health treatment—and a lot of that is about where somebody is and whether there is a face-to-face link.
Recently, in a different policy area, the Government have rightly sought to accelerate and substantially increase the number of face-to-face assessments of eligibility for sickness benefits. A lot of that changed during Covid because, frankly, it was not practical to undertake the process; it has gradually been brought back and needs to be accelerated. The thinking behind that policy is critical to the application of this Bill.
I have interpreted Clause 1(3)(b) as meaning that only
“steps under sections 10 and 11”
have to be done by persons in England and Wales, and that is the initial assessment. I put it to the Committee that a lot more of this should be done face to face. As the late Sir James Munby pointed out, it is absolutely right that the panel should be considering this process and looking into it. I am conscious that there will be medics here who have perhaps an even greater understanding than I of how the various assessments should be done face to face. What happens when people are making a declaration? Are we sure that somebody is not in the room, giving them the eyes so that they give the right answers? How are we to understand whether coercion can happen or not?
In a documentary made by ITV, the Bill’s promoter, Kim Leadbeater, expressed concerns about what happens in Oregon, where a lot of this is done by video link. I believe she was uncomfortable and would consider adding an amendment to make it clear that consultations with doctors could not be done by video call and should be done in person. That has not been done so far, and no explanation has been given. That is why I have tabled these amendments. They would be a very important way of making sure there are safeguards so that, as we go through this process, which is novel to us—I appreciate it is not novel to the world—we have every confidence that the safeguards which people are concerned about are going to be appropriately applied.
Last week, a discussion on a group brought forward by the noble Lord, Lord Birt, gave us a picture of how this could look. Indeed, the amendments tabled by the noble and learned Lord, Lord Falconer, have started to touch on aspects of this, such as how a commission can happen. But I can see that, very quickly, especially bearing in mind some of the amendments last week—though I appreciate that the noble and learned Lord did not accept them—a panel could be meeting every day. Right now, it could involve somebody on holiday in Tenerife and somebody else elsewhere meeting on Zoom or Teams or whatever. That could quickly become a routine tick-box exercise. That is the very reason the late Sir James Munby pointed out that this should not be given to judges—what is the point of having a judge if it will be just a tick-box exercise? We need to be careful that we do not end up in that situation.
In Amendment 65, I have set out specifically what needs to be done face to face: the preliminary discussion, the request—as in the Bill—and the witnessing. The witness should be there, face to face. That seems sensible. The first and second assessments are already there, but I think we could go further. What about the interaction with the independent advocate? Is that going to be done down the phone? These are the serious things we need to consider. Should the panel meet face to face with the person applying? I appreciate that, in Amendment 320A, the assumption is that it should be face to face, but perhaps with exceptions by video link. When I first started observing this in the other place, I thought that this would happen. What seems to have evolved is that a lot of this will be done remotely. The only thing I have not included in Amendment 65 is the doctor actually being there and the assessment happening in this country—although that is not specified in the Bill. Clause 25(3) says that the co-ordinating doctor has to be there in person, although under the following clause that can all be delegated to somebody else.
I do not want to labour the point, but I hope noble Lords will give some thought to how they want to see the Bill work in practice. It may be that people are happy for this all to happen via video and are wondering why we are getting in the way, given that this is about autonomy. However, these are sensible amendments to consider to make sure that—while no Minister has yet said this is a safe Bill—it is as safe as possible. We need to look at its operation. It is certainly the case in other parts of the health system that a lot of this would not be acceptable and would have to be done face to face. This is not a case of overengineering the Bill or leaving it to regulations. We should be clear in Parliament that this is what we are going to do.
I am conscious of Amendment 320A, and I appreciate that my noble friend Lord Evans of Rainow, in Amendment 376, has particularly singled out “in person” for parts of Clause 12. I get that some people may be so terminally ill that perhaps a video link might be used, but that should be exceptional, if we are going to go down that route at all. I look forward to hearing my noble friend explain why that is the case and how it can be administered. With that, I beg to move.
My Lords, I have put my name to some of these amendments. In the spirit of what the Chief Whip said, I will not repeat what the noble Baroness, Lady Coffey, said, but I gently remind the Committee that this Private Member’s Bill is not normal, in so far as most Private Member’s Bills are five, six, seven or eight pages. This one is 51 pages, with 59 clauses. It is a very different animal from what we are used to.
I think the amendments in this group have been tabled because, in many respects, this aspect of the process is deeply disturbing. We are talking about life and death here; we are talking about making assessments of a person who is making an application for an assisted death. Noble Lords will be aware that, on 29 October, in the Select Committee, Professor Martin J Vernon, chair of the British Geriatrics Society’s ethics and law special interest group, said:
“Assessing somebody remotely, digitally, without a face-to-face assessment, particularly if they have complex health and social care needs, is nigh-on impossible”.
I would have thought that, to assess somebody’s state of mind and to have any sense of whether they are being coerced, one of the most obvious things is to see them in front of you and get a feel for it. How can a psychiatrist judge this remotely?
The other point I would make is about the practicalities. Depending on where someone is in this country, they may or may not have the equipment or the capacity to use it; signals drop off. Inevitably, if somebody is in a frail and unstable condition, there will have to be other people present to operate this. Does that mean that a team from the hospital would have to go out to some remote location—or, even worse, are we doing stuff on the phone? Can you imagine how people would react? “Dial-a-death” would be the sort of way that people would describe it.
That is a very valid point in relation to this particular amendment. The reason I think some sort of regulatory process from the Secretary of State—a code of practice or something similar—is better is that you can give much more detail and many more examples. You should not be relying on just a particular two-word legal test.
My Lords, this has been quite a revealing debate in many ways. In trying to go with the spirit of getting on with the Bill, I could easily have broken this group into separate debates on what should happen physically in this country and what should be done face to face. However, I thought that bringing them together could be of benefit to your Lordships, and it has been.
One thing that has come up, and which the noble and learned Lord has recognised, is that the concept of face to face being largely the default has been well received. There have been a few other issues, though. In her contribution, the noble Baroness, Lady Pidgeon, gave a series of examples where, as I have been informed by barristers, clinicians would be required to conduct home visits.
I was particularly struck by several speeches: I am not going to repeat them all. The noble Baroness, Lady Keeley, spoke about something as straightforward as a will, and certainly the legal protections are there.
Going forward to Report, I am clear in my belief that a lot of the operation and activities of these panels should happen in this country, rather than the psychiatrist, the KC or whoever being abroad in Tenerife—never mind anybody else. It is not sufficient to rely simply on statutory guidance. I gave the example last week when we talked about Montgomery: as was referred to, the GMC changed its guidance only five years after the legal ruling; it did not happen straight away. The other thing about statutory guidance is that it does not have to come to Parliament; it is simply what the Minister can put out. For me, there are deficiencies in that approach, although I understand the flexibility. The whole point is that—
Whether it has to come to Parliament depends on the provisions, does it not?
As it stands, there seems to be variation across the Bill on whether or not there is that 40-day pre-laying requirement; it just seems to vary. Standard legislation would not require it unless Parliament or the Government inserted it specifically into the Bill.
Overall, there is still a lot to be discussed. I would like to seek a meeting with the Minister on the response that Article 8, linked to Article 14, is somehow engaged, and on how Amendments 60 and 65 in particular would not be operable in that regard. With that, I beg leave to withdraw Amendment 60.
My Lords, I am conscious that, in Amendment 66, I might be accused of preferring quill and pen to the latest technology. Recognising how artificial intelligence is emerging, I thought I would put down a blunt amendment to allow us at least to have a debate. Inevitably, in a variety of legal and health situations, we will start to see artificial intelligence being used routinely. There was a recent legal ruling in which it turned out that a judge had relied entirely on AI and had given a completely inaccurate ruling based on it. This is not simply about what would be considered by medical practitioners.
I worry about judgment. We have already heard, reasonably, that trying to predict when somebody will pass away from a terminal illness involves a bit of science but is largely an art—perhaps I am being ungenerous in that regard. Certainly, in the DWP, we moved accelerated access to benefits from a six-month to a 12-month consideration simply because the NHS does not routinely require its practitioners to assess to six months; they are much more accurate at assessing 12 months. It is interesting that this Bill is focused on six months when, routinely, the NHS does not use that period. However, I am digressing slightly from the point of artificial intelligence.
I was somewhat interested in the previous debate, because there seemed to be a majority—I will not say a consensus—who felt that face to face was an important part of this happening in practice. But there are still a significant number of people who seem happy that we use a variety of technology for some of the interactions.
Forgive me for fast forwarding, but I see this whole issue becoming pretty routine. What I want to avoid is outsourcing. It strikes me how much people rely on Wikipedia and think that they are actually dealing with the Encyclopaedia Britannica, even though a lot of what is on Wikipedia is a complete load of garbage. What is even more worrying is that many of the AI mechanisms use sources such as Wikipedia, or simply put two and two together and come up with 22. I saw this, not that long ago, when I was trying to find something from when I had been on the Treasury Committee and interrogated the FCA about something. The first thing that came out of ChatGPT was that, somehow, I had become a non-executive director of the FCA—if only. That certainly was not the case. I am concerned that an overreliance on AI might start to happen in this regard.
I want to avoid a world of chatbots that removes the human element. That is why I keep coming back to the themes of being face to face, being in this country and this having a personal element. I am conscious that the NHS and other practitioners, including legal practitioners, will continue to evolve—I am not stuck in some dinosaur age—but I feel that the concerns of those of us worried about the Bill will continue. We completely understand why people might want to do this, but we want to make sure that the safeguards, particularly around coercion, are as strong as possible. That is why I have raised for debate the consideration of whether, as a matter of principle, artificial intelligence should not be used in the deployment of the future Act.
As I said, there may be evolution in medicine; we can see that it is already happening. I do not know to what extent the Government have confidence in the use of AI in the prognosis of lifespans. A new evolution in government is that AI is now starting to handle consultations. That might get tested in court at some point, to see whether it is a meaningful way to handle consultations—it is certainly a cost-efficient way to do so. My point is that, under the Wednesbury rule, there is supposed to be proper consultation, not just a tick-box exercise.
I will not dwell on this, but I would be very interested to hear, from not only the sponsor but the Government, their consideration of artificial intelligence in relation to the practicality and operability of the Bill if it were to become law. I beg to move.
My Lords, I have put my name to Amendment 66, in the name of the noble Baroness, Lady Coffey. At present, the Bill places no restriction on the use of non-human assessment and automated administration devices during the application and decision-making process for assisted death. Obviously, AI will be used for recording meetings and the like—I am not a quill and paper person to that extent—but AI has already been proposed for use in killing patients in the Netherlands, where doctors are unwilling to participate.
The Data (Use and Access) Act 2025 established a new regulatory architecture for automated decision-making and data interoperability in the NHS. It provides that meaningful human involvement must be maintained for significant decisions—decisions which may affect legal status, rights or health outcomes. Of course, assisted death would come within that definition.
That reflects the purpose of the NHS. We have talked about its constitution. I looked at the constitution and the guidance. It says that the purpose of the NHS is
“to improve our health and wellbeing, supporting us to keep mentally and physically well, to get better when we are ill and, when we cannot fully recover, to stay as well as we can to the end of our lives”.
I know that the noble and learned Lord, Lord Falconer, is going to put down an amendment suggesting that the constitution and guidance will have to be amended, but the current situation is that that is the purpose of the NHS. The assisted suicide of patients is certainly not provided for in the NHS, nor should AI be used in the crucial assessment and decision-making process for assisted dying, given the extreme difficulties in identifying coercion and assessing nuanced capacity, and the irreversible nature of death. What plans does the noble and learned Lord have to address these issues?
In the Commons, an amendment was passed allowing the Secretary of State to regulate devices for self-administration. It was agreed without a Division; in fact, only seven Divisions were permitted by the Speaker on the more than 80 non-Leadbeater amendments. The Commons have therefore accepted that devices will be used for self-administration. Of course, the assisted suicide Bill requires self-administration. Nothing in the Bill prohibits a device that uses AI to verify identity or capacity at the final moment. If a machine makes the final go/no-go decision based on an eye blink or a voice command, have we not outsourced the most lethal decision of a person’s life to technology? I have to ask: is this safe?
Public education campaigns on assisted suicide are explicitly allowed for in Clause 43. The Government have said that there will be an initial education campaign to ensure that health and social care staff are aware of the changes, and that there would likely be a need to provide information to a much wider pool of people, including all professionals who are providing or have recently provided health or social care to the person, as well as family members, friends, unpaid carers, other support organisations and charities. That controls only government activity. The other observation I would make is that I presume the public education campaign will inform families that they have no role in a person’s decision to choose assisted death, and that the first they may know of an assisted death is when they receive the phone call telling them that the person is dead. It is profoundly important that people know this.
There is nothing to prevent an AI chatbot or search algorithm helpfully informing a patient about assisted dying services and prioritising assisted dying over palliative care search results. By legalising this service, the Bill will feed the training data that makes these AIs suggest death as a solution. I would ask the noble and learned Lord, Lord Falconer, how he intends to police that situation.
There is also a risk of algorithmic bias. If prognostic AI is trained on biased datasets—and we know how unreliable prognoses of life expectancy are—it could disproportionately label certain demographics as terminal, subtly influencing the care options, including assisted dying, presented to them. The National Commission into the Regulation of AI in Healthcare, established by the MHRA in 2025, is currently reviewing these risks to ensure that patient safety is at the heart of regulatory innovation. I ask the Minister: will that work cover assisted dying?
The AI Security Institute’s Frontier AI Trends Report in December 2025 highlights that:
“The persuasiveness of AI models is increasing with scale”,
and:
“Targeted post-training can increase persuasive capabilities further”.
In a healthcare context, this raises the risk of automated coercion, where a person interacting with a chatbot or an AI voice agent might be subtly persuaded towards certain end-of-life choices. The AISI has said that safeguards will not prevent all AI misuse. We have to remember that there will be financial incentives to provide assisted suicide; after all, the CEO of Marie Stopes received between £490,000 and £499,000 in 2024. There is big money here, even though this will be charitable or NHS work. Clause 5 allows doctors to direct the person to where they can obtain information and have the preliminary discussion. As things stand, that information could come from an AI or a chatbot.
Dr Sarah Hughes, giving evidence to the Lords Select Committee, said there was a real risk of “online coercion”. With newly developed AI functions and chatbots, we already know there are cases all around the world of individuals being coerced into all sorts of different behaviours, practices and decision-making. There is also an issue of misinformation around diagnosis and prognosis. Hannah van Kolfschooten questioned who has ultimate responsibility if the technology fails. She said:
“In traditional euthanasia settings, a doctor is accountable, but in AI-driven scenarios, accountability could become ambiguous, potentially resting between manufacturers, healthcare providers, and even the patient”.
AIs also have a record of encouraging suicide. We know that, and we have seen terrible cases among young people; these systems have no regard for human life.
Evidence shows that doctors suspect only 5% of elder abuse cases. Detecting subtle coercion requires, as was said in the previous group, professional judgment to interpret things such as non-verbal cues, body language and discomfort. AI systems are ill-equipped to handle these nuanced, non-quantifiable elements. It is imperative for trust in the system that the individual circumstances of each request for assisted death are recorded and are available for interrogation, or even potentially a criminal investigation, by the panel or another regulatory authority. The only insight into what happened in the consulting room will come from these records; the patient will be dead. The Bill as drafted provides no protection for an individual in these circumstances against the use of AI, with its algorithmic bias. Can the noble and learned Lord, Lord Falconer, explain how he proposes to deal with these concerns?
My Lords, it has been an interesting debate. I was struck by the question from the noble Baroness, Lady Finlay of Llandaff, about fake voices. That is an interesting thing for us to consider.
Yes, my amendment was quite blunt, but we once had a briefing—sorry, it sounds like I am going to name-drop—at the Cabinet table; we had a whole session in which we were told that artificial intelligence had learned to lie. A classic example is “I am not a robot”: artificial intelligence will, in effect, just press the element that says, “I’ve got accessibility issues”, and if you do that, you do not have to do any more verification. There is a whole series of issues here. The noble Lord, Lord McCrea, referred to an article. It is worth reading AI and Ethics Volume 5 from last year.
I wondered whether the Minister might mention—I had hoped that she would—the MHRA commission, started last year, which is specifically considering artificial intelligence. If she has not already, I think it would be worthwhile sending an instruction to the MHRA to start considering this aspect.
Ultimately, although I appreciate that there are noble Lords who do not want us to talk about the detail of how this might work, I think it is critical that we as parliamentarians set out, for any future guidance, statutory instruments or legal rulings, what we expect and how we expect this Act to be deployed.
There is no doubt that algorithmic bias is a concern. That is why, although I do not entirely agree with the noble and learned Lord, we may well revisit an amendment referring to AI on Report. I thank noble Lords for this debate and beg leave to withdraw the amendment.
My Lords, I put in my opposition generally. As we said on the first day of debate, it is certainly not my intention to move a Division on this. We will leave these things to Report.
We have debated Clause 1 extensively but, as I said in my explanatory statement, I wanted the opportunity to potentially revisit certain issues, or to try to get some more answers on them if we felt they had not been covered. I know that on day one noble Lords thought I was filibustering by talking about Wales, but, for me, devolution is a really important part of the Bill, and I have to say that I was not satisfied by the Minister’s answers at the time.
Increasingly, pretty much every freedom of information request has been rejected, and I do not think that helps Parliament. The Cabinet Office tells us that the Government are forming policy on this issue in anticipation, but that it is not in the public interest to share it. I find that really challenging. It is no surprise that we have spent so much time on Clause 1; it is the key clause, and let us not pretend otherwise. However, even the briefing pack for an official who gave oral evidence to your Lordships’ Select Committee—I did not put that FoI request in, but someone did—was turned down for release. That decision was made personally by the Justice Minister, again on the grounds that it was not in the public interest and that “we are only going to work with the sponsor”. I genuinely think we would make more progress if we had better understanding and more shared information as we consider one of the most significant changes to the law that we have seen. However, I will not be pressing this to a Division.
My Lords, I agree with Clause 1. If there were a vote on it, I would vote for it. I may not carry the Committee with me but, having sat through Committee and one day of Second Reading—I could not attend the first day—I feel we have reached the point where, if the Bill were a tree, we have dealt with the trunk. We have now got to the point where we look at the branches.
The Commons—this is a message to the Commons, in a way—needs to know that we can move at pace once we have the Bill sponsor’s reaction and proposals following the debate on Clause 1. I am absolutely convinced that control of the progress and speed of the Bill is now completely in the hands of my noble and learned friend, not in those of the Committee.