Lord Clement-Jones debates involving the Ministry of Defence

Artificial Intelligence in Weapon Systems Committee Report

Friday 19th April 2024

Lords Chamber
Lord Clement-Jones (LD)

My Lords, I add to the congratulations to the noble Lord, Lord Lisvane, on his excellent chairing of the committee and his outstanding introduction today. I thank the staff and advisers of the committee, who made an outstanding contribution to the report. It has been a pleasure hearing the contributions today. I add my thanks to the military who hosted us at the Permanent Joint Headquarters, at Northwood, where we learned a huge amount as part of the inquiry.

Autonomous weapon systems present some of the most emotive and high-risk challenges posed by AI. We have heard a very interesting rehearsal of some of the issues surrounding use and possible benefits, but particularly the risks. I believe that the increasing use of drones in particular, potentially autonomously, in conflicts such as Libya, Syria and Ukraine and now by Iran and Israel, together with AI targeting systems such as Lavender, highlights the urgency of addressing the governance of weapon systems.

The implications of autonomous weapons systems—AWS—are far-reaching. There are serious risks to consider, such as escalation and proliferation of conflict, lack of accountability for actions, and cybersecurity vulnerabilities. The noble Baroness, Lady Hodgson, emphasised the negatives—the lack of empathy and kindness that humans are capable of in making military decisions. I thought it was interesting that the noble Earl, Lord Attlee, in a sense argued against himself, at the beginning of his contribution, on the kinds of actions that an AI might take which a human would not. There were also issues mentioned by the noble Lord, Lord St John, about misinformation and disinformation, which are a new kind of warfare.

Professor Stuart Russell, in his Reith lecture on this subject in 2021, painted a stark picture of the risks posed by scalable autonomous weapons capable of destruction on a mass scale. This chilling scenario underlines the urgency with which we must approach the regulation of AWS. The UK military sees AI as a priority for the future, with plans to integrate “boots and bots”, to quote a senior military officer.

The UK integrated review of 2021 made lofty commitments to ethical AI development. Despite this and the near global consensus on the need to regulate AWS, the UK has not yet endorsed limitations on their use. The UK’s defence AI strategy and its associated policy statement, Ambitious, Safe, Responsible, acknowledged the line that should not be crossed regarding machines making combat decisions but lacked detail on where this line is drawn, raising ethical, legal and indeed moral concerns.

As we explored this complex landscape as a committee—as the noble and gallant Lord, Lord Houghton, said, it was quite a journey for many of us—we found that, while the term AWS is frequently used, its definition is elusive. The inconsistency in how we define and understand AWS has significant implications for the development and governance of these technologies. However, the committee demonstrated that a working definition is possible, distinguishing between fully and partially autonomous systems. This is clearly still resisted by the Government, as their response has shown.

The current lack of definition allows for the assertion that the UK neither possesses nor intends to develop fully autonomous systems, but the deployment of autonomous systems raises questions about accountability, especially in relation to international humanitarian law. The Government emphasise the sufficiency of existing international humanitarian law while a human element in weapon deployment is retained. The Government have consistently stated that UK forces do not use systems that deploy lethal force without human involvement, and I welcome that.

Despite the UK’s reluctance to limit AWS, the UN and other states advocate for specific regulation. The UN Secretary-General, António Guterres, has called autonomous weapons with life-and-death decision-making powers “politically unacceptable, morally repugnant” and deserving of prohibition, yet an international agreement on limitation remains elusive.

In our view, the rapid development and deployment of AWS necessitates regulatory frameworks that address the myriad challenges posed by these technologies. I was extremely interested to hear the views of the noble Lord, Lord Stevens, and others during the debate on the relationship between our own military and the private sector. That makes it even more important that we address the challenges posed by these technologies and ensure compliance with international law to maintain ethical standards and human oversight. I share the optimism of the noble Lord, Lord Holmes, that this is both possible and necessary.

Human rights organisations have urged the UK to lead in establishing new international law on autonomous weapon systems to address the current deadlock in conventional weapons conventions, and we should do so. There is a clear need for the UK to play an active role in shaping the nature of future military engagement.

A historic moment arrived last November with the UN’s first resolution on autonomous weapons, affirming the application of international law to these systems and setting the stage for further discussion at the UN General Assembly. The UK showed support for the UN resolution that begins consultations on these systems, which I very much welcome. The Government have also committed to explicitly ensuring human control at all stages of an AWS’s life cycle. It is essential to have human control over the deployment of the system, to ensure both human moral agency and compliance with international humanitarian law.

However, the Government still have a number of questions to answer. Will they respond positively to the call by the UN Secretary-General and the International Committee of the Red Cross that a legally binding instrument be negotiated by states by 2026? How do the Government intend to engage at the Austrian Government’s conference “Humanity at the Crossroads”, which is taking place in Vienna at the end of this month? What is the Government’s assessment of the implications of the use of AI targeting systems under international humanitarian law? Can the Government clarify how new international law on AWS would be a threat to our defence interests? What factors are preventing the Government adopting a definition of AWS, as the noble Lord, Lord Lisvane, asked? What steps are being taken to ensure meaningful human involvement throughout the life cycle of AI-enabled military systems? Finally, will the Government continue discussions at the Convention on Certain Conventional Weapons, and continue to build a common understanding of autonomous weapon systems and elements of the constraints that should be placed on them?

As the noble Lord, Lord Lisvane, started off by saying, the committee rightly warns that time is short for us to tackle the issues surrounding AWS. I hope the Government will pay close and urgent attention to its recommendations.

Armed Forces Bill

Lord Craig of Radley (CB)

My Lords, the noble Lord, Lord Browne of Ladyton, has given us a very thoughtful, well-researched and deeply troubling series of remarks about the future in this area. I wanted to concentrate on a rather narrower point. Those who are ordered to fight for the interests of this country must do so—now and in the future, as more novel technologies find their way into kinetic operations—in the certain knowledge that their participation, and the way in which they participate, is lawful in both national and international jurisdictions. As has become evident in some of the asymmetric operations of recent years, post-conflict legal challenges do arise, and in future operations they may prove impossible to clear up quickly and comprehensively unless we have thought deeply about these issues in advance.

Risking one’s life is a big ask, but to combine it with a risk of tortuous and protracted legal aftermath is totally unacceptable. I support the simple thrust of the amendment to demonstrate that the Government indeed have this matter under active review, as one must expect them to. It is infinitely better that the answers to these issues are there before a further operation has to be waged, not after it is over, when issues that should have been foreseen and dealt with press on individuals and others in our Armed Forces. Should the protection of combat immunity not be brought into the frame of discussion and resolution of this seriously troublesome issue?

Lord Clement-Jones (LD)

My Lords, it is a great pleasure to follow the noble Lord, Lord Browne of Ladyton, and the noble and gallant Lord, Lord Craig, in supporting Amendment 29, which the noble Lord introduced so persuasively, as he did with a similar amendment to the overseas operations Bill, which I signed, and in Grand Committee on this Bill—I apologise for being unable to support him then. Since we are on Report, I will be brief, especially given the hour. Of course I do not need to explain to the Minister my continuing interest in this area.

We eagerly await the defence AI strategy coming down the track but, as the noble Lord said, the very real fear is that autonomous weapons will undermine the international laws of war, and the noble and gallant Lord made clear the dangers of that. In consequence, a great number of questions arise about liability and accountability, particularly in criminal law. Such questions are important enough in civil society, and we have an AI governance White Paper coming down the track, but in military operations it will be crucial that they are answered.

From the recent exchange that the Minister had with the House on 1 November during an Oral Question that I asked about the Government’s position on the control of lethal autonomous weapons, I believe that the amendment is required more than ever. The Minister, having said:

“The UK and our partners are unconvinced by the calls for a further binding instrument”

to limit lethal autonomous weapons, said further:

“At this time, the UK believes that it is actually more important to understand the characteristics of systems with autonomy that would or would not enable them to be used in compliance with”

international humanitarian law,

“using this to set our potential norms of use and positive obligations.”

That seems to me to be a direct invitation to pass this amendment. Any review of this kind should be conducted in the light of day, as we suggest in the amendment, in a fully accountable manner.

Autonomous Weapons Systems

Monday 1st November 2021

Lords Chamber
Asked by
Lord Clement-Jones

To ask Her Majesty’s Government what assessment they have made of the calls made at the August meeting of the Group of Governmental Experts on Lethal Autonomous Weapons Systems at the Convention on Certain Conventional Weapons for a legally-binding instrument, including both prohibitions and positive obligations, to regulate autonomous weapons systems.

The Minister of State, Ministry of Defence (Baroness Goldie) (Con)

My Lords, the UK is an active participant in United Nations discussions on lethal autonomous weapons systems, working with partners to build norms to ensure safe and responsible use of autonomy. The UK and our partners are unconvinced by the calls for a further binding instrument. International humanitarian law provides a robust principle-based framework for the regulation of weapons deployment and use. A focus on effects is most effective in dealing with complex systems in conflict.

Lord Clement-Jones (LD)

My Lords, the Minister’s reply is pretty disappointing. It puts the Government, despite statements in the integrated review, at odds with nearly 70 countries and thousands of scientists in their unwillingness to rule out lethal autonomous weapons. Will the Minister commit to rethinking government policy in terms of giving our representatives at the next meeting of the Convention on Certain Conventional Weapons on 2 December a mandate to go ahead with negotiations for a legally binding instrument, which, after all, has been called for by the UN Secretary-General?

Baroness Goldie (Con)

I am sorry that the noble Lord is disappointed, because I know the extent of his interest in this issue. I have tried to facilitate engagement with the department to enable him to better understand what the department is doing and why we take the views that we do. He will be aware that international consensus on a definition of LAWS has so far proved impossible. At this time, the UK believes that it is actually more important to understand the characteristics of systems with autonomy that would or would not enable them to be used in compliance with IHL, using this to set our potential norms of use and positive obligations.

No legislation designed to deliver on an overall policy intention to reassure our service personnel in the event that they are deployed overseas can deliver on that intention in this part of the 21st century without engaging the issues which this amendment addresses. Without this or a similar amendment, I fear that this legislation will be out of date as soon as it receives Royal Assent. I beg to move.

Lord Clement-Jones (LD) [V]

My Lords, it is a pleasure to follow the noble Lord, Lord Browne of Ladyton, in supporting his Amendment 32, which he introduced so persuasively and expertly. A few years ago, I chaired the House of Lords Select Committee on AI, which considered the economic, ethical and social implications of advances in artificial intelligence. In our report published in April 2018, entitled AI in the UK: Ready, Willing and Able?, we addressed the issue of military use of AI and stated:

“Perhaps the most emotive and high-stakes area of AI development today is its use for military purposes”,

recommending that this area merited a “full inquiry” on its own. As the noble Lord, Lord Browne of Ladyton, made plain, regrettably, it seems not yet to have attracted such an inquiry or even any serious examination. I am therefore extremely grateful to the noble Lord for creating the opportunity to follow up on some of the issues we raised in connection with the deployment of AI and some of the challenges we outlined. It is also a privilege to be a co-signatory with the noble and gallant Lord, Lord Houghton, who too has thought so carefully about issues involving the human interface with technology.

The broad context, as the noble Lord, Lord Browne, has said, is the unknowns and uncertainties in policy, legal and regulatory terms that new technology in military use can generate. His concerns about the complications and personal liabilities to which such technology exposes deployed forces are widely shared by those who understand its capabilities. That is all the more so in a multilateral context where other countries may be using technologies that we would not deploy ourselves, or whose use could create potential vulnerabilities for our troops.

Looking back to our report, one of the things that concerned us more than anything else was the grey area surrounding the definition of lethal autonomous weapon systems—LAWS. As the noble Lord, Lord Browne, set out, when the committee explored the issue, we discovered that the UK’s then definition, which included the phrase

“An autonomous system is capable of understanding higher-level intent and direction”,

was clearly out of step with the definitions used by most other Governments and imposed a much higher threshold on what might be considered autonomous. This allowed the Government to say:

“the UK does not possess fully autonomous weapon systems and has no intention of developing them. Such systems are not yet in existence and are not likely to be for many years, if at all.”

Our committee concluded that, in practice,

“this lack of semantic clarity could lead the UK towards an ill-considered drift into increasingly autonomous weaponry.”

This was particularly in light of the fact that, at the UN Convention on Certain Conventional Weapons Group of Governmental Experts in 2017, the UK opposed the proposed international ban on the development and use of autonomous weapons. We therefore recommended that the UK’s definition of autonomous weapons should be realigned to be the same as, or similar to, that used by the rest of the world. The Government, in their response to the committee’s report in June 2018, replied:

“The Ministry of Defence has no plans to change the definition of an autonomous system.”

They did say, however,

“The UK will continue to actively participate in future GGE meetings, trying to reach agreement at the earliest possible stage.”

Later, thanks to the Liaison Committee, we were able on two occasions last year to follow up on progress in this area. On the first occasion, in reply to the Liaison Committee letter of last January which asked,

“What discussions have the Government had with international partners about the definition of an autonomous weapons system, and what representations have they received about the issues presented with their current definition?”

the Government replied:

“There is no international agreement on the definition or characteristics of autonomous weapons systems. Her Majesty’s Government has received some representations on this subject from Parliamentarians”.

They went on to say:

“The GGE is yet to achieve consensus on an internationally accepted definition and there is therefore no common standard against which to align. As such, the UK does not intend to change its definition.”

So, no change there until later in the year, in December 2020, when the Prime Minister announced the creation of the Autonomy Development Centre to,

“accelerate the research, development, testing, integration and deployment of world-leading AI,”

and the development of autonomous systems.

In our follow-up report, AI in the UK: No Room for Complacency, which was published in the same month, we concluded:

“We believe that the work of the Autonomy Development Centre will be inhibited by the failure to align the UK’s definition of autonomous weapons with international partners: doing so must be a first priority for the Centre once established.”

The response to this last month was a complete about-turn by the Government, who said:

“We agree that the UK must be able to participate in international debates on autonomous weapons, taking an active role as moral and ethical leader on the global stage, and we further agree the importance of ensuring that official definitions do not undermine our arguments or diverge from our allies.”

They go on to say:

“the MOD has subscribed to a number of definitions of autonomous systems, principally to distinguish them from unmanned or automated systems, and not specifically as the foundation for an ethical framework. On this aspect, we are aligned with our key allies. Most recently, the UK accepted NATO’s latest definitions of ‘autonomous’ and ‘autonomy’, which are now in working use within the Alliance. The Committee should note that these definitions refer to broad categories of autonomous systems, and not specifically to LAWS. To assist the Committee we have provided a table setting out UK and some international definitions of key terms.”