Debates between Baroness Chakrabarti and Lord Houghton of Richmond during the 2019 Parliament

Overseas Operations (Service Personnel and Veterans) Bill

Debate between Baroness Chakrabarti and Lord Houghton of Richmond
Baroness Chakrabarti (Lab) [V]

My Lords, I can only commend my noble friend Lord Browne of Ladyton and the noble Lord, Lord Clement-Jones, on two of the most powerful, if terrifying, contributions to this Bill’s proceedings so far. In particular, I shall be having nightmares about their projections for the potential dissonance between varying international approaches to the definition of autonomous weapons and the way in which their deployment and development matches, or does not match, traditional approaches to humanitarian law.

Regarding the Bill, my noble friend has a very good point. He makes a specific observation that a drone operator in the UK will suffer many of the traumas and risks of a traditional soldier in the field but, on the face of it, is not covered by this legislation at all. I look forward to the Minister’s response to that in particular, but also to the broader questions of risk: not just defensive legal risk to our personnel but ethical and moral risk to all of us. In this area of life, as in every other, the technology moves apace, but the law, politics, transparency, public discourse and even ethics seem to be a few paces behind.

Lord Houghton of Richmond (CB) [V]

My Lords, I am delighted to follow on from the noble Baroness, Lady Chakrabarti, who always seems to be a great source of common sense on complex moral issues. I am similarly delighted to support the amendment in the name of my one-time boss, the noble Lord, Lord Browne of Ladyton. I will not seek to repeat his arguments as to why this amendment is important, but rather to complement his very strong justification with my own specific thoughts and nuances.

I will start with some general comments on the Bill, as this is my only contribution at this stage. At Second Reading I made my own views on this Bill quite clear. I felt that it missed the main issues regarding the challenges of Lawfare. Specifically, I felt that the better route to reducing the problem of vexatious claims was not through resort to legal exceptionalism, but rather rested on a series of more practical measures relating to such things as investigative capacity, quality and speed; better training; improved operational record keeping; more focused leadership, especially in the critical area of command oversight; and a greater duty of care by the chain of command. On this latter, I wholly support the amendment of my noble friend Lord Dannatt.

Having listened to the arguments deployed in Committee, I am struck by the seeming inability of even this sophisticated Chamber to reach a common view as to whether the many provisions of this Bill offer enhanced protections or increased perils for our servicemen and women. This causes me grave concern. How much more likely is it that our servicemen and women, those whose primary desire is to operate within the law, will be confused; and how much more likely is it that our enemies, those who want to exploit the law for mischief, will be encouraged?

I hold to the view that the law, in any formulation, cannot be fashioned into a weapon of decisive advantage in our bid to rid our people of vexatious claims. Rather, the law will increasingly be exploited by our enemies as a vector of attack, both to frustrate our ability to use appropriate force and to find novel ways of accusing our servicemen and women of committing illegal acts. The solution to this problem is a mixture of functional palliatives and better legal preparedness. This amendment addresses one element of this preparedness.

As we have already heard, one area of new legal challenge will undoubtedly be in the realm of novel technologies, particularly those which employ both artificial intelligence and machine learning to give bounded autonomy to unmanned platforms, which in turn have the ability to employ lethal force. We are currently awaiting the imminent outcome of the integrated review, and we understand that a defence command paper will herald a new era of technological investment and advancement: one that will enable a significant reduction in manned platforms as technology permits elements of conflict to be subordinated to intelligent drones and armed autonomous platforms.

However, and this is the basic argument for this amendment, the personal liability for ensuring that action in conflict is legal will not cease, although it may become considerably more opaque. We must therefore ask whether we have yet assessed the moral, legal, ethical and alliance frameworks and protocols within which these new systems will operate. Have we yet considered and agreed the command and control relationships, authorities and delegations on which the legal accountability for much new operational activity will rest?

Personally, I have a separate and deep-seated concern that a fascination with what is technically feasible is being deployed by the Government, consciously or unconsciously, primarily as the latest alchemy by which defence can be made affordable. It is being deployed without properly understanding whether its true utility will survive the moral and legal context in which it will have to operate. I therefore offer my full support to this amendment, in the hope that it will assist us in getting ahead of the problem. The alternative is suddenly waking up to the fact that we have created Armed Forces that are both exquisite and unusable in equal measure.