Technology Rules: The Advent of New Technologies in the Justice System (Justice and Home Affairs Committee Report)

Monday 28th November 2022

Grand Committee
Motion to Take Note
17:28
Moved by
Baroness Hamwee

That the Grand Committee takes note of the Report from the Justice and Home Affairs Committee Technology rules? The advent of new technologies in the justice system (1st Report, Session 2021–22, HL Paper 180).

Baroness Hamwee (LD)

My Lords, I am delighted to move this Motion and I hope the Grand Committee will support it.

This is the first formal report of our committee, which was formed in April last year. At the start, our members knew little about new technologies—I hope I am not being unkind to any of them. After some tuition, we confessed ourselves terrified, but we should not have been terrified about not understanding technologies; in a way, that is the point. The report is about new technologies and how they affect the citizen in the justice system. We looked largely at policing because that was where the evidence led us, but our recommendations have wider application.

Quite early on I asked, rhetorically, “How would I feel if I was arrested, charged, convicted and imprisoned on the basis of evidence I did not understand and could not access?” Towards the end of our work, another member said, “Look at Horizon and the Post Office; look at what happens when you assume the computer is always right”.

We heard about the software and tools used to record, store, organise, search and analyse data, and those used to predict future risk based on the analysis of past data. Predictive policing includes identifying, say, an estate where there has been a lot of crime, putting police in and detecting more crime than in an area that is not overpoliced. The data reflects this increased detection rate as an increased crime rate, and that is embedded in the next predictions. It is a vicious circle which, as a witness said, is

“really pernicious. We are looking at high-volume data that is mostly about poor people, and we are turning it into prediction tools about poor people.”
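To make the vicious circle concrete, here is a toy simulation of the feedback loop the witness describes; the areas, figures and detection rates are invented purely for illustration and come from no real force or tool.

```python
# Toy model of the predictive-policing feedback loop described above.
# All numbers are invented for illustration; this is not any real system.

true_crime = {"estate_a": 100, "estate_b": 100}  # identical underlying crime
records = {"estate_a": 60, "estate_b": 40}       # estate A starts over-policed

for year in range(1, 6):
    # The next deployment targets wherever the data says crime is highest.
    target = max(records, key=records.get)
    for area in records:
        # Patrolled areas detect far more of the same underlying crime.
        detection_rate = 0.9 if area == target else 0.5
        records[area] += true_crime[area] * detection_rate
    print(year, records)

# Estate A's recorded "crime rate" pulls further ahead every year, so it is
# targeted again: the data measures detection, not crime.
```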

The noble Lord, Lord Blunkett, who had hoped to speak this afternoon but, given the change of time, has a clash and apologises for not being here, asked me to say the following:

“It is critical that the substantial issues addressed in the report are confronted before major problems arise, rather than because of them. The wide-ranging implications for the operation and therefore the credibility of the criminal justice system, and the unanimity supporting the committee’s findings, require something better than kicking the can down the road or believing that the present architecture can handle the growth and significance in the use of artificial intelligence.”

I heard a murmur of support when I was reading that, but I will continue even though it pretty much says what I will say over the next few minutes.

The “something better” includes welcoming innovation and regulating it appropriately. The issues are difficult, but the point was not to put them in the “too difficult” tray. I believe that the report answers the not unexpected concerns that we must not stifle innovation, that each police force should be free to take its own decision and that police and crime commissioners must ensure compliance with human rights.

Proposing regulation often raises hackles, but it is another way of requiring standards to be met. Standards are a good thing—in themselves and because something known to meet agreed standards is more likely to be trusted. For example, standards can ensure, to the greatest possible extent, that conscious and unconscious bias—such as racial bias in stop and search tools—is not baked in. That is to the benefit of the producer as well as others. In other words, standards support innovation.

Procurement deserves a lot of attention. A police officer procuring a product can be vulnerable to an overenthusiastic sales pitch—we heard some horror stories—or a one-sided contract. I would have loved to see a form of contract, for instance, about the ownership of data, both input and output. Does the commercial producer of the program own it? It is a big question, which makes one wonder about data inadequacy, but I will not go there this afternoon. We were not able to get hold of a form of contract: commercial confidentiality gets in the way.

National standards would include requirements in respect of reliability, accuracy and performance in the context of their use, evaluation, validity, suitability and relevance. It is very worrying if standards are regarded as a threat.

We heard a lot about the independence of police and crime commissioners, and that PCCs and chiefs ensure compliance with human rights. I heard that as overdefensive. Of course each force should pick products to suit its local needs, but there are 43 forces applying the same law. By analogy, the BSI kitemark is in common use for many products in other sectors—in other words, certification. The police could have a choice among certified products. That would not preclude them picking products to suit their own local priorities. Operationally, the police would still have to assess both the necessity and proportionality of each deployment.

This is all part of governance. The point was made more than once, including by government: “You can always go to court to sort things out”, but the courts’ role is to apply the law, and nothing goes to court unless someone takes it there. That needs determination, emotional energy and money. By definition, the judgment will be neither a comprehensive assessment nor a systematic evaluation.

In a similar vein, the Minister said to us that Parliament is the national ethics body—to be fair, I think that was a throwaway line—but I doubt that we are qualified for that. However, Parliament has a role in establishing a national body: independent, on a statutory basis and with a budget. We think there should be a single national body. Our report lists 30 relevant bodies and programmes. That makes for very complicated governance.

There can never be a completely one-stop shop, but that does not mean that simplification is not needed. It is not surprising that there is confusion as to where to find guidance. The committee recommends a body where all relevant legislation, regulation and guidance are collated, drawing together high-level principles and practice. Primary legislation should be for general principles, with detailed regulation setting minimum standards—not so prescriptive as to stifle innovation, but recognising the need for the safe and ethical use of technologies. We recommend the use of statutory instruments, despite the procedural drawbacks with which your Lordships are familiar, as a vehicle for regulations and a basis for guidance, with scope for non-statutory guidelines.

To assess necessity and proportionality, we need transparency. A duty of candour is associated more with the health service, but we urge the Government to consider what level of candour would be appropriate to require of police forces regarding their use of new technologies.

We also recommend mandatory participation in the Government’s algorithmic transparency standard—currently, it is voluntary—and that its scope be extended to all advanced algorithms used in the application of the law that have implications for individuals. This would in effect produce a register, under the aegis of the central body. I understand that the Information Commissioner’s Office and Thames Valley Police, and no doubt more, are involved with the standard, and there is a clear wish to link compliance with it to processes to improve technology and to enable police to exchange information about what works and what does not. There is a wish too to link it to independent oversight.
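Purely for illustration, an entry in such a register might record something like the sketch below; the field names are invented for this example and are not taken from the actual transparency standard.

```python
# Hypothetical register entry; every field name here is invented for
# illustration and is not drawn from the government's actual standard.
register_entry = {
    "tool_name": "Example risk-triage tool",
    "deploying_force": "Anytown Constabulary",
    "purpose": "Prioritise investigation of reported burglaries",
    "decision_role": "Advisory only; final decision taken by an officer",
    "training_data": "Force incident records, 2015-2020",
    "known_limitations": "Under-represents unreported crime",
    "independent_evaluation": None,  # the gap the committee points to
    "oversight_contact": "ethics.committee@example.police.uk",
}
```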

Ensuring the ethical use of any tool is fundamental. That has to be integral to the use of the tool, as we have seen with live facial recognition and the London gangs matrix, whose review apparently led to the removal of the names of some 1,000 young black men. The West Midlands Police are leaders with their ethics committee, both in having it and in how it is used—I have been very impressed by what I have heard and seen of its operation. There are similar bodies in a few, but only a few, other forces. If we get the standards right, the tools will be better trusted, by the citizen and the police themselves. That will free up police resources.

Current legislation provides that a person shall not be subject to

“a decision based solely on automated processing, including profiling, which … significantly affects him.”

The then Home Secretary assured us that decisions about humans would always be taken by humans—a human in the loop—but clicking a button on a screen is not enough when one starts from the mindset that “the computer is always right”. We agreed with the witness who said that the better way is that the machine is in the loop of human decision-making.
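The witness’s distinction can be sketched in outline as follows; this is an illustrative design sketch with invented names throughout, not a description of any real police system.

```python
# Illustrative sketch of "the machine in the loop of human decision-making":
# the tool's output is advisory, and no decision is recorded without the
# human's own reasons. All names here are invented for this example.

from dataclasses import dataclass

@dataclass
class ToolOutput:
    score: float        # e.g. a risk score produced by an algorithmic tool
    explanation: str    # why the tool produced that score ("explainability")

def record_decision(tool: ToolOutput, officer_reasons: str) -> dict:
    """Refuse to record a decision that merely rubber-stamps the tool."""
    if not officer_reasons.strip():
        raise ValueError("The tool's score is advisory only: "
                         "human reasoning must be recorded.")
    return {
        "decision_basis": officer_reasons,
        "tool_score_considered": tool.score,
        "tool_explanation_shown": tool.explanation,
    }
```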

Does the human understand what it and he are doing? “Explainability” is essential; I had not come across that term before, but it seems to be used a lot in the sector. It is essential for the user, the citizen affected and everyone else. If the police officer does not understand the technology, how can he know if he—or it—has made a mistake? A critical approach in the best sense is needed.

The Sunday Times recently reported on new AI which will detect sex pests and thugs on trains who intend to assault rail passengers. It said:

“When a woman is sitting on her own in a carriage with empty seats, it could also assess whether she feels threatened when a man comes to sit down next to her or whether she welcomes his presence.”


There is no hint that there might be some fallibility in all this. With all of this, noble Lords will not be surprised that we identified a lot of training needs.

We received the Home Office response to our report in the summer. I wrote on behalf of the committee to the then Home Secretary that we were “disheartened”—the best term I felt I could use courteously—by the

“reaction to what we hoped would be understood as constructive conclusions and recommendations. These are very much in line with the recommendations of other recently published work”.

Indeed, a workshop discussing the report last week at the Alan Turing Institute bore this out. The response read to us as more satisfied with the current position than was consonant with the evidence we had used. I will not quote from the Government’s response as I am optimistic that the Minister today will be able to indicate an understanding of our conclusions and an enthusiasm to progress our recommendations. I beg to move.

17:42
Lord Hunt of Wirral (Con)

My Lords, I draw attention to my entry in the register, in particular to my role as a partner in the international commercial law firm, DAC Beachcroft. I am very much aware from that separate strand of my life how law firms are increasingly under pressure from their clients to make use of automation and AI. This can lead directly to efficiencies and cost savings. It also offers up the longer-term possibility of developing and licensing self-serve law tech solutions to replicate some of the services that law firms have traditionally provided, reducing the dependency on lawyers. In a highly competitive market, technology can make all the difference. So, both as a lawyer and a legislator, I warmly welcome this debate. I congratulate the noble Baroness, Lady Hamwee, on her impressive opening speech, her leadership of the select committee and her wise guidance in helping us to produce a very persuasive report.

I dare say that all reports suffer to some extent from in-built obsolescence, especially those dealing with technology. However, I hope that by going back to first principles, the committee has given this one sustainable life and relevance. As we read our way into these questions and raised them with witnesses, I think it is fair to say that we grew more, not less, concerned about the implications for the rule of law of the burgeoning technologies that are increasingly available.

The very good report we produced by consensus with the help of our excellent support team makes our sense of concern—even alarm—very overt and apparent. Our inquiry left me in no doubt about the scale of the challenge we all face to ensure that new technologies serve the best interests of justice and the public interest more widely.

Some noble Lords may have heard or read a highly stimulating lecture earlier this year by the Master of the Rolls, Sir Geoffrey Vos, in which he mused on the significance for us all of

“the inexorable rise in blockchain technologies”,

which will

“immutably record every event or transaction in our lives.”

He also predicted that a

“truly integrated online digital justice system to resolve civil, family and tribunals disputes”

would be in place in England and Wales by the mid-2020s at the latest. It is quite a thought.

It is very easy to be seduced by the technologies themselves, but I would like to pull focus to questions of transparency, governance and accountability. We are told that much accountability within the system now rests with police and crime commissioners. My own dealings with such a commissioner give me no reassurance at all—quite the opposite, in fact. I do not believe that PCCs can provide adequate or even meaningful accountability, especially where fast-moving technology is concerned. They lack the necessary expertise and, looking at some of the turnouts in PCC elections, they lack the authority too.

With both the criminal and civil justice systems so overstretched and behindhand, it is all the more tempting to succumb to the allure of the glittering baubles of high tech, AI, algorithms and all the rest, with the promise they appear to offer of a faster, slicker set of outcomes. If we are persuaded that those outcomes are also more just and fairer, with human fallibility stripped out, the Lorelei cry may prove irresistible. Yet, again and again during the course of our inquiry, we heard from experts how algorithms, however sophisticated, can be “gamed”. If this is true, I wonder whether algorithms can ever truly be fit for purpose within a justice system.

It all takes us inevitably back to the old, uneasy, irreconcilable tension between the supposedly sacrosanct principle of operational independence and the ultimate need for accountability to prevent a police force or chief going rogue, which, as I have witnessed myself, does indeed happen from time to time, although fortunately rarely. I am becoming increasingly troubled by what we call “fairness metrics”. We hear much talk of using AI, not simply to deliver the status quo more effectively and efficiently, but actively to make society “fairer”—a subjective and loaded term, if ever I heard one—by rectifying perceived social, economic and other inequalities. If that initiative acquires significant momentum, we as parliamentarians must surely be profoundly concerned about what is being factored in.

I see a clear analogy here with the development of automation and AI in the automotive sector. We were told six or seven years ago that driverless cars would be on our roads by 2021. The reality is, they are still not here. Safe implementation is a vital consideration, as is the need for an appropriate legislative and regulatory framework both pre- and post-placement and, ideally, through testing in a sandbox environment to ensure the veracity and reliability of algorithms.

Rushing the implementation of automation and AI would be damaging enough in the context of automated vehicles, but getting it wrong risks pushing back mass-market adoption of technologies designed to improve productivity and mobility. A similar mistake is surely inconceivable and wholly unacceptable in the context of the criminal and civil justice systems. Who is keeping a close eye on all this? Is it Ministers?

I am sad that the noble Lord, Lord Blunkett, is not here. To quote from the evidence that we received from the Minister, when I asked at question 107,

“Will you be keeping a careful eye on this?”


The Minister responded,

“That is a very good question which I will have to think about … We have some brakes and levers that we can pull”.


At that point, the noble Lord, Lord Blunkett, said,

“There are ways and means, I promise you.”


At the end of the day, that is what this debate is all about. Who is keeping a careful eye? Is it officials? If it is, from which of the plethora of departments and public bodies that are active in this field will they emerge?

We come back to accountability. Who has practical, day-to-day responsibility for the legal, ethical and active use of advanced technologies of this kind? Who has day-to-day decision-making powers, and where is the practical transparency and ultimate accountability? The reality is always that ultimate responsibility must rest with Ministers and Parliament. The Executive takes the decisions and faces the scrutiny of the legislature in either or both of our Houses of Parliament. The question then is how to make that work quickly, effectively and reliably.

It is perhaps inevitable that a report of this kind raises more profound questions than it would ever be capable of answering, especially when addressing so complex and controversial a topic. I was worried at the time of publication that we would not succeed in our aim of moving Ministers to share our concerns. The trials and tribulations within the Government in recent months have not served to calm my fears. Now we appear—I stress, appear—to be in a period of much-needed stability again. I hope we catch the eyes and ears of Ministers and make a difference, for in the field of radical innovation, just as in the field of criminal and civil justice, prevention of an undesirable outcome is invariably preferable to cure.

17:52
Baroness Primarolo (Lab)

My Lords, I will make a small contribution to this debate as a member of the committee but, first, I pay tribute to the noble Baroness, Lady Hamwee. She steered our committee through complex and, at times, contradictory evidence to try to make sense of what, as the noble Lord, Lord Hunt, said, is a rapidly changing and developing area. Her opening remarks in this debate set out, with great eloquence, exactly which problems the committee identified and our fears for the implications for the justice system if those problems are not addressed. The noble Lord, Lord Hunt, referred to the prediction of driverless cars. The issues addressed within our justice system are cumulative and the problem will be too large to address if we do not take these very small steps at the beginning.

I welcome the Minister to this debate. I recognise that this is a complex area of policy and that the Government are trying to catch up, balancing all sorts of priorities, while the technologies continue to develop and change.

Your Lordships and the Minister will see from the committee’s report that we raised concerns, particularly about the risks to human rights and civil liberties as a result of the increasing use of these advanced technologies, and particularly in the police forces, which was our focus. We stopped there because the subject was so enormous; if we had not, we would still be deliberating on the evidence. Clear questions continually emerged which remained unanswered—questions of accountability, efficacy, transparency and the potential to undermine inadvertently the basic principles of our criminal justice system.

The question that our committee kept finding itself confronted with was: what are the principles which should underpin the safe and ethical use of these new technologies in the justice system? Currently, a lack of national minimum standards, transparency, rigorous evaluation and training in the use of these technologies means that human rights and civil liberties could be compromised. Are we to wait until they are compromised before we decide to address these principles?

In endorsing this report, the committee unanimously decided that now is the time to start acting. It cannot be right that 43 constabularies are doing their own thing, most in isolation from each other, evaluating as they go along, at best—if they do it at all. However, that evaluation is not open to public scrutiny—it does not provide a route through to the point that the noble Lord, Lord Hunt, made: who is accountable? Parliament has to be accountable, and how do we discharge those responsibilities without the information in the first place? Each constabulary develops the use of the technologies to its local policing objectives and does different tasks to different levels of complexity. I am not making a case for a national police force, and neither was the committee, but it would be helpful to those constabularies to be provided with clarity from government on the basic principles that they should be observing, as this fits within the wider justice system.

Some are in no doubt—it may be the case; I do not have a crystal ball—that advanced technologies have a huge potential in assisting the police in delivering priorities and a policing system that commands confidence, trust and respect, improving efficiency, productivity and problem-solving. That is the sales pitch to the constabularies. But—and it is a very big but—these technologies have challenging and significant downsides with regard to civil liberties and human rights, as the noble Baroness, Lady Hamwee, pointed out. If not addressed, they will undermine the same confidence, trust and respect, and, if inadvertently and wrongly used, they will undermine the concept of fairness in our justice system. We should be under no illusion that if these technologies are allowed to mushroom in the police service without clear, consistent, understandable standards and protections, we will build up significant problems.

We urgently need consideration at national level of the trade-offs in using these new technologies: human rights versus interference with those rights, while ensuring that the interventions are necessary and proportionate. When we asked where the balance was, who was accountable and who was watching, answer came there none. On their effectiveness, we asked: are the public safer with these enhanced technologies and do these technologies make a difference? Again, in the absence of evaluation, answer came there none.

Transparency is a crucial principle because, increasingly, citizens want transparency about how their personal information is used and shared. Many benefits flow from transparency, including identifying problems early on and, crucially, improving the public’s trust in data-driven technologies. With that trust comes a pathway to developing appropriate technologies in supporting priorities. It happens elsewhere, so the committee suggested a central organisation or regulating body. NICE does it for the health service with regard to the efficacy of drugs. Why can it not be done in the justice system? The embryology authority balances what is possible medically with what is acceptable ethically. Why can we not use similar models?

Does the Minister think that police forces should satisfy themselves in advance of using new technologies, through independent verification, that the software program does not have an unacceptable level of bias? How can we be confident that historic cultural bias is not built into the system? Does the technology actually work and do what we want?

There are steps forward that could be taken—I know it is going to be very difficult—to deliver two central propositions recommended in this report. First, will the Minister agree to bring together the 43 constabularies, either by requiring or facilitating them, to share their knowledge and experience in this area so that we can begin the painful process of getting on the right side of this development? Secondly, will he consider appointing an expert panel of academics and practitioners to advise him on how to make progress on having the correct balance for a regulatory authority that protects us and our civil liberties, but enables the police and the justice system to do their job effectively?

18:03
Lord Clement-Jones (LD)

My Lords, it is a pleasure to follow three such excellent opening speeches. I draw attention to my interests in the register, particularly my interest in artificial intelligence technologies as a former chair of the AI Select Committee of this House. As a non-member of her committee, I congratulate my noble friend Lady Hamwee and the committee on such a comprehensive and well-argued report.

I entirely understand and welcome the width of the report but today I shall focus on live facial recognition technology, a subject that I have raised many times in this House and elsewhere in Questions and debates, and even in a Private Member’s Bill, over the last five years. The previous debate involving a Home Office Minister—the noble Baroness, Lady Williams, the predecessor of the noble Lord, Lord Sharpe—was in April, on the new College of Policing guidance on live facial recognition.

On each occasion, I drew attention to why guidance or codes are regarded as insufficient by me and by many other organisations such as Liberty, Big Brother Watch, the Ada Lovelace Institute, the former Information Commissioner, current and former Biometrics and Surveillance Camera Commissioners and the Home Office’s own Biometrics and Forensics Ethics Group, not to mention the Commons Science and Technology Committee. On each occasion, I have raised the lack of a legal basis for the use of this technology—and on each occasion, government Ministers have denied that new explicit legislation or regulation is needed, as they have in the wholly inadequate response to this report.

In the successful appeal brought by Liberal Democrat Councillor Ed Bridges—the Court of Appeal case on the police use of live facial recognition, decided in August 2020—the court ruled that South Wales Police’s use of such technology had not been in accordance with the law on several grounds, including in relation to certain human rights convention rights, data protection legislation and the public sector equality duty. So it was with considerable pleasure that I read the Justice and Home Affairs Committee report, which noted the complicated institutional landscape around the adoption of this kind of technology, emphasised the need for public trust and recommended a stronger legal framework with primary legislation embodying general principles supported by detailed regulation, a single national regulatory body, minimum scientific standards, and local or regional ethics committees put on a statutory basis.

Despite what paragraph 4 of the response says, neither House of Parliament has ever adequately considered or rigorously scrutinised automated facial recognition technology. We remain in the precarious position of police forces dictating the debate, taking it firmly out of the hands of elected parliamentarians and instead—as with the recent College of Policing guidance—marking their own homework. A range of studies have shown that facial recognition technology disproportionately misidentifies women and BAME people, meaning that people from those groups are more likely to be wrongly stopped and questioned by police, and to have their images retained as the result of a false match.

The response urges us to be more positive about the use of new technology, but the UK is now the most camera-surveilled country in the Western world. London remains the third most surveilled city in the world, with 73 surveillance cameras for every 1,000 people. The last Surveillance Camera Commissioner did a survey, shortly before stepping down, and found that there are over 6,000 systems and 80,000 cameras in operation in England and Wales across 183 local authorities. The ubiquity of surveillance cameras, which can be retrofitted with facial recognition software and fed into police databases, means that there is already an apparatus in place for large-scale intrusive surveillance, which could easily be augmented by the widespread adoption of facial recognition technology. Indeed, many surveillance cameras in the UK already have advanced capabilities such as biometric identification, behavioural analysis, anomaly detection, item/clothing recognition, vehicle recognition and profiling.

The breadth of public concern around this issue is growing clearer by the day. Many cities in the US have banned the use of facial recognition, while the European Parliament has called for a ban on the police use of facial recognition technology in public places and predictive policing. In 2020 Microsoft, IBM and Amazon announced that they would cease selling facial recognition technology to US law enforcement bodies.

Public trust is crucial. Sadly, the new Data Protection and Digital Information Bill does not help. As the Surveillance Camera Commissioner said last year, in a blog about the consultation leading up to it:

“This consultation ought to have provided a rare opportunity to pause and consider the real issues that we talk about when we talk about accountable police use of biometrics and surveillance, a chance to design a legal framework that is a planned response to identified requirements rather than a retrospective reaction to highlighted shortcomings, but it is an opportunity missed.”


Now we see that the role of Surveillance Camera Commissioner is to be abolished in the new data protection Bill—talk about shooting the messenger. The much-respected Ada Lovelace Institute has called, in its report Countermeasures and the associated Ryder review in June this year, for new primary legislation to govern the use of biometric technologies by both public and private actors, for a new oversight body and for a moratorium until comprehensive legislation is passed.

The Justice and Home Affairs Committee stopped short of recommending a moratorium on the use of LFR, but I agree with the institute that a moratorium is a vital first step. We need to put a stop to this unregulated invasion of our privacy and have a careful review, so that its use can be paused while a proper regulatory framework is put in place. Rather than update and use toothless codes of practice, as we are urged to do by the Government, to legitimise the use of new technologies such as live facial recognition, the UK should have a root-and-branch surveillance camera and biometrics review, which seeks to increase accountability and protect fundamental rights. The committee’s report is extremely authoritative in this respect. I hope today that the Government will listen but, so far, I am not filled with optimism about their approach to AI governance.

18:11
Lord Hope of Craighead (CB)

My Lords, I congratulate the noble Baroness, Lady Hamwee, on securing this debate. As another non-member of the committee, I join the previous speaker in congratulating her and all members of the committee on such an excellent and informative report. I hope that when the Minister replies, he will be able to remove at least some of the evident disappointment which the noble Baroness felt on reading the Government’s response.

Before I go into any detail, I should explain that my interest in this subject is directed to the use of AI in the courts and the challenges that it faces. However, I confess that I have no technical expertise and have had very little contact with the courts’ use of AI at first hand; nor did I have the advantage the committee members had of listening to the evidence, so I start with a definite disadvantage. I come from a generation which is unable to use its thumbs to operate the mobile phone. We did not have these things when we were at school so I have to jab it, as others of my generation do, with my forefingers. Things have been moving so fast that even the eight years since I retired from my judicial career have seen changes that were barely in prospect when I was still sitting as a judge.

I have struggled with the word “algorithm”, for example—not a word that I was ever accustomed to using. When I looked it up in my copy of the third edition of the shorter English dictionary, which was published in 1964 and which I purchased one year later when I was embarking on my legal career, I was told that “algorithm” is an erroneous version of “algorism”, which is an Arabic system of numbering. No other definition was offered, so I am grateful to the committee for telling me in box 1 of the report what in today’s language it really means. That definition should perhaps have made it clear that the instructions are given by means of numbers, which I believe is the way that AI operates. We owe all this to the Arabic system, which is why the word derives from the previous one.

Even so, I struggle to understand how the system works. Where do the instructions come from, and are those who give them the right people? How do we know that the answers it produces are the right ones? Is the system open to cross-examination to test these issues? If so, how can this be done? I share the committee’s concern about where all this is leading. So far as the courts are concerned, AI comes especially into play in two ways. The first is in the provision of evidence in a criminal trial. The other is in its use in dispute resolution in the civil courts. Each of them presents very real challenges.

The report, for the most part, is directed at the use of advanced technologies by police forces. The courts become involved when evidence that has been gathered by this means is led at a criminal trial to secure a conviction. Some years ago—in fact, quite a number of years ago—I presided in a case before the criminal appeal court in which the appellant had been convicted on the basis of a primitive system of facial recognition technology. He insisted that it was a mistake and that its use was unfair because, due to problems with legal aid, he had no access to expert evidence to challenge it. It seemed to us that that amounted to a miscarriage of justice, so we set aside the conviction so that he could face trial again with expert assistance.

In the retrial, the jury—unfortunately, from his point of view—reached the same conclusion as the first jury on the recognition evidence and once again he was convicted. My point is that fairness and transparency, which the noble Baroness, Lady Primarolo, emphasised in her impressive speech, should be at the heart of any criminal trial. That requires that evidence of this kind should be open to challenge. As it happens, there was no suggestion that the evidence in that case had been manipulated; it was just said to be a mistake. The reference to the possibility of manipulation must give rise to real concerns, as shown by the very important selection of paragraphs 23 to 26 in the report, under the heading,

“The right to a fair trial”.

I support the recommendations that are referred to as numbers 1, 2 and 4 in the Government’s response. They are all designed to ensure the safe and ethical use of AI. The Government say they are confident that existing structures and organisations create a sufficient network of checks and balances, but the evidence that is narrated in this report suggests that that confidence may be misplaced. More safeguards than those that are available may be needed in this fast-moving area. I endorse the point made by the noble Lord, Lord Blunkett, which the noble Baroness mentioned: it is far better to do this now than later, when it would be too late and things would have moved on beyond recall.

As for AI’s use in dispute resolution in the civil courts, I pay tribute to the work of the Library and its very helpful briefing on the report. It contains a link to an article referred to by the noble Lord, Lord Hunt of Wirral, headed,

“Technology to become embedded in UK justice system by 2040, senior judge suggests”.


That contained a link to a speech that was given online in March this year by the Master of the Rolls, Sir Geoffrey Vos, about the future for dispute resolution in what he referred to as a “brave new world”.

If one wants to be enlightened about the huge advantages that AI can offer, they can be seen in Sir Geoffrey’s speech. He is an enthusiastic supporter, promoting AI’s use in the civil courts as fast as possible. He focuses particularly on the advantage of speed and simplicity, which gathering evidence in this way can produce. I am certainly not one of those who decries the use of AI; it is all a question of how it can be best operated.

According to Sir Geoffrey, factual disputes will themselves become a thing of the past, as so much of what we do will be indelibly recorded by AI. He referred, among other things, to number plate recognition. You cannot really dispute where your car has appeared, because AI no longer leaves any room for dispute about that. He says that we are more and more likely to find this a feature of dispute resolution in the civil courts.

He went on to say that some decisions, admittedly minor decisions, such as those about time limits and other procedural aspects, could be made by this system with no human intervention. Proposals for dispute resolution themselves would be “driven by AI”, as he puts it.

He acknowledged that public confidence is important, and that the public would need to understand what had been decided by a machine and what had not. He also said that, ultimately, there must be the ability to question an AI-driven decision before a human judge. That raises the question of whether and how that can be done, and how far we can trust algorithms that are not open to being tested in that way.

I was encouraged by the statement in paragraph 32 of the Government’s response that they will work with the justice system with a view to

“better long term research and evaluation of the different circumstances in which predictive algorithms”

are described and used to support future decision-making. Of course, there is much that the courts themselves can do to control and regulate their use, but the extent of the ability of litigants to question and interrogate the algorithms is not open to control or guidance by the courts. That is why the recommendation in paragraph 155 of the report, which is dealt with in paragraph 18 of the Government’s response, is so important. It is about the need for a requirement on producers to embed explainability within the tools. If that requirement is there, one may be able to open up a system of cross-examination to find out what is going on and see whether what has been produced can be relied on. I fear that the Government’s response in paragraph 35 hardly does justice to this crucial issue.

I hope that when he comes to reply the Minister will be able to reassure the noble Baroness that the Government will look again at the evidence and recommendations in the committee’s report, to see whether more can be done to regulate and control the way that AI is imposing itself on our lives. I suggest that if the Minister and his team have not already done so, they might like to read Sir Geoffrey’s speech, because it will show the advantages and concerns which surround this whole issue.

18:22
Baroness Sanderson of Welton (Con)

My Lords, it is an honour to follow the noble and learned Lord. As others have before me, I compliment the chair of the committee, the noble Baroness, Lady Hamwee, on her comprehensive opening remarks—no easy feat with this report—and her very fair and decent approach throughout the committee’s inquiry. I also compliment our secretariat on its hard work and guidance.

There are many topics we could cover—and have covered—in this debate today: the technology itself, the dangers of inherent bias and predictive policing and the implications for civil liberties. However, for the purposes of today, I will concentrate on the pace at which new technologies are developing, particularly within the police—which, as I and perhaps the Minister have noticed, seems to be an emerging theme—and pick up on some of the Home Office’s responses to our concerns.

As my noble friend the Minister will know from the report, when we began this investigation, we did it on the understanding that, despite the concerns I have just mentioned, AI is a fact of modern life. We acknowledged that it can have a positive impact in improving efficiency and finding solutions in an ever more complex world.

However, in terms of the justice system, and more specifically the police, we became alarmed at the relatively unchecked proliferation of new technology across all 43 forces. As has been mentioned, we made a number of recommendations to combat this: a central register, kitemark certification, mandatory training and better oversight.

I know that these are significant steps and that they have costs attached, but they were carefully thought through and, to be honest, we were not expecting to be quite so roundly dismissed by the Government in their response. They seemed to imply that we had failed to appreciate the value and necessity of AI tools in today’s policing environment. In particular, the response highlighted the use of CAID—the Child Abuse Image Database—which brings together all the images found by police and the NCA, helping them to co-ordinate investigations.

In one sense, the Government are right to make much of CAID because it was game-changing. For instance, a case with 10,000 images that would typically have taken up to three days to review could be done in an hour, thanks to CAID. Perpetrators could be apprehended more quickly, officers protected from the effects of viewing these images and more focus placed on identifying the victims. As someone who worked on child sexual abuse and exploitation at the Home Office when CAID was introduced, I assure the Minister that I completely understood—and understand—the value of new technologies in certain instances.

However, in the context of the report, I just do not think that it is a very helpful example. The Home Office itself helped to develop CAID in collaboration with the police and industry partners. Once piloted, it went live across all police forces and the NCA. To suggest that that is the norm would be misleading, and it should not be used as a reason not to address the clear problems that we identified in a system where all 43 forces, as has been mentioned, are free to individually commission whatever tools they like in a market that is, as we said, opaque at best and the Wild West at worst, in which the oversight mechanisms are, frankly, inadequate. The Home Office may think that we are overreacting, but the truth is that it would be hard-pushed to make that case because without a central register, as we suggested, it is impossible to know who is using what, how and to whom.

If we dig a little deeper, the Minister may see why we are concerned. Some of this has already been mentioned. On procurement, we heard from a police representative who said that procurement is not the comfort zone of all police forces. That is truly worrying when the tools being procured may have consequences for human rights and the fairness of the justice system, as we have been talking about—never mind the complexities of the technologies market, where, as the noble Baroness, Lady Hamwee, said, providers are reluctant to share information on the basis of commercial confidentiality.

Then there is the problem that, as the NCC Group told us,

“many claims made by [Machine Learning] product vendors, predominantly about products’ effectiveness in detecting threats, are often unproven, or not verified by independent third parties.”

There are the salespeople who—in an understandably overzealous way in a burgeoning market—according to one developer,

“take something they do not understand and shout a number that they do not understand”.

I would add that in many cases they then make it available to officers who do not understand it either. Incredibly, the police are not required to be trained to use different AI technologies—this is one of the things I found most shocking in our report—including facial recognition, because they are procured at a local level.

All this does not feel like a solid foundation on which to deploy such highly sensitive tools and, as the noble Baroness, Lady Hamwee, has already alluded to, there are some in the police and in the market who agree with us. At the excellent conference at the Alan Turing Institute last week, one speaker representing the police pointed out that in order to become a detective you have to pass an exam, and that the same should be true for technology. Another from a different force said: “Artificial intelligence is not on the tip of the tongue of the public yet, but we don’t want it to be another frontier of failure.”

One way in which we could help to build confidence is statutory specialist ethics committees, which would not only increase community involvement and understanding but help to create an institutional culture of accountability, something that we already know needs to be improved. I am afraid to say that that was another recommendation dismissed by the Home Office.

I am not blaming the police here. There are some brilliant forces, such as West Midlands, which have spotted the benefits but also the pitfalls, and which are working hard to get ahead of them. Without more commitment from the Government, though, I fail to see how the current system leads to anything but another frontier of failure. As people have said throughout the debate, under the current free-for-all it feels inevitable that, at some point, something is going to go very wrong in a police force that has not put in the protections that, say, West Midlands has.

It is not as though the Government are not doing anything. The Centre for Data Ethics and Innovation, which is based in DCMS, is piloting the public sector algorithmic transparency standard. We on the committee would all agree with that, and, genuinely, people around the world are looking at it. Comparing the work that is going on in DCMS with the response to our report, can the Minister tell me how closely officials in the Home Office work with their counterparts in DCMS on this? This pilot includes some police forces, and it does not feel as if the two marry up.

Again, as others have said, I know that probably quite a few people may wish to put this report on the shelf and watch it gather dust. However, I think we all know that in practice, that is unlikely to happen because the concerns raised within it will surely become more apparent down the line.

Finally, we heard a great analogy at the conference last week with regard to training for those using AI. The speaker said: “For a car to be allowed on the road, it’s got to have an MOT, but the driver also has to have a licence.” I am afraid that at the moment, with regard to these technologies, we do not have either.

18:31
Baroness Chakrabarti (Lab)

My Lords, what an absolute honour to follow that contribution from the noble Baroness, Lady Sanderson of Welton. Your Lordships can imagine what a contribution her fabulous communication skills and powers of analysis made to the work that we did on this report. I now have the daunting privilege of being the last member of the recently constituted Justice and Home Affairs Committee to contribute. We have also had two expert contributions: from a technology expert in the noble Lord, Lord Clement-Jones, and of course from the noble and learned Lord, Lord Hope. I will try not to repeat too much but will add just a little framing and a few points of emphasis.

First—this is relevant beyond even the vital business of this report—I had never sat on one of the House of Lords’ select committees before, and it was and continues to be a wonderful experience. This was a perfect subject to examine with the rigour of a Lords Select Committee in a totally cross-party way. It feels almost odd now to be a few swords away from the noble Baroness, Lady Sanderson of Welton, and the noble Lord, Lord Hunt, because on the journey that we went on together on this committee, there was no significant partisanship at all. Rights and freedoms and the rule of law should not be a partisan issue. That was definitely my experience of being on the committee of the noble Baroness, Lady Hamwee—she chaired it with the elegance of a society host, the creativity of a film director and the rigour of a judge.

I was reading in the press just today some comments from the American computer science genius and polymath Jaron Lanier. He was talking about the rise of these technologies in general, not about the criminal justice system in particular, and he told the Guardian:

“People survive by passing information between themselves. We’re putting that fundamental quality of humanness through a process with an inherent incentive for corruption and degradation. The fundamental drama of this period is whether we can figure out how to survive properly with those elements or not.”


That is a comment on the rise of these very exciting new technologies in general but I suggest that, of all the spheres in which artificial intelligence and these new technologies are being employed, the criminal justice sphere is special. There are great potential benefits, as we have heard, but real dangers as well. Why are the criminal justice system and the ambit of the home department so special? It is because we are talking about people’s rights and freedoms. We are thinking about the right to life and to protect people, our communities and victims and potential victims, but we are also talking about the gravest rights, freedoms and liberties of the subject. That came through very clearly in both the evidence to and the private deliberations of our committee.

I remind noble Lords that it was just over 40 years ago that, in response to the Brixton riots in this city, Lord Scarman produced his report because there was a crisis of trust and confidence in policing in so many of our communities. Not long after that legendary Scarman report, a Conservative Thatcher Government produced the Police and Criminal Evidence Act 1984. There was inevitably some controversy attached to it but, none the less, I would consider it a piece of human rights legislation, because it attempted to set a framework of principles and law for governing police power.

We would not dream today of rescinding or repealing that Act. It has been amended, but it is still on the statute book. The idea is that police power, while essential, needs to be regulated and consolidated in one place. Of course, new and intrusive technologies have emerged. The PACE codes have had to be updated and the legislation itself has been amended, but some basic principles and ideas of accessibility and transparency in the use of intrusive police power hold still, over 40 years later.

I do not believe that noble Lords and Ministers would dream of rescinding that, and nor should the Government think that such a framework is not needed today in relation to these new powers—these powers which we cannot even see being used, or understand, because they are effectively in a black box, or in a jar in the form of a pill, where I cannot say what is in the pill that I am taking. That is why regulation and framework legislation is required.

It is simply not enough to rely on the current arrangement of broad police discretion and the occasional police witness to our committee or some other forum saying, “Oh, but you know: proportionality. We are compliant with human rights proportionality”, as if it were a mantra. That is not detailed enough for regulation. It would not be detailed enough for powers of arrest and it certainly would not be detailed enough for the use of drugs. We need to get into the black box: we need to prescribe it and to decide what is legitimate and proportionate in the use of this technology and its design. Legislation is absolutely essential to avoid what the noble Baroness, Lady Sanderson of Welton, called the Wild West—because that is exactly where we are now in the use of this technology in the criminal justice system and, to some extent, at the border in relation to its intrusive use.

In addition to this framework legislation—the Police and Criminal Evidence Act and an AI Act for the 21st century—we need a national body that will do the prescription and kitemarking. There is no doubt that we need this because of the black box. Lay citizens and even parliamentarians cannot understand the technologies, read and decipher the algorithms, and understand whether coded bias is being baked in—which is happening.

I commend the Netflix documentary on facial recognition technology that features the noble Baroness, Lady Jones of Moulsecoomb, from this House. It is a wonderful documentary. I hope that noble Lords, Ministers and their officials—who are passing them notes, probably saying “Yes, it’s a great documentary”—will watch it.

Kitemarking is essential before any procurement of these technologies and algorithms within the criminal justice system. It should not be left to local police officers, or even PCCs, to have lunch with some people who are selling their wares and decide what is a good deal or not.

In addition to the kitemarking of the product, there is a great opportunity for His Majesty’s Government and the United Kingdom in going down the road being advocated in our report. We could be world leaders in the kitemarking and regulation of this technology. In years to come, if we take up the recommendations from this committee, there could be countries all over the world that say, “We go for the UK AI in criminal justice model”. It is the equivalent of saying they want to contract in English law or in Delaware law, or whatever it is. This technology is being developed and used all over the world, and if we get ahead of the kitemarking and regulation game, others may contract into our arrangements and adopt our technologies and systems over time.

It is completely without justification, it seems to me, for private companies to be experimenting on our populations, including with their intimate data and with policing and intelligence and so on, and then claiming that they will not engage with transparency or legality because of commercial sensitivities. That is a swindle and a scandal, and it needs to end. We would not allow arms companies or drugs companies to behave this way; we certainly should not be allowing it in these deals that are being done in the 43 forces with these people in the Wild West—I will not say who it is that rides around on horses in the Wild West, but the point is made.

To conclude, we are just asking for this technology to be governed by the rule of law, for Parliament to step up and, crucially, for Ministers to step up, as their predecessors did in the Thatcher Government in the 1980s in response to the Brixton riots and the Scarman report. Only this time, we are asking that this be done before a scandal and before a crisis of confidence that reaches the kind of levels where it will be harder to use the technology in a positive way in the future.

18:42
Baroness Ludford (LD)

My Lords, this report produced by the committee chaired by my much-praised noble friend Lady Hamwee is both powerful and shocking. It does not mince its words. I will be quoting from it, as I cannot improve on its wording. The report is not before time—indeed, it is overdue. One can only wonder that successive Governments have neglected to introduce the reforms and protections that this report so convincingly explains are essential to protect us from breaches of equality, human rights and data protection safeguards.

The committee is a remarkably strong one, including as it does a former Home Secretary, the noble Lord, Lord Blunkett; a former National Security Adviser, the noble Lord, Lord Ricketts; a former director of Liberty, the noble Baroness, Lady Chakrabarti; and several very senior lawyers. The report says that the committee was

“taken aback by the proliferation of Artificial Intelligence tools potentially being used without proper oversight, particularly by police forces across the country.”

It warns that,

“without sufficient safeguards, supervision, and caution, advanced technologies may have a chilling effect on a range of human rights, undermine the fairness of trials, weaken the rule of law, further exacerbate existing inequalities, and fail to produce the promised effectiveness and efficiency gains.”

That is a stunning catalogue of dangers.

The report explains how public bodies and all 43 police forces are free to individually commission whatever tools they like or buy them from companies

“eager to get in on the burgeoning AI market”.

The committee found this

“particularly concerning in light of evidence we heard of dubious selling practices and claims made by vendors as to their products’ effectiveness which are often untested and unproven.”

No wonder that the committee reports that it

“uncovered a landscape, a new Wild West, in which new technologies are developing at a pace that public awareness, government and legislation have not kept up with.”

It refers to the phenomenon of “digital excitement”—one could say the delight in boys’ toys, if that were not sexist—felt by some who get their hands on a new technology product. That is of course not a good rationale for purchase. It is hardly surprising that my noble friend Lady Hamwee, in her letter to the Home Secretary, said that the committee was “disheartened” by the Home Office’s response to its “constructive conclusions and recommendations”, and that it found the Home Secretary—I think my noble friend Lady Hamwee quoted this—

“more satisfied with the current position than is consonant with the evidence”

that the committee had received. That is quite a strong message.

My noble friend Lady Hamwee said with considerable feeling that the committee

“hoped that when the House debates the report, the Minister will be able to explore with us in more depth the points that we raised, and not simply be briefed to repeat the formal response”.

We very much look forward to that more realistic response today. The Government’s response was disappointing and complacent, and failed to do justice to the quality of the evidence, the report and the committee. The Government

“was not persuaded that a new independent national body and certification system should be created. It said whilst certification worked in some contexts, it could also create false confidence and be costly. It disagreed with the idea of making transparency a statutory principle. It said … making transparency a legal duty could limit the police’s current transparency efforts to whatever would be set out in statute.”

Also, the Government said that

“it could not make the police and the judiciary undertake training on ‘meaningful interaction with technologies’. This was because training was the responsibility of the College of Policing and Judicial College, rather than the government.”

However, as the noble Baroness, Lady Sanderson of Welton, said, we oblige drivers to have a licence as well as requiring the car to have an MOT. The Government

“disagreed that there should be statutory ethics groups created to scrutinise the use of technologies and veto deployment … because they would not be democratically elected.”

These all seem remarkably weak points. An alternative term would be “scraping the barrel”.

The committee said that:

“While we found much enthusiasm about the potential of advanced technologies… we did not detect a corresponding commitment to any thorough evaluation of their efficacy … there are no minimum scientific or ethical standards that an AI tool must meet before it can be used in the criminal justice sphere. Most public bodies lack the expertise and resources to carry out evaluations … we risk deploying technologies which could be unreliable, disproportionate, or simply unsuitable for the task in hand.”

Are the Government happy with that situation?

The committee found the institutional landscape confused and duplicative—no wonder, with at least 30 organisations, initiatives and programmes having some input or other—and found governance arrangements complex and disconnected, while the Government are appointing still more bodies which make the picture even more crowded. The committee said:

“We have heard no evidence that the Government has taken a cross-departmental strategic approach to the use of new technologies in the application of the law … Thorough review across Departments is urgently required.”

Can the Minister tell us that that at least will happen? The report mentions that a government White Paper is supposedly in the pipeline. Can the Minister tell us the envisaged date for that?

The report has a number of important proposals on governance, oversight and evaluation to address these various deficits. One very sensible proposal is a new national body to set scientific and quality standards and certify new products against those standards. The committee recommends “evaluate centrally, procure locally”.

The committee says its

“evidence reflected organisational confusion about what guidance, regulation and legislation applied”

and argues persuasively for a strong legal framework to remedy the fact that

“users are in effect making it up as they go along.”

No wonder it uses the term “Wild West”.

The report refers to the EU artificial intelligence regulation, or “AI Act”, that is in preparation—I am not sure where it has got to—and notes that it would ban systems that pose an “unacceptable risk”, such as social scoring and many deployments of facial recognition. I hope the Government are still willing to learn from the EU.

The committee suggests legislation to set principles, supplemented by regulations to govern the use of specific technologies. If the Government object that there is a lack of parliamentary time, I suggest at least three Bills that could and should be dropped to make space: the Northern Ireland Protocol Bill, the revocation of EU law Bill and the Bill of Rights Bill, otherwise known as the Human Rights Act destruction Bill.

The committee found the market “worryingly opaque”, with buyers often pretty ignorant about the systems that they were buying due to companies’ insistence on commercial confidentiality. It found some “dubious selling practices” and untested, unproven claims about effectiveness of the products.

The committee therefore makes a number of important proposals for increased transparency and explainability, including consultations and published impact assessments. It reports that there is no central register, making it virtually impossible for parliamentarians, the media, academics and those subject to these systems to find out where and how they are being used, let alone to scrutinise and challenge them. The committee rightly called for a mandatory register.

The Government published their consultation paper Data: A New Direction just over a year ago, promising

“a bold new data regime”,

a phrase that makes me wary. I am concerned about prejudice to our data adequacy decision from the European Commission but also worried if it makes the Government less vigilant about data protection and privacy issues.

The committee said it sees

“serious risks that an individual’s right to a fair trial could be undermined by algorithmically manipulated evidence”,

with defendants and indeed courts ignorant of what technologies might have been used in their case. That is a pretty dire state of affairs.

The report raises serious concerns that bias in data collection could lead to discriminatory policing, especially in predictive policing. It is well known, as my noble friend Lord Clement-Jones pointed out, that facial recognition technology is not sound when used on female and ethnic minority subjects because the learning algorithms have been trained more on data from white men than on data from other groups. The committee also warned of the danger of overpolicing through the use of predictive tools, which could become a vicious circle of concentration on poorer people in more disadvantaged areas.

The committee is highly concerned at the lack of accountability for the misuse or failure of these AI technologies and hence the lack of recourse for people who might suffer from their use. It suggests that the Government appoint a taskforce to produce guidance on consistent lines of accountability.

This is a first-class and hugely valuable report. The Government’s complacency—I could say blinkered complacency—is profoundly unwise when defects and unfairness in the deployment of AI systems could create a backlash through a loss of trust or become, in the words quoted by the noble Baroness, Lady Sanderson of Welton, “another frontier of failure.”

The glittering prize for the UK is, in the words of the report, to be

“a frontrunner in the global race for AI while respecting human rights and the rule of law.”

I hope we hear a better response than we had in June and concrete plans now from the Minister.

18:53
Lord Paddick Portrait Lord Paddick (LD)
- Hansard - - - Excerpts

My Lords, that was an extremely powerful contribution from my noble friend Lady Ludford, with which I wholeheartedly agree. I thank my noble friend Lady Hamwee, her eminent committee members and their officials for this impressive report, the importance of which cannot be overestimated. There have been equally impressive contributions from members of the committee, although not exclusively from them.

I am no Luddite. I am impressed by new technology and could be described in my own way as an early adopter of it, whether it is the new iPhone or the latest laptop—boys’ toys, as my noble friend just commented. Perhaps I get too excited by technology in the way that she mentioned. However, there are inherent dangers in the way that technology is being used in the criminal justice space that are a real cause for concern, as the report clearly points out and as noble Lords have described.

I do not know whether I am correct in thinking that, as with direct and indirect racism, there are perhaps first-degree and second-degree dangers in the use of advanced technology. As in the hackneyed computing phrase “Rubbish in, rubbish out”, there is a clear potential danger that the results of biased policing and biased decision-making by the courts will be hard-wired into the AI systems built on them, as the noble Baroness, Lady Primarolo, said. Whether it is predicting the likelihood that a convicted person will reoffend or being used in connection with vetting inquiries, where racial bias in human decision-making is copied and pasted into AI systems, artificial intelligence also runs the danger of being racially biased.

As my noble friend Lady Hamwee said, the report points out what I might call second-degree prejudice and discrimination, such as where AI is used to predict where volume crime might occur but not used to focus police resources on what used to be called white-collar crime, such as high-value fraud. This application bias has the danger of focusing police resources on poor neighbourhoods, where black and other minority-ethnic people live, while majority-white crime is seen as even less solvable because the opportunities provided by AI to solve crime are focused elsewhere. The first-degree racism dangers in Durham’s predictor of how likely someone is to commit a crime in the future, or the Home Office’s sham marriage detector, should not overshadow the second-degree racism that might result from focusing advances in technology on the poor and disadvantaged.

We must be alert not just to the mantra of “If you’ve done nothing wrong, you have nothing to fear” being used to downplay the harm caused by disproportionality in stop and search, but also to the likelihood that facial recognition technology will give false positive results with women and black people. Operators that are not effectively regulated could load databases of political activists—or even images from Facebook groups that the system could be asked to trigger alerts for—allowing the police to track the individual movements of innocent citizens. That the city council of Santa Cruz in the United States placed a moratorium on the same live facial recognition software used by Kent Police between 2013 and 2018, because that council believed it endangered civil rights and civil liberties and exacerbated racial injustice, perhaps indicates the dangers, and how far the UK lags behind other jurisdictions in addressing them, as my noble friend Lord Clement-Jones said this evening.

I found the Information Commissioner’s remarks, quoted in the report, that every technology can create benefits or risks, depending on the context, governance and oversight measures, a little like the Chinese phrase “We live in interesting times”. It was fairly obvious but not particularly helpful, unlike the report, which not only shows how and where the governance and oversight measures are inadequate but, helpfully, recommends how and where they can be improved, as my noble friend Lady Hamwee described.

The report also points out that the courts are filling gaps in the legislation, something judges are reluctant to do. They want clear laws to interpret, not an absence of law that they then have to invent. I am reminded of going, as part of my Master of Business Administration degree, into the bank where my twin brother was a senior executive so we could act as quasi-management consultants and carry out a project on the system that the bank used to regulate salaries. The view of the operational arm of the bank was that the human resources department was holding back the business from moving forward, and that senior executives should be able to reward high performers outside the salary and grading structure.

Similarly, I appreciate how difficult it is for legislation to keep up with technological advances. However, given the erosion of civil liberties and, for example, the overpolicing of certain communities, that should not mean sacrifices just because, to quote Bill Heslop from the film “Muriel’s Wedding”, “You can’t stop progress!” That was his campaign slogan when he was running for political office and he did not win—not that I am suggesting that there are similarities between that character and my twin brother, or Kit Malthouse, the former Minister quoted in the report.

The report’s conclusions, that there is no clear line of accountability for the misuse or failure of technological solutions used in the application of the law and, as a result, no satisfactory recourse mechanisms, are worrying, together with the fact that there is a lack of transparency in the use of advanced technological solutions. Mandatory impact assessments are a safeguard, provided they are objective and independent.

Committee reports such as this one are a fundamental aspect of the work of the House, and we overlook them at our peril—this report perhaps more than many. As my noble friend Lady Hamwee said, the credibility of the criminal justice system could be at stake. As my noble friend Lady Ludford pointed out, the Government’s response could be described as complacent. I look forward to the Minister’s response saving the day by reassuring this Committee that he has taken on board the recommendations of this important report.

19:01
Lord Ponsonby of Shulbrede Portrait Lord Ponsonby of Shulbrede (Lab)
- Hansard - - - Excerpts

My Lords, I will start by outbidding the noble Lord, Lord Paddick: I too am an early adopter of technologies. In fact, I used to write algorithms and buy black boxes to use in various business contexts in my previous life as an engineer.

I have been reflecting on my various experiences, from my working life and my life as a magistrate, of what we have been talking about today. It is interesting that, as an engineer, I spent probably 15 years of my life doing this sort of technology but, when I eventually became a business owner and a chief executive, I did not use that technology in the business I ran; I was too sceptical of it. I occasionally commissioned work to be done, but it was absolutely not part of the business processes and decisions that I was making when I was the boss of a company.

To go back a bit further, to when I was working as a councillor in south-west London about 30 years ago, we were upgrading CCTV on the council estate where I represented people. It was an interesting exercise, because the councillors and the shopkeepers were in favour of it, but my friends who came from ethnic minorities were against it. There was a huge increase in CCTV technology on the estates I represented. Interestingly, that was also when the use of the hoodie became absolutely ubiquitous. All young people wore hoodies, partly because of the introduction of CCTV.

I have sat as a magistrate for 15 years and been through the whole experience of doing remote hearings in criminal, family and youth jurisdictions. We also use technology in various bits of the process we are considering, such as DNA and drug and alcohol testing. Interestingly, the Probation Service has its own predictive tools—which I do not think are AI-based but are nevertheless predictive tools—on the likelihood of offenders reoffending, and we read about those predictions in its reports and have to take them into account in our sentencing decisions. That has been a routine part of the sentencing exercise, if I can put it like that.

The one bit of technology which has made the biggest difference to my role as a magistrate has been body-worn video cameras. I think the Met Police invested well over £100 million in giving all operational police officers body-worn video cameras, and that has made a specific difference to the way in which we deal with domestic abuse cases. When police officers walk in through that front door and they are filming what they see in front of them, which of course you can then see in court, it makes a huge difference to the likelihood of getting a conviction. As we all know, very often the woman, who is usually the victim, does not want to go ahead and press charges. However, literally, when that front door is opened and a police officer walks in, you get a very different impression—a very realistic one—of the state of play in that house, if I may put it like that. That is one area where I have seen a huge improvement—I believe it is one—in the likelihood of getting convictions in domestic abuse cases.

To return to the debate and the report, I too congratulate the noble Baroness, Lady Hamwee, and all the members of the committee. This has been an extremely interesting debate. The officials are clearly very expert, and that is reflected in the debate itself. I was reading the recommendations of the report—I am not sure whether, in my role, I am supposed to say that I agree with them all wholeheartedly, but I do. The challenge put to the Minister to give a more sympathetic response than the official response that we have all read is fair, because the recommendations are born out of a great deal of work. The analogy with the health service and NICE, as my noble friend Lady Primarolo said, is a good one, and one could make other analogies with defence and other things like that, so why not in this context as well? I will be interested to hear the Minister’s answer to that question.

All the contributions to today’s debate have been exceptional. Again, my noble friend Lady Primarolo asked two questions of the Minister, on bringing together all 43 police forces to exchange information and look at the issues which they are facing, and on appointing an expert panel to look at the overall situation.

The noble Baroness, Lady Sanderson, also made a very good intervention. Her point about CAID—the Child Abuse Image Database—was interesting. As she said, that was a Home Office-developed and implemented technology that was done on a national scale, which of course is very different from what we are talking about in the context of this report.

As usual, my noble friend Lady Chakrabarti made an informed and provocative speech, if I may put it like that. As she said, we need to get into the black box—I thought that was the right way of putting it. That is what prompted me to talk about my previous business experience and my scepticism about buying pieces of kit when you know they are black boxes; when I was in a position to decide, I chose not to go down that route. As she said, we need a national body to look into those black boxes, because, ultimately, the fairness of the system is the most important thing.

As the noble and learned Lord, Lord Hope, said, ultimately, people need to believe that they are treated fairly, whether it is in a court, when they are charged or when they are in prison. They might not like what is happening to them, but they need to understand it and understand the process by which decisions are made about them. If they cannot do that, they will be far less likely to accept the results of a conviction, a prison sentence or whatever it is. So it is very much in all our interests that the technology is understood, and that people feel that the criminal justice system is treating them fairly.

I will conclude on this point: I have an insider’s view of the way that court hearings are conducted. In the vast majority of cases in one of the jurisdictions I am involved in, it is not legal or technological failures but administrative failures that lead to cases failing. That is a far more human element, which has been underinvested in and which leads to a lack of faith in the criminal justice system. While we are talking about technology, we should not take our eye off the much bigger, more practical problem of administering our courts and criminal justice system in a reasonable way.

19:11
Lord Sharpe of Epsom Portrait The Parliamentary Under-Secretary of State, Home Office (Lord Sharpe of Epsom) (Con)
- Hansard - - - Excerpts

My Lords, I thank all noble Lords who have spoken in the debate today and particularly the noble Baroness, Lady Hamwee, for securing the debate. I also thank those who contributed to the Justice and Home Affairs Committee’s thoughtful and insightful report, which has paved the way for today’s discussion.

As the noble Baroness has made clear, the Government responded to that report in June, but it is nevertheless welcome that we have found time to discuss these important matters more fully. I hope this is not the last time we cover the topic; I suspect it will not be. I will remark briefly on the broad thrust of the committee’s report and the Government’s position, as well as on points made during this debate, while also—I am afraid—having to join the noble and learned Lord, Lord Hope, by admitting that I am not much good with my thumbs either.

I am not sure that this line is going to qualify as “riding to the rescue”, but there is significant agreement between the Government and the committee on the challenges posed by advanced technology and how it is rolled out into the justice system. I am sorry if noble Lords feel that the government response was in some way a brush-off, but I am sure all your Lordships would agree that the technology is very complicated. The policing and justice sector, and the ethics around balancing competing human rights, are also very complicated. The public expect us to have a world-class justice system, and I think all noble Lords acknowledged this; utilising technology is a cornerstone of it. The police must use technologies to free up officer time to fight crime, by making administration more efficient, and as a tool to hold those responsible for crime to account.

The Government are committed to empowering the police to use the latest technologies because the public support their use. However, there are no easy answers, and the risk of acting without fully understanding the implications of these technologies and getting it wrong is very real. We are not presently persuaded by the overall recommendations put forward in the report, but the Government are committed to the aims that sit behind those recommendations: improving consistency, maintaining public trust, ensuring sufficient oversight and empowering the police.

The subject of transparency was raised by my noble friend Lord Hunt and others. In their evidence, the Government were clear that transparency is not optional. The police themselves see and understand that being transparent is in their interests. We do not agree that we should mandate specific rules on transparency across such a wide range of current and potential future technologies and uses, but that does not mean we take it any less seriously.

Transparency is an important part of data protection laws. Our policing model works only if there is public consent. For the public to consent, as the noble Lord, Lord Ponsonby, has just pointed out, they must be engaged. It is in the police’s interest to hold conversations and be open about what they are doing and why. Several police forces are working with the Centre for Data Ethics and Innovation to explore how the algorithmic transparency standard may work for them. We welcome it as one tool that could promote the sharing of best practice, but transparency can come in many forms. Our position is that mandating a set of rules could restrict what information is ultimately provided to the public and risks turning transparency into a tick-box exercise.

Instead, we will continue to help the police to collaborate with experts and identify how they can be transparent in a way that allows scrutiny, both at a technical level by those with expert knowledge and at an ethical level by the wider public. There is no point being transparent if what is said cannot be understood. We are in agreement that the question of ethics is of fundamental importance, and the ethics of acting or using technology is not something to be considered lightly.

We have heard how important the roles of accountability and oversight are at each stage of the system. I would caution that a statutory ethics panel, as proposed in the report, may decrease democratic oversight because such powers could override local decision-making, local accountability and locally elected officials, but I note the particular reference to the West Midlands Police example. We are not persuaded that the creation of a national statutory ethics committee is the best way to bring expert insight into police practice, but we will continue to work with colleagues in policing to develop and support non-statutory models.

Our democratic system, and ultimately Parliament, is here to provide scrutiny and oversight. The committee’s report is proof of that, as is today’s debate. It is right that our institutions are held to account, especially in relation to the complex and important issues we have discussed today. The committee’s report noted that, below this, there are a range of bodies tasked with providing oversight on various aspects of how the police use technology. We recognise the risk of overlap and confusion, which is why we have proposed in the Data Protection and Digital Information Bill to simplify the oversight arrangements for biometrics and surveillance cameras, because, ultimately, it is individuals, not technology, who take the key decisions within the justice system. Technology may be used to generate insights, but the decision to arrest will always remain with the officer, while the courts will decide what material can be given in evidence in determining guilt and any sentence. The Government will continue to support work to equip and educate the individuals working within the justice system so that they understand the technologies they use and how to use them correctly.

My noble friend Lord Hunt and others raised governance and accountability. On accountability, I think the question was who is responsible when things go wrong—who has the day-to-day responsibility for governance? There are existing regulations covering the responsibilities of parties when undertaking a procurement and when working together to provide a service. Depending on the issue, it may be addressed in different ways: illegal activity may be a criminal offence; other unlawful activities, such as a data protection breach, would be an issue for regulators; and poor performance should be mitigated at the contractual level.

The public expect the police to innovate. They have to be allowed to do so within the law, so decisions on what technologies to use are highly operational ones for the police, independent of government. However, the police need to act within the legal framework set out by Parliament, and bans are in place where they are proportionate to the risk, such as in cases where the technology poses a risk of lethal or less than lethal force. This is not the same level of risk as that associated with the types of technologies raised in the report.

Chief constables ultimately decide when and how to use new technologies. However, they and their PCC are advised, regulated and overseen by a range of technical and regulatory bodies. The police chief scientific adviser, who I will come back to, advises chief constables on important matters such as good education. The ICO can and will take action where there is a lack of compliance with data protection laws. His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services has a duty to consider how forces are meeting the Peelian principles, of which the use of technology is of course a part. HMICFRS undertakes thematic reviews based on its local inspections, and the use of technology is an area which could merit specific analysis.

The noble Baroness, Lady Primarolo, asked about individual complaints challenging the use of technology. Challenging the use of technology in the courts is certainly a resource-intensive process, and it is best reserved as a solution when the circumstances are exceptional. However, individuals can report concerns through other avenues, and we encourage them to do so. Where there are concerns over necessity, proportionality or a policing justification, they could be raised with HMICFRS, which has a mandate to consider how professional standards are applied in its reports and investigations. If the matter relates to how individuals within policing are using technology and their behaviour, this may be something to take forward with the Independent Office for Police Conduct. Concerns related to fairness, equality or rights can be raised with the Equality and Human Rights Commission, while the Information Commissioner’s Office is well placed to investigate questions of data protection and privacy.

Noble Lords have acknowledged that the police are operationally independent, which is an essential principle of our system. Nevertheless, we are also alive to the need to ensure that law enforcement is given appropriate support in adapting to technological change and advancements. The role of the police chief scientific adviser, to which I have referred, was created to give policing a scientific capability, establishing a dedicated place for advice on how to innovate, test technologies and ensure that tools do what they claim. Since being appointed, the chief scientific adviser has led reform of how the sector works with the scientific community and is developing a strategy for science and technology. The science and technology strategy of the National Police Chiefs’ Council (NPCC) will strengthen how the police approach using validated and cutting-edge science in their mission to protect the public. The Government support this strategy and encourage its successful adoption. Those using the technology and impacted by it must be confident that it works as it should.

The Home Office is investing in policing to strengthen the technical evidence available on the most promising future technologies, as well as helping in the commission of research by the Defence Science and Technology Laboratory, which tests functional performance. Confidence in the scientific basis and validity of the technology being used is only part of the picture: there must also be confidence in the operational practice.

The wider question of technology in the justice system is clearly an area in which it is important constantly to develop best practice and future guidance. We agree that clear and consistent advice is essential to allow innovation. To this end, the sector is developing its repository of guidance and information. For example, the College of Policing published national guidance on live facial recognition earlier this year. The Government will support the sector to stay on the front foot in addressing specific technologies, as needed.

An approach centred on the “Move fast and break things” mantra may work for innovation in Silicon Valley, but it would not be appropriate in the context of UK law enforcement. We have no wish to break the system establishing the rule of law, which of course dates back a very long time. That is not to say that the Government intend to sit back and be solely reactive, but proactively regulating brings its own risks. Mandating standards without consensus in the sector on what it needs may turn certification into something that is easily gamed by bad actors, opening up public authorities to harm.

So, although I happily acknowledge that there will be an opportunity for someone to set global standards, at the moment the Government are of the opinion that certification, or kitemarking, can create false confidence in the validity of a technology. We want to ensure that responsibility for using lawful technologies is not delegated to a certification process that may be gamed. Within our existing regulatory model, the police have a responsibility to use products that are safe and meet the high ethical tests set out in the data protection, human rights and equalities legal framework.

Assessing proportionality and necessity, even if the technology works, depends on the unique factors of each use case. Organisations should not hide behind regulations or certification when it comes to deploying new technologies responsibly. The police must make justifiable decisions during procurement, development and deployment, reviewing them regularly. The current legal framework places responsibility for how to do that firmly on the organisation. However, in addition to the Centre for Data Ethics and Innovation, the Government have established an AI standards hub to help to promote good practice. But the responsibility and accountability that organisations face are theirs alone.

Although we did not generally share the committee’s overall approach of more and more legislation, we will act when the need is clear. We are confident that the regulatory model is proportionate and mature. We have established a statutory code for digital forensics and placed the Forensic Science Regulator on a statutory footing. As practice consolidates around specific standards, we will continue to learn from the relevant experiences and engage with wider learning from sectors such as healthcare.

Someone, but I am afraid I have forgotten who, asked: does it actually work? The answer is yes. I have a large number of examples but in the time available I will provide one: all forces use facial recognition retrospectively. South Wales Police produces around 100 identifications a month, which, as a noble Lord—I forget who—noted, reduces identification time from 14 days to a matter of hours. South Wales Police and the Met have also used live facial recognition technology and successfully disrupted mobile phone theft gangs, with no reported thefts at rock concerts, for example, and made 70 arrests overall during various trials, including for offences as severe as rape, robbery and other forms of violence.

The noble Lord, Lord Clement-Jones, raised the Bridges case. That was a compliance failure by South Wales Police. The court confirmed that there was a legal basis in common law, and a legal framework including human rights, data protection and equalities law, within which live facial recognition and, by extension, other technologies can lawfully be deployed. Since the judgment, the College of Policing has published an authorised professional practice clarifying the “who” and “where” questions.

On the question of potential bias, noble Lords will be interested to know that the US National Institute of Standards and Technology, which is generally recognised as the world’s premier outfit of this type, found that the algorithm that South Wales Police and the Met use shows almost undetectable bias.

The Committee may have noticed that I am slightly between focus ranges with or without glasses, which is making life rather complicated. I wish I were relying on technology at this point.

I was asked about live facial recognition as an example. I have just mentioned that the College of Policing authorised professional practice guidance on live facial recognition. That requires chief officers to ensure training within the force on the following: how to respond to an alert; the technical capabilities of live facial recognition; the potential effects on those subject to the processing; the core principles of human rights; and the potential impact and level of intrusion on each subject.

The adoption of live facial recognition standards serves as an example of where practice has moved quickly over the last few years following legal scrutiny and greater public discourse. The sector learned from the early pilots to test, improve and evolve policies following feedback. The pilots of this tool were just that—early tests. Now that more evidence is available and the maturity of the capability is advanced, we can analyse how the legal framework is working. This process points to the strength of our legal framework as it has driven the improvement of standards without suffocating innovation.

My noble friend Lady Sanderson and the noble Baroness, Lady Ludford, asked about DCMS and cross-departmental working. The answer is that we work very closely. The Home Office is also part of a pilot looking at how the algorithmic transparency standard works for the department’s own activities. As for the White Paper, it will come some time next year but I am afraid I do not have a specific date.

I thank all noble Lords who have contributed to this fascinating debate. I extend my thanks again to the committee for all the work and insight that went into producing a thorough and engaging report on these very complex issues. We do not fully agree on the way forward in terms of specific steps, but I am confident in suggesting that there is a broad consensus about the need for a long-term approach. Whether that stops noble Lords being disheartened, I do not know.

For the Government’s part, we will continue to look at the entirety of the system and seek to encourage improvements at each stage, with a focus on developing policy to ensure that the benefits of new technology are realised throughout the justice system. As the report laid out so clearly, there is no option to pause or stand still. The issues discussed today are of fundamental importance to the safety and security of our citizens and our values, and I look forward to continuing our engagement on these matters.

19:29
Baroness Hamwee Portrait Baroness Hamwee (LD)
- Hansard - - - Excerpts

My Lords, there are more recommendations and conclusions in our report which any of us could have spoken to today, but noble Lords have covered a great deal of ground and I thank them all.

Our thanks go to the staff who supported this inquiry: Sam Kenny, our then clerk, and Achille Versaevel, our policy analyst, who, in truth, were the authors; Amanda McGrath, who kept everything in order, including the members; Aneela Mahmood, who got us coverage in an astonishing number of media outlets; David Shiels, our present clerk; and Marion Oswald, our enormously knowledgeable specialist adviser, who seems to know everyone. Of course, thanks also go to the people who gave us such powerful evidence. I thank the Alan Turing Institute, which hosted last week’s workshop, attracting contributors with such expertise, who I wish were sitting behind me, passing me notes of critique of what we have just heard. That workshop felt like an important validation of our work. My thanks go to all members of the committee, with whom I thoroughly enjoy working. None of their contributions is small.

We were drawn to the topic because of concerns about the lack of a legal framework, about the rule of law and about the potential for injustice—principles which must continue to apply. The speeches today have confirmed those principles, and confirmed that the committee appreciates the use of AI. We have not been dismissive of it.

I thought that the noble Lord, Lord Hunt, might refer to the thalidomide case. It was mentioned at the workshop, where the point was made that it is essential to get the tests of a product right, otherwise compliance with the test is used as the defence to a claim.

I have been subjected to a type of AI at the border, where I could get through only when I took off my earrings, because I had not been wearing the same earrings when the passport photo was taken. That is such a minor example, but I felt quite rejected.

I have to say that I thought my noble friend Lord Paddick was going to say that the technology let him range freely through his twin brother’s bank because it thought he was his twin brother.

I do not think that the noble and learned Lord, Lord Hope, should begin to be apologetic about having no technical expertise. In a way, that is the point of our report. The judiciary was very much among those we regarded as affected by the use of AI.

The pace of development was referred to; it is enormous. The issues will not go away, which makes it all the more important that we should not be thinking about shutting the stable door after the horse has bolted or letting the horse bolt.

I thank the Minister for his response. It is not easy to come to this when many of us have lived with it for a long time. To sum up his response, I think the Government agree with our diagnosis but not with what we propose as the cure. We have to make transparency happen. He says it is not optional, but how do we do that, for instance?

There was a good deal of reference in his response to the public’s consenting, policing needing consent and the Peelian principles, but he then listed a number of institutions, which, frankly, confirmed our point about institutional confusion. On ethics, and his point that a statutory body could override democratic oversight, that is not how any of the ethics organisations approach it. It is a matter of closing the stable door too late if one addresses each specific technology only as the need arises.

A commitment to the spirit of the report gets us only so far; it does not leave the Wild West way behind in our rear-view mirror. We will indeed come back to this, maybe when we get the new data protection Bill. This is not an academic issue to be left in a pigeonhole unconnected with issues current in Parliament—I need only say: the Public Order Bill.

Motion agreed.
Committee adjourned at 7.35 pm.