Technology Rules: The Advent of New Technologies in the Justice System (Justice and Home Affairs Committee Report) Debate

Department: Home Office

Monday 28th November 2022

Grand Committee
Baroness Sanderson of Welton (Con)

My Lords, it is an honour to follow the noble and learned Lord. As others have before me, I compliment the chair of the committee, the noble Baroness, Lady Hamwee, on her comprehensive opening remarks—no easy feat with this report—and her very fair and decent approach throughout the committee’s inquiry. I also compliment our secretariat on its hard work and guidance.

There are many topics we could cover—and have covered—in this debate: the technology itself, the dangers of inherent bias and predictive policing, and the implications for civil liberties. However, today I will concentrate on the pace at which new technologies are developing, particularly within the police—which, I notice, seems to be an emerging theme, as perhaps the Minister has noticed too—and pick up on some of the Home Office’s responses to our concerns.

As my noble friend the Minister will know from the report, when we began this inquiry, we did so on the understanding that, despite the concerns I have just mentioned, AI is a fact of modern life. We acknowledged that it can have a positive impact, improving efficiency and finding solutions in an ever more complex world.

However, within the justice system, and more specifically the police, we became alarmed at the relatively unchecked proliferation of new technology across all 43 forces. As has been mentioned, we made a number of recommendations to combat this: a central register, kitemark certification, mandatory training and better oversight.

I know that these are significant steps and that they have costs attached, but they were carefully thought through and, to be honest, we were not expecting to be quite so roundly dismissed by the Government in their response. The response seemed to imply that we had failed to appreciate the value and necessity of AI tools in today’s policing environment. In particular, it highlighted the use of CAID—the Child Abuse Image Database—which brings together all the images found by the police and the NCA, helping them to co-ordinate investigations.

In one sense, the Government are right to make much of CAID because it was game-changing. For instance, a case with 10,000 images that would typically have taken up to three days to review could be done in an hour, thanks to CAID. Perpetrators could be apprehended more quickly, officers protected from the effects of viewing these images and more focus placed on identifying the victims. As someone who worked on child sexual abuse and exploitation at the Home Office when CAID was introduced, I assure the Minister that I completely understood—and understand—the value of new technologies in certain instances.

However, in the context of the report, I just do not think that it is a very helpful example. The Home Office itself helped to develop CAID, in collaboration with the police and industry partners, and once piloted it went live across all police forces and the NCA. To suggest that that is the norm would be misleading, and it should not be used as a reason not to address the clear problems that we identified. As has been mentioned, all 43 forces are free individually to commission whatever tools they like, in a market that is, as we said, opaque at best and the Wild West at worst, and with oversight mechanisms that are, frankly, inadequate. The Home Office may think that we are overreacting, but the truth is that it would be hard-pushed to make that case: without a central register, as we suggested, it is impossible to know who is using what, how and on whom.

If we dig a little deeper, the Minister may see why we are concerned; some of this has already been mentioned. On procurement, we heard from a police representative that procurement is not the comfort zone of all police forces. That is truly worrying when the tools being procured may have consequences for human rights and the fairness of the justice system, as we have been discussing, to say nothing of the complexities of a technologies market in which, as the noble Baroness, Lady Hamwee, said, providers are reluctant to share information on the basis of commercial confidentiality.

Then there is the problem that, as the NCC Group told us,

“many claims made by [Machine Learning] product vendors, predominantly about products’ effectiveness in detecting threats, are often unproven, or not verified by independent third parties.”

There are the salespeople who—in an understandably overzealous way in a burgeoning market—according to one developer,

“take something they do not understand and shout a number that they do not understand”.

I would add that in many cases they then make it available to officers who do not understand it either. Incredibly, because these tools are procured at a local level, the police are not required to be trained to use different AI technologies, including facial recognition. That was one of the things I found most shocking in our inquiry.

All this does not feel like a solid foundation on which to deploy such highly sensitive tools and, as the noble Baroness, Lady Hamwee, has already suggested, there are some in the police and in the market who agree with us. At the excellent conference at the Alan Turing Institute last week, one speaker representing the police pointed out that, in order to become a detective, you have to pass an exam, and that the same should be true for technology. Another, from a different force, said: “Artificial intelligence is not on the tip of the tongue of the public yet, but we don’t want it to be another frontier of failure.”

One way in which we could help to build confidence is through statutory specialist ethics committees, which would not only increase community involvement and understanding but help to create an institutional culture of accountability, something that we already know needs to be improved. I am afraid to say that that was another recommendation dismissed by the Home Office.

I am not blaming the police here. There are some brilliant forces, such as West Midlands, which have spotted the benefits but also the pitfalls and are working hard to get ahead of them. Without more commitment from the Government, though, I fail to see how the current system leads to anything but another frontier of failure. As people have said throughout the debate, under the current free-for-all it feels inevitable that, at some point, something is going to go very wrong in a police force that has not put in the protections that, say, West Midlands has.

It is not as though the Government are not doing anything. The Centre for Data Ethics and Innovation, which is based in DCMS, is piloting the public sector algorithmic transparency standard. We on the committee all support that work and, genuinely, people around the world are looking at it. However, comparing the work going on in DCMS with the response to our report, can the Minister tell me how closely officials in the Home Office work with their counterparts in DCMS on this? The pilot includes some police forces, and it does not feel as if the two marry up.

Again, as others have said, I know that quite a few people may wish to put this report on the shelf and watch it gather dust. However, I think we all know that, in practice, that is unlikely to happen, because the concerns raised in it will surely become more apparent down the line.

Finally, we heard a great analogy at the conference last week about training for those using AI. The speaker said: “For a car to be allowed on the road, it’s got to have an MOT, but the driver also has to have a licence.” I am afraid that, at the moment, these technologies have neither.