Facial Recognition and the Biometrics Strategy Debate

Department: Home Office

Louise Haigh Excerpts
Wednesday 1st May 2019


Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record.

Louise Haigh (Sheffield, Heeley) (Lab)

It is a pleasure to serve under your chairmanship, Sir Roger. This excellent discussion has been informed by expert opinion, particularly from my hon. Friend the Member for Bristol North West (Darren Jones), whom I congratulate on securing this important debate. I think the public would be shocked to hear about the lack of a legislative framework and guidance, and the potential for such intrusion into people’s lives by the state.

My hon. Friend spoke about the need for us all to understand the technology that could be used, and to ensure that the frameworks we set out are relevant and keep pace with technology. That must be informed by a principles-based framework, because legislation will never keep up with the technology used by law enforcement or private operators. Several Members mentioned the police national database and the unlawful processing of custody images by police forces. That is not a good starting point for this debate, given that the Home Office’s response to that issue has been poor and the delays in the auto-deletion of images are worrying.

My hon. Friend mentioned the need for ethics by design to ensure that any biases, particularly against people from BME backgrounds, are built out of such technologies from the beginning and are not allowed to be replicated and hardened. He described well the astonishing fact that there is no legal basis for these invasive, pervasive technologies, and highlighted clear gaps in the biometrics strategy, which fails to address those issues. The hon. Member for Strangford (Jim Shannon) spoke powerfully about the consequences of false positives, and raised basic questions about the rights of innocent people. Those questions should be answered. We should not need to hold this debate to speak about the right of innocent people not to have their privacy undermined, and about the police unlawfully holding images of people who have committed no crime.

My hon. Friend the Member for Stretford and Urmston (Kate Green) spoke about her personal experience and the Trafford Centre in her constituency. She made the important point—I have had the same conversation—that the police want and need a transparent, national and consistent framework, because they feel that they have to make things up as they go along. Experiences will differ: South Wales police has demonstrated a completely different attitude from the Met’s in rolling out facial recognition, and it cannot be right for people to experience different technologies in completely different ways and with different attitudes, depending on the police force in the area where they live.

Kate Green

My hon. Friend is right to say that the police want a clear, national framework, and it cannot be right that different police forces operate in different ways. Greater Manchester police has stopped using that technology altogether, but there may be circumstances where we would like it to be deployed to keep us safe.

Louise Haigh

That is completely right, and that is why this debate and the framework are so important. We cannot allow the police, with all the best intentions, to attempt to use this technology and then in some cases to mess it up—as they will—and have to roll it back. We want to ensure that the framework is in place so that the police can go ahead with confidence and the public have confidence. We must ensure that biases are designed out and that people accept the intrusion into their privacy and understand that such technology is being used proportionately and out of necessity. At the moment we cannot have confidence in that, which is why this debate is so important.

Darren Jones

I thank my hon. Friend for giving way, not least because I spoke at great length today. I did not mention earlier that we took evidence in the Select Committee from the Biometrics Commissioner, who said that trials should be conducted on the basis of rigorous scientific guidelines and processes. The problem is that if we let different police forces do different things in different ways, we do not get clear answers on how and in what circumstances the technology can best be used. We need guidelines not just for regulatory purposes, but so that the trials can be done in the right way.

Louise Haigh

That is absolutely right. I do not get a strong impression that individual police forces are learning from each other either. In the case of the Met, the word “trial” has been used for the technology’s use at Notting Hill carnival. It has been trialled for three years in a row. When does a trial become a permanent fixture? I do not think that that can now be called a trial. My hon. Friend is absolutely right that if it is a trial, we should be gathering data, and that data should inform Parliament and the public and address the concerns around false positives, ethnic biases and whether the technology is being used proportionately. My hon. Friend the Member for Stretford and Urmston gave the astonishing figure that demonstrated the mismatch between the number of people covered by the facial recognition technology and the single individual who was identified. That surely cannot be proportionate.

The question of technology within law enforcement gets to the heart of public consent for policing in this day and age, and the issues we have discussed today represent only the tip of the iceberg of potential privacy issues. So much of what defines an investigation today is data-driven. Data-driven policing and data-led investigations are transforming policing. It is already completely unrecognisable from when I was a special constable only 10 years ago. The police have the scope to access more of the intimate details of our personal lives than ever before.

The trialling of technology—including facial recognition and, as my hon. Friend the Member for Bristol North West mentioned, risk assessment algorithms—has not been adequately considered by Parliament and does not sit easily within the current legal framework, but it is having some phenomenal results that we should not ignore. The identification of images of child sexual abuse relies on hashing technology, which enables law enforcement and the Internet Watch Foundation to scrape hundreds of thousands of images off the internet each year.

This week, we have had the news on what is in essence compulsion for rape victims to hand over their mobile phones for what potentially amounts to an open-ended trawl of data and messages, without which there is little prospect of conviction. That high-profile debate has lifted the lid on the ethical questions that the ubiquity of data and technological advances raise for law enforcement. Nascent technologies such as facial recognition are at the sharp end of this debate. They do not just represent challenges around the collection and storage of data; they also provide recommendations to law enforcement agencies to act, to stop and search and, potentially, to detain and arrest people.

As my hon. Friend the Member for Bristol North West said, we served on the Data Protection Bill Committee, where we discussed these matters briefly. We outlined our concerns about facial recognition, in particular the lack of oversight and regulatory architecture and the lack of operational transparency. I reiterate the call I made to the Home Secretary in May last year that Her Majesty’s Inspectorate of Constabulary launch a thematic review of the operational use of the technology and report back to the Home Office and to Parliament.

We believe such a report should cover six key areas: first, the process police forces should and do follow to put facial recognition tools in place; secondly, the operational use of the technology at force level, taking into account specific considerations around how data is retained and stored, regulated, monitored and overseen in practice, how it is deleted, and its effectiveness in achieving operational objectives; thirdly, the proportionality of the technology’s use to the problems it is seeking to solve; fourthly, the level and rank required for sign-off; fifthly, the engagement with the public and an explanation of the technology’s use; and sixthly, the use of technology by authorities and operators other than the police.

It is critical, as operational technology such as this is rolled out, that the public are kept informed, that they understand how and why it is being used and that they have confidence that it is effective. The Minister has the power to commission reports of this type from HMIC, and it would be best placed to conduct such a report into a use of police technology that is of some public concern.

We have discussed concerns about the accuracy of facial recognition tools, particularly in relation to recognising women and people from BME backgrounds—that is quite a swathe of the population! We do not know whether this is because of bias coded into the software by programmers, or because of under-representation of people from BME backgrounds and women in the training datasets. Either way, the technology that the police are currently using in this country has not been tested against such biases. In the debate around consent, it is extremely worrying that potentially inaccurate tools could be used in certain communities and damage the relationship with and the trust in the police still further.

As I said, we had some debates on this issue in the Data Protection Bill Committee, where we attempted to strengthen the legislation on privacy impact assessments. It should be clear, and I do not believe that it is, that police forces should be required to consult the Information Commissioner and conduct a full PIA before using any facial recognition tools.

I am further worried that the responsibility for oversight is far from clear. As we have heard, software has been trialled by the Met, the South Wales police force and other police forces across the country, particularly in policing large events. In September last year, the Minister made it clear in response to a written question that there is no legislation regulating the use of CCTV cameras with facial recognition. The Protection of Freedoms Act 2012 introduced the regulation of overt public space surveillance cameras, and as a result the surveillance camera code of practice was issued by the Secretary of State in 2013. However, there is no reference to facial recognition in the Act, even though it provides the statutory basis for public space surveillance cameras. The Surveillance Camera Commissioner has noted that “clarity regarding regulatory responsibility” for such facial recognition software is “an emerging issue”.

We need clarity on whether it is the Biometrics Commissioner, the Information Commissioner or the Surveillance Camera Commissioner who has ultimate responsibility for this use of technology. It would also be helpful if the Minister made absolutely clear what databases law enforcement agencies are matching faces against, what purposes the technology can and cannot be used for, what images are captured and stored, who can access those images and how long they are stored for.

The Government’s new biometric strategy takes a small step forward on oversight, with a board to evaluate the technology and review its findings, but it meets too infrequently—three times since last July, as far as I can tell—to have effective oversight of the operational use of the technology. In any case, it is clearly not designed to provide operational safeguards, and that is where big questions remain about discriminatory use and effectiveness. The lack of operational safeguards and parliamentary scrutiny may lead to ill-judged uses of the technology.

I am hopeful that the Minister can assure us today of the Government’s intention to make things a lot clearer in this space, that existing and emerging technologies will be covered by clear, consistent guidance and legislation from the Home Office, that the relevant commissioner will have all the powers they need to regulate these technologies, and that our law enforcement agencies fully understand what they need to do, both before any technology or new method of data collection is rolled out, and afterwards, when an individual’s data rights may have been abused. We need clear principles, and I am not convinced that the legislative landscape as it stands provides that.