Automated Facial Recognition Surveillance Debate

Department: Home Office

Monday 27th January 2020

Commons Chamber

Urgent Questions are proposed each morning by backbench MPs, and up to two may be selected each day by the Speaker. Chosen Urgent Questions are announced 30 minutes before Parliament sits each day.

Each Urgent Question requires a Government Minister to give a response on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record.

Sarah Olney (Richmond Park) (LD)

(Urgent Question): To ask the Secretary of State for the Home Department if she will make a statement on police use of automated facial recognition surveillance.

The Minister for Crime, Policing and the Fire Service (Kit Malthouse)

The Government are supporting the police and empowering them with the tools they need to deliver on the people’s priorities by cutting the crime that is blighting our communities. We have already pledged 20,000 more officers, new powers and the biggest funding increase in a decade, but embracing new technology is also vital and we support the use of live facial recognition, which can help to identify, locate and arrest violent and dangerous criminals who may otherwise evade justice.

Live facial recognition compares the images of people passing a camera with a specific and predetermined list of those sought by the police. It is then up to officers to decide whether to stop and speak to those flagged as a possible match. This replicates traditional policing methods such as using spotters at a football match. The technology can make the search for suspects quicker and more effective, but it must be used strictly within the law.
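As a rough illustration of the comparison step described here, the sketch below flags possible watchlist matches for an officer to review. It is a hypothetical Python example under assumed details: the use of face embeddings, the cosine-similarity measure, the 0.8 threshold and all names are illustrative, not features of the Met's system.

```python
# Hypothetical sketch of the watchlist comparison step described above.
# Assumes face embeddings (numeric vectors) have already been extracted from
# the camera feed and from the watchlist images; the threshold is illustrative.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in the range [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def flag_possible_matches(passerby_embedding: np.ndarray,
                          watchlist: dict[str, np.ndarray],
                          threshold: float = 0.8) -> list[str]:
    """Return the watchlist entries that resemble the person passing the camera.

    A flagged entry is only a *possible* match: under the process described in
    the debate, it is then for an officer to decide whether to stop and speak
    to the person.
    """
    return [person_id
            for person_id, reference in watchlist.items()
            if cosine_similarity(passerby_embedding, reference) >= threshold]
```

In practice the embeddings would come from a face-detection model and the threshold would be tuned against measured false-match rates; neither is specified in the debate.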

The High Court has found that there is an appropriate legal framework for the police use of live facial recognition, and that includes police common-law powers, data protection and human rights legislation, and the surveillance camera code. Those restrictions mean that sensitive personal data must be used appropriately for policing purposes, and only where necessary and proportionate. There are strict controls on the data gathered. If a person’s face does not match any on the watchlist, the record is deleted immediately. All alerts against the watchlist are deleted within 31 days, including the raw footage, and police do not share the data with third parties.
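The retention rules set out here (immediate deletion of non-matches, deletion of alerts and raw footage within 31 days, no sharing with third parties) can be expressed as a simple policy check. The sketch below is hypothetical; the record structure and field names are assumptions for illustration, not drawn from any real police system.

```python
# Hypothetical sketch of the retention rules stated above: non-matching records
# are deleted immediately, and alerts (including raw footage) are kept for at
# most 31 days. Data structures and field names are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta

ALERT_RETENTION = timedelta(days=31)


@dataclass
class CaptureRecord:
    captured_at: datetime
    matched_watchlist: bool  # True if the face matched an entry on the watchlist


def should_delete(record: CaptureRecord, now: datetime) -> bool:
    """Apply the stated retention policy to a single record."""
    if not record.matched_watchlist:
        return True  # no watchlist match: delete the record immediately
    return now - record.captured_at >= ALERT_RETENTION
```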

The Metropolitan Police Service informed me of its plans in advance, and it will deploy this technology where intelligence indicates it is most likely to locate serious offenders. Each deployment will have a bespoke watchlist made up of images of wanted people, predominantly those wanted for serious and violent offences. It will also help the police to tackle child sexual exploitation and to protect the vulnerable. Live facial recognition is an important addition to the tools available to the police to protect us all and to keep murderers, drug barons and terrorists off our streets.

Sarah Olney

We must not allow the UK to become a society in which innocent people feel as though their every movement is being watched by the police. We must not throw away UK citizens’ right to privacy or their freedom to go about their lawful business without impediment.

An independent review of the Met’s facial recognition trial was published last July, and its conclusions are damning. Does the Minister agree with the report that the legal basis for this roll-out is questionable at best and is likely to be in conflict with human rights law? According to an analysis of the Met’s test data, 93% of supposed matches in the four years of trials have been wrong. As well as being inaccurate, facial recognition technology has been shown to be much less accurate in identifying women and ethnic minorities than in identifying white men. This means that women and black, Asian and minority ethnic people are much more likely to be stopped without reason than white men. Given that a black person is already 10 times more likely to be stopped and searched than a white person, does the Minister share the Liberal Democrats’ concern that this technology will increase discrimination and further undermine trust in the police among BAME communities?

The Biometrics Commissioner, the Information Commissioner and the Surveillance Camera Commissioner have all raised concerns about facial recognition surveillance, and all three have argued that its impact on human rights must be resolved before a wider roll-out. What steps has the Minister taken since those warnings to examine and address the human rights issues they raise?

Kit Malthouse

The hon. Lady rightly raises a number of issues that need to be addressed in the operation of this technology. I assume she is referring to last year’s statement by the Information Commissioner’s Office. The commissioner reviewed the Met’s operation and raised some concerns about how it was operating the pilot of live facial recognition. Happily, the ICO put out a statement on Friday saying that it is broadly encouraged by the fact that the Met has adopted some of its recommendations in this deployment, although she is right that the ICO remains concerned about the legal basis.

Since the ICO report was published, we have had the judgment in a case brought against South Wales police’s deployment of this technology, in which the High Court found there is an appropriate legal basis for the operation of facial recognition. However, I understand that there may be an appeal, and there is a suspended judicial review into the Met’s operation, which may be restarted, so if Members do not mind, I will limit what I say about that.

As for disproportionality, there is no evidence of it at the moment; the Met has not found disproportionality in its data in the trials it has run, and certainly a Cardiff University review of the South Wales police deployment could not find any evidence of it at all. The hon. Lady is, however, right to say that in a country that prides itself on being an open and liberal society, we need to take care with people’s impressions of how technology may impinge upon that. As she will know, live facial recognition has an awful lot of democratic institutions looking at it, not only this House: the London Assembly has a policing ethics panel; we have the Surveillance Camera Commissioner and the Information Commissioner; and there is a facial recognition and biometrics board at the National Police Chiefs’ Council, which brings people together to look at these issues. There is lots of examination to make sure that it is used appropriately, and I am pleased to say that the Met will be operating it on a very transparent basis. As I understand it, the Met will be publishing information about which data was gathered and the success rate, and other information that will allow the public to have confidence that where the technology is deployed to identify wanted criminals it is having the effect intended.