Asked by: Baroness Uddin (Non-affiliated - Life peer)
Question to the Home Office:
To ask His Majesty's Government, with regard to the statement by the Secretary of State for the Home Office on 26 January (HC Deb col 610), what steps they are taking to correct and define new large language models for facial recognition to ensure errors and potential racial bias are removed.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
Facial recognition algorithms provided by or procured with Home Office funding for police use are required to be independently tested for accuracy and bias. Independent testing is important because it helps determine the settings at which an algorithm can be used safely and fairly.
Where potential bias or performance issues are identified, the Home Office works with policing partners to ensure their guidance, practices, and oversight processes minimise any risks arising from use of the technology.
On 4 December last year, we launched a public consultation on when and how biometrics, facial recognition and similar technologies should be used, and what safeguards and oversight are needed. Following analysis of the responses, we will publish a formal government response in due course.
Asked by: Baroness Uddin (Non-affiliated - Life peer)
Question to the Home Office:
To ask His Majesty's Government, with regard to the statement by the Secretary of State for the Home Office on 26 January (HC Deb col 610), what assessment they have made of any bias and inconsistency of application in the use of facial recognition assessments and algorithms for Black and Asian men and women.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
The algorithm used for retrospective facial recognition searches on the Police National Database (PND) has been independently tested by the National Physical Laboratory (NPL), which found that in a limited set of circumstances it was more likely to incorrectly include some demographic groups in its search results. At the settings used by police, the NPL also found that if a correct match was in the database, the algorithm found it in 99% of searches.
We take these findings very seriously. A new algorithm has been procured and independently tested; that testing shows it can be used at settings with no statistically significant bias. It is due to be operationally tested in the coming months and will be subject to evaluation.
Manual safeguards embedded in police training, operational practice and guidance have always required trained users and investigating officers to visually assess all potential matches. Training and guidance have been re-issued and promoted to remind users and investigating officers of these long-standing manual safeguards. The National Police Chiefs’ Council has also updated and published data protection and equality impact assessments.
Given the importance of this issue, the Home Secretary has asked His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS), supported by the Forensic Science Regulator, to inspect police and relevant law enforcement agencies’ use of retrospective facial recognition, with work expected to begin before the end of March.
It is important to note that no decisions are made by the algorithm or solely on the basis of a possible match: matches are intelligence, which must be corroborated with other information, as with any other police investigation.
For live facial recognition, NPL testing found a false alert rate of 1 in 6,000 on a watchlist containing 10,000 images. In practice, police have reported a false alert rate considerably lower than this. The NPL also found no statistically significant performance differences by gender, age, or ethnicity at the settings used by the police.
On 4 December last year, we launched a public consultation on when and how biometrics, facial recognition and similar technologies should be used, and what safeguards and oversight are needed. Following analysis of the responses, we will publish a formal government response in due course.
Asked by: Baroness Uddin (Non-affiliated - Life peer)
Question to the Home Office:
To ask His Majesty's Government, with regard to the statement by the Secretary of State for the Home Office on 26 January (HC Deb col 610), what steps they will take to ensure that data and information collected as a result of the increased use of facial recognition (1) remains in British jurisdiction, (2) is managed by the government, and (3) is not transferred to any third party entities or nations.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
Custody images used for retrospective facial recognition searches are stored on the Police National Database. The data is held at a secure location in the UK.
Police use of facial recognition is governed by data protection legislation, which requires that any processing of biometric data is lawful, fair, proportionate and subject to appropriate safeguards.
Police forces act as the data controllers for facial recognition use and must manage data, including any international transfers, in line with data protection law and established policing standards.
On 4 December last year, we launched a public consultation on when and how biometrics, facial recognition and similar technologies should be used, and what safeguards and oversight are needed. Following analysis of the responses, we will publish a formal government response in due course.