Question to the Home Office:
To ask His Majesty's Government what assessment they have made of the use of facial recognition technologies by police forces and the implications of pausing deployment pending further study of potential racial bias; and what steps they are taking to ensure that such systems are subject to appropriate safeguards, oversight and standards to prevent discriminatory outcomes.
The Home Office works closely with police forces and other stakeholders to assess the use of facial recognition by law enforcement. As part of this engagement, we have consulted on a new legal framework governing how and when law enforcement should use biometrics and facial recognition, including the safeguards that should apply to these technologies. That consultation closed on 12 February; we are considering the responses and will legislate in due course.
When using the technology, the police must operate within the legal framework, which includes data protection, equality and human rights legislation, national guidance, a code of practice and force‑level policies. The Home Office is aware of the risk of bias in facial recognition algorithms, and all police facial recognition systems funded by the Home Office must be independently tested so that they can be operated at settings where bias is negligible.
The Home Secretary has also tasked His Majesty's Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS), with support from the Forensic Science Regulator, with examining whether people have been affected by such bias as part of its inspection of police and relevant law enforcement agencies' use of retrospective facial recognition. The inspection is in progress, and HMICFRS has published its terms of reference.