Question to the Home Office:
To ask the Secretary of State for the Home Department, what steps she is taking to minimise the risk of racial bias in AI-powered Live Facial Recognition systems.
Police forces using facial recognition must comply with existing legal obligations, including the Human Rights Act 1998, the Equality Act 2010 and the Data Protection Act 2018.
Facial recognition algorithms provided by or procured with Home Office funding for police use are required to be independently tested for bias. Independent testing is important because it helps determine the setting in which an algorithm can safely and fairly be used.
Where forces procure their own algorithms, they must ensure that any facial recognition software does not present unacceptable levels of bias. For live facial recognition, this expectation is set out in the College of Policing’s Authorised Professional Practice, which requires algorithms to be independently tested before use, with the results informing how systems are configured for safe and fair deployment.
The government intends to bring forward a new legal framework to create consistent, resilient rules and appropriate safeguards for the use of facial recognition and similar technologies.