Asked by: Baroness Jones of Moulsecoomb (Green Party - Life peer)
Question to the Home Office:
To ask His Majesty's Government what discussions they have had with the Information Commissioner's Office on the issue of bias in retrospective facial recognition searches of the Police National Database.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
The Home Office is aware of the risk of bias in facial recognition algorithms and supports policing in managing that risk. Hence the Home Office, in collaboration with the Office of the Police Chief Scientific Adviser and the National Police Chiefs’ Council, commissioned independent testing of the facial recognition algorithm currently used by specially trained operators in police forces to search the Police National Database. Contracts were agreed in March 2024.
Independent testing helps to ensure algorithms are used at settings where statistically significant bias is reduced to negligible levels. Where potential bias is identified, the Home Office supports policing in putting operational processes in place to minimise the risk of any material impact.
Manual safeguards, embedded in police training, operational practice, and guidance, require all potential matches returned from the Police National Database to be checked by a trained user and investigating officer. These safeguards pre-date the National Physical Laboratory (NPL) testing but were reviewed once the results were known.
Home Office Ministers were first made aware of a bias in the algorithm used by specially trained operators in police forces to search the Police National Database in October 2024. Initial findings were shared with the Home Office between March 2024 and October 2024, and the final report was provided by NPL in April 2025 and updated for publication in October 2025.
A replacement system with a new algorithm has also been procured by the Home Office and independently tested. This testing has been published and shows that the system can be used with no statistically significant bias. It is due to be operationally tested early next year and will be subject to further evaluation.
The Home Office briefed the Information Commissioner's Office (ICO) on the findings of the independent report ahead of its publication, and we continue to work closely with the ICO as we consult on a new legal framework for law enforcement use of biometrics, facial recognition and similar technologies.
The Home Office has also commissioned His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services, with support from the Forensic Science Regulator, to conduct an inspection of police and relevant law enforcement agencies’ use of retrospective facial recognition.
Asked by: Baroness Jones of Moulsecoomb (Green Party - Life peer)
Question to the Home Office:
To ask His Majesty's Government what assessment they have made, if any, of the impact of bias in retrospective facial recognition searches of the Police National Database.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
The Home Office is aware of the risk of bias in facial recognition algorithms and supports policing in managing that risk. Hence the Home Office, in collaboration with the Office of the Police Chief Scientific Adviser and the National Police Chiefs’ Council, commissioned independent testing of the facial recognition algorithm currently used by specially trained operators in police forces to search the Police National Database. Contracts were agreed in March 2024.
Independent testing helps to ensure algorithms are used at settings where statistically significant bias is reduced to negligible levels. Where potential bias is identified, the Home Office supports policing in putting operational processes in place to minimise the risk of any material impact.
Manual safeguards, embedded in police training, operational practice, and guidance, require all potential matches returned from the Police National Database to be checked by a trained user and investigating officer. These safeguards pre-date the National Physical Laboratory (NPL) testing but were reviewed once the results were known.
Home Office Ministers were first made aware of a bias in the algorithm used by specially trained operators in police forces to search the Police National Database in October 2024. Initial findings were shared with the Home Office between March 2024 and October 2024, and the final report was provided by NPL in April 2025 and updated for publication in October 2025.
A replacement system with a new algorithm has also been procured by the Home Office and independently tested. This testing has been published and shows that the system can be used with no statistically significant bias. It is due to be operationally tested early next year and will be subject to further evaluation.
The Home Office briefed the Information Commissioner's Office (ICO) on the findings of the independent report ahead of its publication, and we continue to work closely with the ICO as we consult on a new legal framework for law enforcement use of biometrics, facial recognition and similar technologies.
The Home Office has also commissioned His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services, with support from the Forensic Science Regulator, to conduct an inspection of police and relevant law enforcement agencies’ use of retrospective facial recognition.
Asked by: Baroness Jones of Moulsecoomb (Green Party - Life peer)
Question to the Home Office:
To ask His Majesty's Government when they were first aware of bias in retrospective facial recognition searches of the Police National Database.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
The Home Office is aware of the risk of bias in facial recognition algorithms and supports policing in managing that risk. Hence the Home Office, in collaboration with the Office of the Police Chief Scientific Adviser and the National Police Chiefs’ Council, commissioned independent testing of the facial recognition algorithm currently used by specially trained operators in police forces to search the Police National Database. Contracts were agreed in March 2024.
Independent testing helps to ensure algorithms are used at settings where statistically significant bias is reduced to negligible levels. Where potential bias is identified, the Home Office supports policing in putting operational processes in place to minimise the risk of any material impact.
Manual safeguards, embedded in police training, operational practice, and guidance, require all potential matches returned from the Police National Database to be checked by a trained user and investigating officer. These safeguards pre-date the National Physical Laboratory (NPL) testing but were reviewed once the results were known.
Home Office Ministers were first made aware of a bias in the algorithm used by specially trained operators in police forces to search the Police National Database in October 2024. Initial findings were shared with the Home Office between March 2024 and October 2024, and the final report was provided by NPL in April 2025 and updated for publication in October 2025.
A replacement system with a new algorithm has also been procured by the Home Office and independently tested. This testing has been published and shows that the system can be used with no statistically significant bias. It is due to be operationally tested early next year and will be subject to further evaluation.
The Home Office briefed the Information Commissioner's Office (ICO) on the findings of the independent report ahead of its publication, and we continue to work closely with the ICO as we consult on a new legal framework for law enforcement use of biometrics, facial recognition and similar technologies.
The Home Office has also commissioned His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services, with support from the Forensic Science Regulator, to conduct an inspection of police and relevant law enforcement agencies’ use of retrospective facial recognition.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Home Office:
To ask His Majesty's Government what steps they are taking to ensure that any use of live facial recognition cameras by law enforcement bodies is subject to clear safeguards to protect privacy and human rights.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
When deploying facial recognition technology, police forces must comply with existing legislation, including the Human Rights Act 1998, Equality Act 2010, Data Protection Act 2018 and Police and Criminal Evidence Act 1984, as well as their own published policies. For live facial recognition, police forces must also follow the College of Policing’s Authorised Professional Practice (APP) on Live Facial Recognition. In addition, forces must give due regard to the Surveillance Camera Code of Practice, which is supplemented by published policing policies.
On 4 December the Government launched a 10-week public consultation on law enforcement use of biometrics, facial recognition and similar technologies. We are consulting on a new legal framework to create consistent, durable rules and appropriate safeguards for biometrics and facial recognition. This framework will aim to strike the right balance between public protection and privacy. The consultation will close in the week commencing 9 February 2026.
Asked by: Mike Wood (Conservative - Kingswinford and South Staffordshire)
Question to the Cabinet Office:
To ask the Minister for the Cabinet Office, further to the Answer of 4 November 2025 to Question 85708, on Honours: Forfeiture, if he will publish the criteria for removal from the Roll of the Peerage.
Answered by Nick Thomas-Symonds - Paymaster General and Minister for the Cabinet Office
There are no set criteria for removal from the Roll of the Peerage.
Asked by: Baroness Jones of Moulsecoomb (Green Party - Life peer)
Question to the Home Office:
To ask His Majesty's Government what plans they have to restrict the circumstances in which children may be added to facial recognition watchlists.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
Facial recognition is a crucial tool that helps the police locate missing people, suspects, and those wanted by the courts.
In some cases, under the existing legal framework, this includes vulnerable individuals such as missing children. When using facial recognition technology, police forces must comply with legislation including the Human Rights Act 1998, Equality Act 2010, Data Protection Act 2018 and Police and Criminal Evidence Act 1984, as well as their own published policies. For live facial recognition, police forces must also follow the College of Policing’s Authorised Professional Practice (APP) on Live Facial Recognition.
The APP sets out the categories of people who may be included on a watchlist. These include individuals wanted by the police or the courts, suspects, missing or vulnerable people, or those posing a risk of harm to themselves or others. In each case, inclusion on a watchlist must be justified and authorised, and must pass the tests of necessity, proportionality and use for a policing purpose.
On 4 December the Government launched a consultation on a new legal framework for law enforcement use of biometrics, facial recognition and similar technologies. During the consultation we want to hear views on when and how biometrics, facial recognition and similar technologies should be used, and what safeguards and oversight are needed.
Asked by: Lord Bailey of Paddington (Conservative - Life peer)
Question to the Home Office:
To ask His Majesty's Government, further to the Written Answer and remarks by Lord Hanson of Flint on 17 November (HL11520) and 10 November (HL Deb col 66), whether for-profit social housing providers will be granted the same powers as not-for-profit housing providers under the Crime and Policing Bill.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
Through the Crime and Policing Bill, we are strengthening the powers available to relevant agencies under the Anti-Social Behaviour, Crime and Policing Act 2014.
For-profit social housing providers have grown in prominence since the 2014 Act first came into force. While it is important that all agencies have the powers they need to tackle anti-social behaviour (ASB), it is also important that changes to the list of agencies able to use the powers in the 2014 Act are considered carefully, on a case-by-case basis. The addition of for-profit social housing providers as applicant agencies for Respect Orders, Housing Injunctions and Youth Injunctions remains under consideration, as mentioned in previous answers.
We are, however, legislating in the Crime and Policing Bill to extend the power to issue Closure Notices to registered social housing providers, including for-profit providers. This will make it easier for housing providers to take swift action to prevent disruptive ASB.
Asked by: Jim McMahon (Labour (Co-op) - Oldham West, Chadderton and Royton)
Question to the Cabinet Office:
To ask the Minister for the Cabinet Office, what assessment he has made of the regional representation of new peers appointed for each year since 2015 to the current December 2025 list.
Answered by Nick Thomas-Symonds - Paymaster General and Minister for the Cabinet Office
The House of Lords works best when there is a diversity of perspectives represented, including from all the nations and regions of the United Kingdom. The Prime Minister published a statement in June 2025, setting out the roles and responsibilities of those involved in the appointments system, in which he emphasised that party leaders should consider national and regional representation when making nominations, to ensure the second chamber better reflects the country it serves.
As a first step in reform of the House of Lords, the Government introduced the House of Lords (Hereditary Peers) Bill which removes the right of hereditary peers to sit and vote in the House of Lords. The Government’s priority is to get this Bill on the statute book as soon as possible.
Asked by: Lord Kempsell (Conservative - Life peer)
Question to the Home Office:
To ask His Majesty's Government whether the Metropolitan Police Flying Squad will have its firearms capability removed; and if so, what assessment they have made of that decision.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
Decisions around the deployment of armed officers are operational matters for individual chief constables to determine. It is therefore the responsibility of the Commissioner of the Metropolitan Police Service to determine how best to meet the operational requirements and make decisions on deployment of armed officers in London.
Asked by: Martin Wrigley (Liberal Democrat - Newton Abbot)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, pursuant to the answer received on 21 November 2025 to Written Question 90488, how much additional re-investment her Department anticipates after bringing into force sections 61-64 of the Product Security and Telecommunications Infrastructure Act 2022.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The government’s ambition is for all populated areas to have access to higher quality standalone 5G by 2030. Operators have set out investment plans that align with our ambition, and we are committed to ensuring we have the right policy and regulatory framework in place to support investment and competition in the market.
The aim of the 2017 reforms was to encourage investment in digital networks and improve coverage and connectivity across the UK. The changes introduced by the Product Security and Telecommunications Infrastructure Act 2022 ensure greater consistency throughout the UK and reduce confusion and uncertainty when agreements come to an end and are being renewed.
While the Department does not monitor levels of reinvestment in networks, Ofcom’s Connected Nations report provides an estimate of the level of investment in mobile networks by industry. Ofcom’s latest report estimates that mobile network investment in 2024 amounted to £1.8 billion.