First elected: 12th December 2019
Speeches made during Parliamentary debates are recorded in Hansard. For ease of browsing we have grouped debates into individual, departmental and legislative categories.
e-Petitions are administered by Parliament and allow members of the public to express support for a particular issue.
If an e-petition reaches 10,000 signatures the Government will issue a written response.
If an e-petition reaches 100,000 signatures the petition becomes eligible for a Parliamentary debate (usually Monday 4.30pm in Westminster Hall).
These initiatives were driven by Ben Spencer, and are more likely to reflect personal policy preferences.
MPs who act as Ministers or Shadow Ministers are generally restricted from performing Commons initiatives other than Urgent Questions.
Ben Spencer has not been granted any Urgent Questions
Ben Spencer has not been granted any Adjournment Debates
A Bill to create offences relating to repeat breaches of planning controls; to make provision about penalties for planning offences; to establish a national register of persons who have committed planning offences or breached planning controls and make associated provision about planning applications; and for connected purposes.
A Bill to provide for a duty on transport authorities and other specified persons to cooperate to reduce transport disruption and to ensure the effective operation of transport networks; to provide for reporting requirements in connection with that duty; to require the publication of assessments of expected transport disruption resulting from maintenance, construction, and other works related to transport infrastructure and ancillary services; and for connected purposes.
A Bill to make provision for the collection and publication of statistics on mental health hospital admissions; and for connected purposes.
Planning (Flooding) Bill 2024-26
Sponsor - Blake Stephenson (Con)
The Government is committed to transparency and accountability, including through clear and timely responses to correspondence. I can confirm that your letter to the Minister for the Cabinet Office dated 15 July, and subsequent email of 8 September, have been passed to me as the Minister responsible for this policy, and I have replied to your letter. Please accept my apologies for the delay.
The Government is committed to transparency and accountability, including through clear and timely responses to correspondence. I can confirm that your letter to the then-Chancellor of the Duchy of Lancaster dated 5 March, and subsequent emails of 13 May and 11 June, have been passed to me as the Minister responsible for this policy, and I have replied to your letter. Please accept my apologies for the delay.
No, the Prime Minister does not use large language model software to help in drafting social media posts.
The consultation on the UK Internal Market Act 2020 follows the UK Government Consultation principles (Consultation principles: guidance - GOV.UK).
The consultation is available in HTML format to provide greater accessibility for users and all pages can be printed in an accessible format.
Alternative methods of completing the consultation are available, including online, by email and by post. The Department can provide hard copies of the consultation document upon request.
Annually, in line with the Financial Reporting Council’s Corporate Governance Code, the Post Office Board runs a Board Evaluation exercise to determine skills gaps on the Board and areas for improvement. The outcomes of this review are shared with DBT as the Post Office Shareholder. The Government monitors the implementation of the review's recommendations via the Shareholder Representative, UK Government Investments, to support the continuous improvement of the Board.
In line with Post Office’s governance framework, the Government recently approved the Post Office Chair to lead the recruitment of new Non-Executive Directors (NEDs) specialising in technological transformation and organisational design to complement the Board’s existing composition. A further two new Postmaster NEDs should join the Board in the near future, providing an ongoing voice on behalf of the postmaster community.
The UK has over 70 trade agreements in place. APHA facilities and services managing the risks posed by animal diseases to human and animal health are crucial to underpinning the biosecurity of imports and exports under all of them.
Published impact assessments for the UK’s trade agreements can be found on Gov.uk.
The department launched the consultation ‘Improving the energy performance of privately rented homes’ on 7 February 2025, with more information available at the following link:
https://www.gov.uk/government/consultations/improving-the-energy-performance-of-privately-rented-homes-2025-update
As per the department's accessible documents policy, the consultation document is published on GOV.UK and provided as both a tagged PDF and HTML so screen readers can understand the page structure. The digital survey for users to respond to the consultation is also hosted on an accessible platform.
Users of assistive technology (such as screen readers) can request a copy of the consultation in an accessible format by emailing alt.formats@energysecurity.gov.uk. Users who have queries on the consultation can also contact PRSMEESConsultation@energysecurity.gov.uk.
The department launched the consultation ‘Review of the Fuel Poverty Strategy’ on 7 February 2025, with more information available at the following link:
https://www.gov.uk/government/consultations/review-of-the-fuel-poverty-strategy
As per the department's accessible documents policy, the consultation document is published on GOV.UK and provided as both a tagged PDF and HTML so screen readers can understand the page structure. The digital survey for users to respond to the consultation is also hosted on an accessible platform.
Users of assistive technology (such as screen readers) can request a copy of the consultation in an accessible format by emailing alt.formats@energysecurity.gov.uk. Users who have queries on the consultation can also contact fuelpovertyconsultation@energysecurity.gov.uk.
The Department launched the consultation ‘Draft National Policy Statement for nuclear energy generation (EN-7)’ on 6 February 2025, with more information available at the following link: https://www.gov.uk/government/consultations/draft-national-policy-statement-for-nuclear-energy-generation-en-7
As per the department's accessible documents policy, the consultation document is published on GOV.UK and provided as both a tagged PDF and HTML so screen readers can understand the page structure. The digital survey for users to respond to the consultation is also hosted on an accessible platform.
Users of assistive technology (such as screen readers) can request a copy of the consultation in an accessible format by emailing alt.formats@energysecurity.gov.uk. Users who have queries on the consultation can also contact nuclearnps.consultation@energysecurity.gov.uk.
Work continues to progress the UK Severe Space Weather Preparedness Strategy, which was published in September 2021.
The Strategy was developed in close collaboration with the academic community and commits to a series of targeted research and development activities. These activities are currently being delivered through the Space Weather Innovation, Measurement, Modelling and Risk (SWIMMR) programme.
We regularly meet with industry stakeholders, including AI firms, on potential risks that AI poses to businesses and the public.
There are a range of existing rules that already apply to AI systems to address risks, with the UK’s expert regulators empowered to apply rules in their own areas of competence. The government will act where these laws are not enough to ensure safe use.
The CSR Bill updates the UK’s cyber resilience framework set out in the NIS Regulations 2018 and does not impact the UK's key data protection legislation. It includes a range of measures that affect the Information Commission in its capacity as a NIS regulator, but not its capacity as the UK data protection authority.
The European Commission’s draft decision from 24 June 2025 on UK adequacy concludes that the UK continues to provide an essentially equivalent level of data protection. The government does not consider there to be specific developments that pose substantive risks to the EU adequacy decisions being renewed by the EU’s deadline for adoption of 27 December 2025.
DSIT consulted with the Information Commission during the development of the Bill in accordance with its obligations under Article 36(4) of the General Data Protection Regulation.
The Department for Science, Innovation and Technology (DSIT) recognises the importance of robust protections for the services essential to our society and economy. That is why we introduced the Cyber Security and Resilience Bill (CSRB) on 12 November. By enhancing protections for the most important digital services, the Bill will also benefit the Government services that rely on them.
As the digital centre of government, DSIT also recognises that a step change in cyber and digital resilience is required across the government sector. However, we do not need to wait for legislation to take action.
We are acting in parallel with the approach of the CSRB through our mandate to set robust cyber security standards across government organisations. Government services have been subject to the National Cyber Security Centre’s Cyber Assessment Framework since 2022, which promotes resilience against both cyber attacks and the types of system failure that we saw with the Cloudflare outage.
Despite this progress, we are not complacent. DSIT will publish the Government Cyber Action Plan, which will lay out a detailed programme of work with clear expectations, targets, and milestones to enhance Government's cyber and digital resilience.
Officials have worked closely with regulators and the NCSC in developing the Cyber Security and Resilience Bill and will continue to do so throughout its parliamentary passage and implementation planning.
The NCSC already leads the UK’s response to cyber incidents by triaging reports, supporting affected organisations and coordinating government action during major incidents. In the year preceding September 2025, the NCSC received 1,727 incident tips, 429 of which required direct support. The Bill will expand the type of incidents reported to regulators and the NCSC, strengthening understanding of the threat landscape and improving national cyber-defences.
The Bill will also bolster regulator resources by reforming cost recovery. Currently, regulators are constrained – for example, they cannot recover the cost of enforcement. The Bill will enable regulators to fully recover their costs and utilise flexible, sector-appropriate charging mechanisms, ensuring they are properly equipped to meet their duties.
The Department for Science, Innovation and Technology is leading government’s response to the Cloudflare outage which occurred on Tuesday 19 November 2025. We understand that Cloudflare services were restored on Tuesday evening, and DSIT is engaging with Cloudflare to understand the full impact of this incident, and how such events can be mitigated in the future.
DSIT has identified disruption to some online Government services, which were restored within hours of the incident. We are not aware of any disruption to Critical National Infrastructure.
The outage affected a wide range of organisations across all sectors, and it will take some time to fully understand the scale of the economic impact.
The Government recognises the importance of robust protections for the services essential to our society and economy – that is why we introduced the Cyber Security and Resilience Bill on 12 November.
Many government departments use AWS services, and we are aware that the Home Office, DVLA, DWP and HMRC all experienced impacts as a result of the outage on Monday 20 October.
The Department for Science, Innovation and Technology (DSIT) is working with both departments and AWS to better understand the impacts, and will use this to inform future work on government digital resilience.
The cost of the outage is not yet known.
No direct discussions have taken place on these matters. The review has been shared with all departments, and is available for all arms-length bodies and institutions to consider in light of ongoing work in this area.
The Secretary of State has not had any conversations with the Government Office for Technology Transfer (GOTT) on this topic. GOTT helps to accelerate government innovations towards the market to drive growth and deliver new products and services for citizens.
Government, however, welcomes research on this important topic and has engaged with a wide range of stakeholders across industry, academia and civil society to understand the potential for further detection, prevention and removal of harmful and illegal generative AI content and to identify future research priorities.
The government has regular meetings with research and business organisations on matters relating to online safety and AI. The Online Safety Act places duties on platforms to proactively detect, prevent and remove child sexual abuse material (CSAM), including CSAM content created using AI technology. Ofcom has set out steps providers can take for these duties in draft codes of practice and will develop these iteratively. These steps include measures to detect, prevent and remove CSAM. The Act requires Ofcom to consult extensively when drafting its codes and Ofcom has an ongoing programme of research. The first code is due to come into force in Spring 2025.
Government made a clear manifesto commitment to ban the creation of sexually explicit deepfake images and we are bringing forward legislation to honour that commitment in the Crime and Policing Bill.
Under the Online Safety Act, it is already a criminal offence to share or threaten to share a sexually explicit deepfake.
We have designated the most harmful forms of deepfakes as priority illegal content, including child sexual exploitation and abuse and intimate image abuse. Services in scope will need to take proactive steps to prevent priority illegal content from appearing on their service and remove it quickly when it does.
AI-generated content is captured by the Online Safety Act where it constitutes illegal content or content harmful to children on an in-scope service. We will also criminalise the creation of non-consensual sexual deepfakes through the Crime and Policing Bill.
We welcome research on this important topic. DSIT co-led the Deepfake Detection Challenge with the Home Office to assess existing capabilities and identify innovative solutions to overcome the challenges of deepfakes. In addition, we have engaged with a range of stakeholders across industry, academia and civil society to understand the potential for further detection, prevention and removal of deepfake content and identify future research priorities.