Driving innovation that will deliver improved public services, create new, better-paid jobs and grow the economy.
Oral Answers to Questions is a regularly scheduled appearance where the Secretary of State and junior ministers answer questions from backbench MPs at the Dispatch Box.
Other Commons Chamber appearances include Westminster Hall debates, which are held in response to backbench MPs or e-petitions asking a Minister to address a specific issue.
Written Statements are made when a current event is not sufficiently significant to require an Oral Statement, but the House still needs to be informed.
The Department for Science, Innovation & Technology does not currently have any Bills before Parliament.
A bill to make provision about access to customer data and business data; to make provision about services consisting of the use of information to ascertain and verify facts about individuals; to make provision about the recording and sharing, and keeping of registers, of information relating to apparatus in streets; to make provision about the keeping and maintenance of registers of births and deaths; to make provision for the regulation of the processing of information relating to identified or identifiable living individuals; to make provision about privacy and electronic communications; to establish the Information Commission; to make provision about information standards for health and social care; to make provision about the grant of smart meter communication licences; to make provision about the disclosure of information to improve public service delivery; to make provision about the retention of information by providers of internet services in connection with investigations into child deaths; to make provision about providing information for purposes related to the carrying out of independent research into online safety matters; to make provision about the retention of biometric data; to make provision about services for the provision of electronic signatures, electronic seals and other trust services; to make provision about the creation and solicitation of purported intimate images and for connected purposes.
This Bill received Royal Assent on 19th June 2025 and is now law.
e-Petitions are administered by Parliament and allow members of the public to express support for a particular issue.
If an e-petition reaches 10,000 signatures the Government will issue a written response.
If an e-petition reaches 100,000 signatures the petition becomes eligible for a Parliamentary debate (usually Monday 4.30pm in Westminster Hall).
Introduce 16 as the minimum age for children to have social media
Gov Responded - 17 Dec 2024; Debated on - 24 Feb 2025. We believe social media companies should be banned from letting children under 16 create social media accounts.
Departments are responsible for managing their own commercial arrangements in line with procurement regulations and value-for-money principles.
Information on government contracts, including suppliers and contract details, is publicly available through the Find a Tender Service for above-threshold procurements and Contracts Finder for below-threshold procurements in England and other non-devolved territories. For procurements that began before 24 February 2025, only above-threshold notices are published on Find a Tender.
The Medical Research Council (MRC), which is part of UK Research and Innovation (UKRI), has committed a total of over £25.5 million to epilepsy research since 2018/19, including over £9.5 million in 2024/25. This research spans discovery science and fundamental understanding of the disease through to new approaches for diagnosis and intervention. MRC also supports epilepsy research within its portfolio of larger investments. These include a new MRC Centre of Research Excellence (CoRE) in Restorative Neural Dynamics, which aims to develop brain stimulation devices to treat a range of conditions including childhood epilepsy, and the UK data platform for Traumatic Brain Injury research (TBI-REPORTER), which includes post-traumatic epilepsy as one of its areas of focus.
The Department of Health and Social Care also funds research through the National Institute for Health and Care Research (NIHR). The NIHR has funded a range of ongoing epilepsy research and has awarded £12.8 million to studies in the last five financial years. The NIHR continues to welcome funding applications for research into any aspect of human health and care, including alternative treatments for epilepsy.
The government consulted on several topics relating to the interaction between copyright and artificial intelligence (AI). We have carefully analysed the responses and continue to engage extensively on this issue, including through technical working groups.
The government published a progress update on 16 December 2025 and will publish a report on the use of copyright works in the development of AI systems, together with an economic impact assessment, by 18 March 2026.
Data centres are vital to the UK’s prosperity and security and underpin our digital economy and AI ambitions. We have taken decisive action and work closely with industry to monitor and mitigate potential future threats to data centres.
Last year we designated data centres as Critical National Infrastructure and are legislating through the Cyber Security and Resilience Bill to introduce proportionate regulatory oversight in the sector. The National Security and Investment (NSI) Act 2021 also gives the Government powers to intervene in or block investments and other acquisitions in the UK economy that could harm national security, and data infrastructure is one of the 17 mandatory areas of the economy requiring a notification to Government.
The Department for Science, Innovation and Technology will spend up to £42 million across the 2026/27 financial year and £187 million over the next four years delivering the TechFirst programme. As announced by the Prime Minister in June 2025, TechFirst is designed to strengthen the UK’s domestic tech talent pipeline by improving the IT and digital skills of children in secondary schools, as well as undergraduate, master’s, and PhD students. The TechFirst programme also includes a grant fund to help skilled individuals into work.
I refer the hon. Member to the answer given on 25 November 2025 to Question UIN 91769.
The Medical Research Council (MRC), which is part of UK Research and Innovation (UKRI), funds a broad portfolio of health research, including researcher-led proposals using combinatorial genomic analysis. MRC has prioritised research into Myalgic Encephalomyelitis/Chronic Fatigue Syndrome (ME/CFS) for many years, investing over £4.65 million since 2020, and continues to welcome high-quality applications in this area.
UKRI supports collaboration between Government-funded bodies and private sector researchers across its councils and Innovate UK. This includes funding the LOCOME study led by Precision Life, through Innovate UK’s Advancing Precision Medicine programme, which supports the development of digital and data tools to improve diagnosis and treatment. MRC also enables academic-industry partnerships through its Industry Collaboration Framework.
UKRI does not typically maintain disease-specific research strategies, instead providing open funding routes for the most impactful research across disciplines. Targeted work can be supported where needed. For example, in 2020, the National Institute for Health and Care Research (NIHR), the Scottish Government and MRC funded the James Lind Alliance Priority Setting Partnership to identify ME/CFS research priorities.
The Labour Manifesto commits to “partner with scientists, industry, and civil society as we work towards the phasing out of animal testing”. The Government consulted industry, academia and civil society during the development of the recent Replacing Animals in Science strategy and will continue to do so during strategy implementation. This includes collaboration with civil society organisations with expertise in this area, including animal welfare organisations and learned societies, and other interested groups.
Under the Online Safety Act, Ofcom has a duty to publish a report on the role of app stores in children accessing harmful content on the apps of regulated services. The report will also assess the use and effectiveness of age assurance on app stores. This report is due by January 2027.
Following consideration of Ofcom’s report, the Secretary of State has a delegated power to apply duties on app stores, which may include greater use of age assurance.
Ofcom’s call for evidence to inform this report closed on 1 December. The government will consider next steps in due course.
Under the Online Safety Act, platforms must protect all users from illegal harassment and children from harmful content, including hateful and abusive content. These duties are now in force and Ofcom conducts regular surveys to track user experiences. DSIT and Ofcom are developing a longer-term evaluation framework to assess the Act’s impact.
Additional duties will require the largest services to offer adults optional tools to reduce their exposure to legal but abusive content. In October, the Secretary of State wrote to Ofcom and asked it to use all its levers to tackle hateful content online and maintain urgent momentum in implementing these remaining duties.
The Government takes animal welfare very seriously. Under the Online Safety Act 2023, platforms must remove illegal content swiftly, including material promoting or facilitating animal torture, which is a designated priority offence. Services must also implement systems and processes to protect children from harmful depictions of animal cruelty, even where the content is not illegal. Ofcom, as the independent regulator, enforces these duties and can issue fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.
The Net Zero target in the Climate Change Act 2008 is a target for the whole of the UK, not individual departments or arm’s-length bodies.
Greening Government Commitments are the central framework setting out the actions UK government departments and their agencies will take to reduce their impacts on the environment, including setting targets to reduce emissions, during the framework period.
Defra are reviewing the Greening Government Commitments to ensure that they remain aligned with government priorities.
‘Freedom from Violence and Abuse: a cross-government strategy to build a safer society for women and girls’ commits to creating a joint team to address the issues in Baroness Bertin’s Review. The team will be formed by the Home Office, Department for Science, Innovation and Technology, Ministry of Justice and Department for Culture, Media and Sport. It will examine the evidence to inform the government’s approach to pornography policy.
The Government has already taken action. Pornography showing strangulation or suffocation will be criminalised under the Crime and Policing Bill and will be a priority offence under the Online Safety Act.
The Online Safety Act contains robust provisions to protect young people from online abuse.
Under the Online Safety Act, platforms must protect all users from illegal content, including illegal harassment, and protect children from harmful content, including hateful and abusive content. These provisions are already in force, and Ofcom has robust enforcement powers for platforms that fail to fulfil their duties.
The Act will also require the largest categorised services to offer adults user empowerment tools to enable them to reduce engagement with abusive content. Ofcom will be consulting on these user empowerment tools this year.
We continually monitor the Act’s impact and effectiveness to ensure all users are protected online.
On 4 December, Ofcom released a summary of the tech sector's response to the UK's new online safety rules. While there has been progress, further action is needed, including by major services. Ofcom has our full backing in using all available powers to protect users.
The Government also continues to go further, announcing that self-harm, cyberflashing and strangulation in pornography will be priority offences under the Act, ensuring platforms take proactive action to tackle this content.
Ministers and officials meet Ofcom regularly to discuss online safety, and we continue to monitor outcomes through our joint evaluation programme.
AI Growth Zones (AIGZs) are a national mission to give the UK the world-class infrastructure it needs to lead in artificial intelligence, unlock billions in private investment, and drive long-term economic growth.
Following a formal application process, we have confirmed four AI Growth Zones located in Culham, the North East Combined Authority, North Wales, and South Wales. We will continue to review applications and carry out targeted site engagement to confirm future AIGZ locations in due course.
On 13 November 2025 DSIT announced a suite of new policies and reforms to enable AI infrastructure and AI Growth Zones, supporting access to energy, reducing planning barriers, and tackling energy costs. The full publication, Delivering AI Growth Zones, is available on GOV.UK.
The UK Science and Technology Framework provides a holistic picture of the ten critical levers that the UK Government can use to drive growth and improve the lives of citizens through science and technology. We remain committed to the Framework and to applying these levers to ensure science and technology supports the delivery of core priorities, such as the Plan for Change and the Industrial Strategy.
All Northern Ireland City Growth Deal project business cases are reviewed by UK Government departments to ensure strategic alignment with wider government priorities. This includes officials in both DSIT and UKRI, who assess strategic alignment with the UK Government’s priorities for science and technology, including the UK Science and Technology Framework.
This process helps to ensure that City Growth Deals across Northern Ireland have taken account – at a local delivery level – of the UK Government’s priorities across science and technology, as set out in the Framework.
The UK already has a range of statutory frameworks that apply to AI. Existing rules that apply to AI systems include data protection, competition, equality legislation, the copyright framework, and other forms of sectoral regulation.
In February 2025 we published the Digital Inclusion Action Plan, which set out the first five actions the Government is taking to boost digital inclusion across the UK. This includes expanding opportunities for digital upskilling and strengthening support for the Essential Digital Skills framework, which helps individuals and employers understand and build the digital skills needed for work and everyday life. As part of the Action Plan, DSIT launched an £11.9 million Digital Inclusion Innovation Fund, supporting 85 projects across England, with funding also allocated to the devolved governments to support further projects in Scotland, Wales and Northern Ireland.
Through the Government’s digital entitlement, eligible adults can access fully funded essential digital skills courses and qualifications, supporting people to get online safely and confidently, improve their employability, and access public services. To increase awareness and take-up, DSIT has also launched marketing activity running as part of DfE’s ‘Skills for Life’ campaign, working with partners across the public, private and voluntary sectors.
Alongside this, DSIT is delivering the £187 million TechFirst programme that will support over 4,000 domestic graduates, researchers and innovators and engage 1 million students in digital skills and AI learning. These measures support people at all stages of life to develop the digital skills they need to participate fully in the digital economy and society.
The Government’s strategy to support replacing animals in science commits to publishing a biennial list of alternative-methods research and development priorities, coalescing UK scientists around these areas and incentivising partnerships between research organisations, contract research organisations (CROs) and industry. These will be published during 2026 following development with stakeholders as part of the implementation of the strategy.
The government is clear that no one should have to go through the ordeal of seeing intimate images of themselves online.
There are no excuses not to act, and services must deal with this urgently. Ofcom are looking into this as a matter of urgency, and they have the government’s full backing to take any necessary enforcement action.
Services and operators have a clear obligation to act appropriately. This is not about restricting freedom of speech but upholding the law.
I meet regularly with civil society, industry and Ofcom to discuss online safety, including the risks of AI chatbots to children.
AI services allowing users to share content with one another or that search the live web are covered under the Online Safety Act and have a duty to protect users from illegal content, and children from harmful content.
I have already asked officials to investigate how the Act covers AI chatbots and I am considering what more can be done.
The Government is committed to ensuring the security and resilience of the UK’s telecommunications networks and services, including supporting informed purchasing decisions by businesses and other organisations.
The Telecommunications (Security) Act 2021 (TSA) amended the Communications Act 2003 to establish a robust security framework for UK public telecoms networks and services. Ofcom provides guidance to businesses purchasing telecommunications services and enforces protections for business customers through the General Conditions of Entitlement, which all telecommunications operators must meet to provide services in the UK.
The Government will continue to keep the security of telecoms networks under review.
The Telecommunications (Security) Act 2021 (TSA) established a robust security framework for UK public telecoms networks and services, placing new legal duties on public telecoms providers to identify and mitigate security risks. The framework is designed to ensure that security is embedded within the networks and services, so those using them can have confidence in their security.
DSIT works with Ofcom and the National Cyber Security Centre (NCSC) to ensure providers are aware of their obligations. Ofcom produces annual security reports for the Secretary of State on providers’ compliance with their obligations in the Act, and their progress against the guidance measures set out in the accompanying Telecommunications Security Code of Practice. These measures have staggered implementation timeframes based on factors such as their complexity and cost.
The first report was published on GOV.UK in January 2025, and is available to organisations purchasing telecommunications services. The report helps the Government monitor compliance approaches across the sector, including progress against guidance measures in the Code of Practice as they fall due.
The Government is committed to ensuring the security and resilience of the UK’s telecommunications networks and services. This includes regular assessment of security and resilience risks relating to such networks and services.
The Telecommunications (Security) Act 2021 (TSA) amended the Communications Act 2003 to establish a robust security framework for UK public telecoms networks and services, placing new legal duties on public telecoms providers to identify and mitigate security risks.
Some essential services may use private telecoms networks outside the scope of the TSA. However, under the Network and Information Systems (NIS) Regulations 2018, operators of essential services are required to manage risks to those services resulting from their use of such networks. In addition, the National Security and Investment Act 2021 includes powers to scrutinise and, if necessary, intervene in foreign acquisitions or investments in the UK telecoms sector that may pose national security risks.
The UK Government also works closely with international partners to promote the adoption of appropriate and proportionate telecoms security regulations in other countries.
Residents in rural areas, as well as other areas of the country, rightly expect to have reliable mobile connectivity to participate in the modern digital economy. The Government recognises that events like storms and power outages can have a particular impact on rural communities.
Mobile network operators have legal obligations to put in place appropriate and proportionate measures to ensure the resilience of their networks and services. This is overseen by the independent regulator Ofcom, who have powers to monitor compliance, conduct investigations, issue penalties and enforce remedial actions.
Ofcom have completed a public consultation on power back-up for mobile services across the UK, which identified a particular impact on rural communities. They published an update on their work in February and announced that they are completing further analysis to determine the appropriate and proportionate measures required to ensure adequate resilience for consumers. The Government will consider this analysis carefully. The Government is also supporting collaboration between the electricity and telecommunications sectors to deliver measures so that when power cuts occur the likelihood of disruption to telecommunications services is as low as possible, and where disruption does occur it affects as few people as possible for the shortest possible time.
The Digital Economy Act 2017 (DEA) contains data sharing powers that allow specified authorities to share information, including personal data, for specific purposes. Anyone sharing information under Chapters 1-4 of Part 5 of the DEA is required to have regard to the relevant Code of Practice when doing so. This states that those authorities, listed in Schedules 4-8 and Chapter 2 of the DEA, should enter into an information sharing agreement (ISA) when sharing data under these powers.
The codes of practice provide details to practitioners on how information sharing powers under the DEA must be operated. Those relating to public service delivery (PSD), debt and fraud and civil registration place a requirement on the Data Controller(s) to set out information about their ISA within a publicly available register.
The register, operated by Government Digital Service (GDS) and publicly available on GOV.UK, provides a central repository of all data shares made under the powers provided by Chapters 1-4 of Part 5 of the DEA. It is a key transparency measure which outlines details of each data share, including the bodies involved, why it is shared, for how long and the expected benefits.
The register currently contains 525 entries involving 464 public bodies. It is available at https://www.digital-economy-act-register.data.gov.uk.
While GDS is responsible for maintaining the register, the DEA’s statutory Code of Practice makes clear that responsibility for the accuracy of register entries rests with the public authorities involved in each data share, except in relation to the debt and fraud provisions, where responsibility falls under the debt and fraud secretariat.
The government takes the security and integrity of our democratic processes very seriously, including the risks posed by AI-generated content. While recent UK elections did not see AI-generated content used at the scale or sophistication anticipated, this remains an important issue.
The Online Safety Act requires in-scope services to mitigate risks from illegal disinformation, including AI-generated content, relevant to elections (e.g. false communications). Media literacy is also part of our wider approach, building public resilience to mis- and disinformation.
The department also engages through the government’s Defending Democracy Taskforce, which is committed to safeguarding the UK from the full range of threats to democracy, including those from AI.
AI Growth Zones will bring thousands of new jobs and millions of pounds in investment right to the places that need it most.
In North Wales, we anticipate 3,450 jobs will be created, and in South Wales we expect at least 5,000 jobs will be created, spanning construction and temporary roles as well as high-skilled engineering and technical roles.
Job creation will commence as infrastructure works progress, with full delivery of this infrastructure projected by the early 2030s.
The United States is our close ally and tech partner, and we are committed to ensuring that this bond delivers real benefits for hardworking people on both sides of the Atlantic.
We look forward to resuming work on this partnership with the US as quickly as we can to achieve that, and to working together to help shape the emerging technologies of the future.
Most recently, we were pleased to announce advances in how we share cutting-edge UK and US quantum research, as well as TAE Technologies and the UK Atomic Energy Authority’s joint venture partnership to commercialise fusion technology in the UK.
The government is committed to ensuring that the adoption of artificial intelligence across the public sector is safe, effective, efficient and ethical. This work is guided by the AI Opportunities Action Plan and the AI Playbook for Government, which provide departments and public sector organisations with accessible technical guidance on the responsible use of AI.
The AI Playbook includes ethical and legal guidance for all civil servants on how to use AI safely and responsibly. This covers data protection, privacy, cybersecurity and sustainability, alongside the principles set out in the government’s pro-innovation approach to AI regulation. Departments are required to follow existing civil service-wide standards and policies, such as the Algorithmic Transparency Recording Standard, to ensure compliance and maintain accountability when deploying AI systems.
The Government, working with Ofcom, closely monitors the financial health of the telecoms market. Ofcom have powers to request financial information from providers where appropriate.
We recently held a public consultation on proposed updates to the Telecommunications Security Code of Practice, which provides guidance on how public telecoms providers can meet their statutory requirements to secure their networks and services. These include requirements relating to reviews, governance and board responsibilities. Ofcom monitor and enforce these requirements.
In response to the consultation, the Chartered Institute of Internal Auditors raised the matter of independent assurance arrangements. We are now carefully reviewing all feedback to the consultation to ensure that any updates to the Code of Practice are appropriate and proportionate.
The rollout of 5G infrastructure is commercially driven and the government does not hold data on where, or when, future rollout of mobile infrastructure will take place.
Government has a clear ambition for all populated areas to have higher quality 5G standalone connectivity by 2030. All three mobile network operators have committed significant investment across the UK working towards achieving this.
Ofcom’s Connected Nations Annual Report 2025 (published November 2025), which shows coverage as of July 2025, reports that 5G coverage is already present outside 91% of premises across Wales and that standalone 5G is available outside 59% of premises.
The non-binding Memorandum of Understanding between DSIT and Google DeepMind establishes a partnership for collaboration to support delivery of the government’s AI Opportunities Action Plan. This includes concrete initiatives such as priority access for UK scientists to AI tools; deepening collaboration with the AI Security Institute on AI safety and security research; and support for the development of AI-ready datasets in strategically important domains such as fusion energy.
The automated lab announced alongside the MoU is an independent Google DeepMind initiative, fully funded by Google DeepMind. The UK Government is not involved in operating or funding the lab.
The partnership with Google DeepMind will support DSIT’s efforts to explore how AI can improve productivity and service delivery across government. However, any use of AI in public services will be subject to the highest standards of safety and security, including the Data Protection Act 2018 and UK GDPR, the Government’s Data Ethics Framework, and relevant departmental assurance and security processes.
The Government is committed to ensuring the trusted and fair use of AI.
Through the AI Opportunities Action Plan, we committed to taking steps to drive responsible adoption of AI across sectors. This includes establishing the AI Assurance Innovation Fund. We are investing £11 million in the fund and convening a national consortium of expert stakeholders to support the quality and growth of the AI assurance market.
The Government has also published guidance on Responsible AI in Recruitment. This focuses on good practice for the procurement and deployment of AI systems for HR and recruitment. It identifies key questions, considerations, and assurance mechanisms that may be used to ensure the safe and trustworthy use of AI in recruitment.
The government is actively considering what exceptions could be made to regulation 6, and we shall update the House in due course.
Any regulations would be developed and drafted by the Department for Science, Innovation and Technology. The Information Commissioner’s Office (ICO) will publish recommendations for the government on this issue. The Government will consult the ICO and other interested stakeholders on the development of any regulations, as we are legally required to do by the provisions in section 112(3) of the Data (Use and Access) Act 2025.
A range of regulation and legislation applies to AI systems such as data protection, equality legislation and sectoral regulation. Where AI systems contravene or are non-compliant with those rules, enforcement and mechanisms for redress will apply. The government is committed to supporting regulators to promote the responsible use of AI in their sectors including identifying and addressing bias.
To further tackle this issue, the government ran the Fairness Innovation Challenge (FIC) with Innovate UK, the Equality and Human Rights Commission (EHRC), and the ICO. The FIC supported the development of novel solutions to address bias and discrimination in AI systems and supported the EHRC and ICO to shape their own broader regulatory guidance.
The Government’s new strategy sets out our long-term vision for a world where the use of animals in science is eliminated in all but exceptional circumstances, achieved by creating a research and innovation system that drives the development and validation of alternative methods to using animals in science. We will provide regular updates on strategy delivery including through a publicly available dashboard. Recognising that the legal framework in the UK already requires that animals are only ever used in science where there are no validated alternatives available, the government currently has no plans to legislate further on this matter.
The government takes tackling cyberbullying and online grooming extremely seriously.
Under the Online Safety Act, services must put in place measures to mitigate the risk of illegal activity, including grooming, and protect children from harmful content, such as bullying.
Ofcom recommends measures services can take to fulfil their duties in Codes of Practice, including using hash matching to detect and remove child sexual abuse material. Ofcom can introduce new measures in future iterations of the Codes.
On 18 December, the government published its Violence Against Women and Girls Strategy, including a world-leading ban on nudification apps. This government will not allow technology to be weaponised to humiliate and exploit women and girls.
I meet regularly with civil society, industry and Ofcom to discuss online safety, including the risks of AI chatbots.
AI services allowing users to share content with one another or that search the live web are covered under the Online Safety Act and have a duty to protect users from illegal content, and children from harmful content.
To build on this, I have made encouraging self-harm a priority offence under the Act and in-scope chatbots will need to have measures in place to prevent users from encountering this content.
The Government identifies and assesses risks to the nation through the internal, classified National Security Risk Assessment, and the external National Risk Register, the most recent version of which was published in August.
As set out in the UK Government Resilience Framework, each risk in the National Security Risk Assessment is owned and managed within Lead Government Departments.
Where those risks, including national security risks, relate to the work of the Department for Science, Innovation and Technology (DSIT), they are managed through the department’s risk management processes. Within DSIT, risks are regularly reported to the department’s Senior Leadership Team, chaired by the Permanent Secretary, and scrutinised on a regular basis by the Audit and Risk Assurance Committee (ARAC).
The Secretary of State has had no discussions with social media companies on this matter.
The Gas Safety (Installation and Use) Regulations 1998 make it a criminal offence for anyone who is not on the Gas Safe Register to carry out gas work in domestic properties.
The Advertising Standards Authority requires all advertising to be legal and socially responsible. It is working with online platforms which have signed up to its Intermediary and Platform Principles to encourage compliance with the advertising codes online.
The Online Advertising Taskforce, chaired by the Minister for Creative Industries, Media and Arts, is also working to improve transparency and accountability in the online advertising supply chain.