Driving innovation that will deliver improved public services, create new, better-paid jobs and grow the economy.
Oral Answers to Questions is a regularly scheduled appearance at which the Secretary of State and junior ministers answer questions from backbench MPs at the Dispatch Box.
Other Commons Chamber appearances include Westminster Hall debates, which are held in response to backbench MPs or e-petitions asking a Minister to address a particular issue.
Written Statements are made when a current event is not sufficiently significant to require an Oral Statement but the House must still be informed.
The Department for Science, Innovation & Technology does not have any Bills currently before Parliament.
A bill to make provision about access to customer data and business data; to make provision about services consisting of the use of information to ascertain and verify facts about individuals; to make provision about the recording and sharing, and keeping of registers, of information relating to apparatus in streets; to make provision about the keeping and maintenance of registers of births and deaths; to make provision for the regulation of the processing of information relating to identified or identifiable living individuals; to make provision about privacy and electronic communications; to establish the Information Commission; to make provision about information standards for health and social care; to make provision about the grant of smart meter communication licences; to make provision about the disclosure of information to improve public service delivery; to make provision about the retention of information by providers of internet services in connection with investigations into child deaths; to make provision about providing information for purposes related to the carrying out of independent research into online safety matters; to make provision about the retention of biometric data; to make provision about services for the provision of electronic signatures, electronic seals and other trust services; to make provision about the creation and solicitation of purported intimate images and for connected purposes.
This Bill received Royal Assent on 19th June 2025 and is now law.
e-Petitions are administered by Parliament and allow members of the public to express support for a particular issue.
If an e-petition reaches 10,000 signatures the Government will issue a written response.
If an e-petition reaches 100,000 signatures the petition becomes eligible for a Parliamentary debate (usually Monday 4.30pm in Westminster Hall).
We want the Government to repeal the Online Safety Act.
Introduce 16 as the minimum age for children to have social media
Government responded 17 Dec 2024; debated 24 Feb 2025. We believe social media companies should be banned from letting children under 16 create social media accounts.
The Department for Science, Innovation and Technology keeps the commencement and implementation of its legislation under review alongside operational readiness and delivery of wider priorities. This work is undertaken alongside established post‑legislative scrutiny processes.
DSIT sought academic advice in designing the pilot study. The Government Chief Scientific Adviser, Dame Angela McLean, convened a roundtable of senior academics, alongside Chief Scientific Advisers from the FCDO, DfE and the College of Policing.
This advice included consideration of sample size. The pilots are designed as a qualitative social research study: thorough, but not statistically representative. Through 300 interviews with teenagers and their parents, drawn from varied perspectives, we aim to gather first-hand insights into their experience of social media.
DSIT worked closely with our delivery partner, Savanta, to design the study to established ethical standards, including securing informed consent from participants, the right of withdrawal, appropriate safeguarding arrangements, and data protection and confidentiality measures throughout.
The UK GDPR and the Data Protection Act impose obligations on data controllers, including age verification services, to process data fairly, lawfully and transparently. The UK’s data protection legislation has extraterritorial scope, applying to organisations offering goods or services to, or monitoring the behaviour of, data subjects within the UK.
The Information Commissioner’s Office can investigate any concerns raised about the misuse or mishandling of data.
Ofcom and the ICO recently issued a joint statement on age assurance to provide greater clarity on how services can meet their obligations under the OSA and UK data protection legislation.
The Natural Environment Research Council (NERC), part of UK Research and Innovation (UKRI), is reviewing the value for money of its infrastructure investments to ensure maximum impact for the UK and to transition its atmospheric science infrastructure to more flexible, scalable and sustainable technologies. As part of this, NERC has decided to cease funding the Facility for Airborne Atmospheric Measurements (FAAM) aircraft, operated by the National Centre for Atmospheric Science (NCAS), at the end of this financial year, with orderly decommissioning taking place in FY 2026/27.
While there are some aspects of atmospheric science that can only be done with an aircraft, the future direction of atmospheric science increasingly favours distributed observing systems, land-based capability, uncrewed aerial vehicles (UAVs), and advanced sensor technologies that offer lower emissions, greater responsiveness and improved cost‑effectiveness through scalability.
NERC and UKRI have already begun investing in these areas, including a Net Zero Aerial Capability scoping programme (in collaboration with Innovate UK) on UAV development, as well as committing additional investment to NCAS’ Atmospheric Measuring and Observation Facility (AMOF) equipment pools. NERC will also invest £1 million in Financial Year 2026/27 to further explore autonomous capabilities, with the intention of scaling successful approaches.
NERC is engaging closely with affected staff and institutions to retain expertise within the wider atmospheric science system (including weather, climate and air quality research) wherever possible. Much of the FAAM equipment will be repurposed and will continue to require skilled operators, helping to maintain capability and minimise impacts on the skills pipeline.
The Government is committed to ensuring that the involvement of private technology companies in the handling of sensitive data held by public authorities and regulators is subject to robust data protection, accountability, and transparency safeguards. All departments undertaking work involving personal data are required to conduct Data Protection Impact Assessments to ensure appropriate privacy, security, and fairness measures are in place. Where private‑sector tools, including algorithmic or AI‑enabled systems, are procured or used, departments must apply mandatory transparency standards and clearly document how such tools are embedded in decision‑making processes, their technical specifications, and relevant risk mitigations.
At a cross‑government level, the Government Digital Service (GDS), within the Department for Science, Innovation and Technology, is strengthening central coordination and oversight of data protection and privacy risks across government. This includes setting consistent standards, supporting departments on the responsible adoption of new technologies, and working closely with the Information Commissioner’s Office to raise data protection and information security standards across the public sector.
These measures are intended to ensure that the use of private technology companies supports innovation and improved public services, while maintaining high standards of data protection, accountability and public trust.
Ofcom data indicates that children aged 8-14 spend an average of 2 hours and 59 minutes online per day across smartphones, tablets and computers. This equates to 20 hours and 53 minutes per week and 1,088 hours and 55 minutes annually.
For adults, Ofcom found that average daily smartphone time is 3 hours and 28 minutes. This is equivalent to 24 hours and 16 minutes per week, and 1,265 hours and 20 minutes annually, on average.
The children’s measure will be an overestimate, since it includes tablets and computers as well as smartphones, whereas the adult measure covers smartphones only; it is nevertheless the most robust measure available.
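The weekly and annual figures above follow arithmetically from the daily averages; a minimal sketch (not part of the original answer, and assuming a 365-day year) reproduces the conversion:

```python
# Reproduce the quoted weekly and annual totals from the daily averages.
def to_h_m(total_minutes):
    """Convert a minute count to an (hours, minutes) pair."""
    return divmod(total_minutes, 60)

children_daily = 2 * 60 + 59   # 2h 59m per day, ages 8-14, all devices
adults_daily = 3 * 60 + 28     # 3h 28m per day, smartphones only

for label, daily in [("children", children_daily), ("adults", adults_daily)]:
    weekly = to_h_m(daily * 7)
    annual = to_h_m(daily * 365)
    print(label, weekly, annual)
# children: weekly (20, 53), annual (1088, 55)
# adults:   weekly (24, 16), annual (1265, 20)
```

Both sets of published figures are internally consistent on this basis.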
We are committed to ensuring the UK is the leading adopter of AI in the G7, empowering British workers and businesses to seize its benefits by creating more rewarding jobs, increasing productivity and driving growth in our leading sectors.
AI assurance enables consumers to be confident that the products they buy will work as intended, which is why the Government is taking steps to build the AI assurance ecosystem that underpins safe deployment of AI, as set out in the Roadmap to Trusted Third-Party AI Assurance. This includes establishing the Centre for AI Measurement, led by the National Physical Laboratory, to accelerate the development of new, innovative AI assurance techniques.
The law also requires that all consumer products must be safe before they are placed on the market. The Office for Product Safety and Standards and local authority trading standards have enforcement powers across product safety regulations to take non-compliant or unsafe products off the UK market. The product safety framework will better respond to emerging risks posed by digital technologies, including AI-enabled and smart products, ensuring innovation does not come at the expense of consumer safety.
The Online Safety Act requires user‑to‑user services to assess risks of different kinds of illegal harm on their platforms, including child sexual exploitation and abuse, grooming, intimate image abuse and extreme pornography, and to take proportionate steps to mitigate those risks, including where they are facilitated through groups.
Services must also have effective systems and processes to prevent, detect and act against illegal content and activity, both proactively and in response to notifications. Ofcom, as the independent regulator, sets out expected measures in statutory codes of practice, which came into force in July 2025, including on proactive technologies such as hash‑matching.
Ofcom imposed seven penalties in 2024 and 20 penalties in 2025. A breakdown is provided in Ofcom’s financial penalties publication: https://www.ofcom.org.uk/siteassets/resources/documents/about-ofcom/annual-reports/2024-25/section-400-licence-fees-and-penalties-accounts-2024-2025.pdf?v=400015. Ofcom does not routinely publish information giving specific dates for when companies have paid their fines.
Project Gigabit is the government’s programme to deliver gigabit-capable broadband to UK premises that are not included in suppliers' commercial plans.
As part of Project Gigabit, Openreach is delivering a contract to extend coverage to hard-to-reach areas of Surrey. This contract currently includes approximately 1,900 premises in the Surrey Heath constituency, of which almost 900 premises have already been given coverage.
We aim to cover the remaining premises that are not currently included in Project Gigabit or suppliers’ commercial delivery plans as funding becomes available, in line with the objective of achieving nationwide gigabit coverage by 2032.
Both the gas boilers at the Intellectual Property Office warehouse facility and the air‑conditioning units had reached the end of their operational life and required replacement. They have been replaced with more energy‑efficient systems to improve performance and reduce energy consumption. Further detail on capital costs and estimated energy savings is commercially sensitive and cannot be provided.
The AI Skills Hub contract has received £4 million to cover a range of activities, including building and maintaining the Hub, engagement and research to inform course curation and approach, ongoing outreach to drive uptake, gathering feedback for improvement, and supporting business upskilling and AI adoption. As of 19 March 2026, there were 55,952 registered learners on the AI Skills Hub. The Hub also hosts the AI Skills Boost programme, which has delivered over 1 million AI upskilling courses since June 2025 in partnership with leading industry organisations.
Value for money will be assessed through a comprehensive framework that tracks delivery, outcomes and long-term economic impact. This includes metrics on user engagement, platform usage and training uptake alongside survey-based evidence of improved skills, employability and access to AI resources. It also measures productivity gains such as cost savings, time efficiencies and increased AI adoption within organisations.
The government has supported British companies to develop capabilities to clean up debris through both grants and contracts, including Astroscale UK, ClearSpace UK and their respective subcontractors. The procurement process for a single supplier to deliver a research and development contract to remove two defunct UK satellites from orbit is ongoing. Protecting the outer space environment is a priority for the government, so further funding opportunities for British companies to compete for grants will become available in due course.
The Government’s priority is to secure assured access to space for the United Kingdom.
The Government is supporting the development of an operational UK spaceport and a competitive launch market. SaxaVord Spaceport in Shetland is Europe’s first licensed vertical launch site and is expecting multiple launches in 2026.
We will work with launch companies that can meet our assured access objectives to develop reliable, secure, and commercially competitive access to space. We will also develop and strengthen existing partnerships with our NATO and European allies.
As with all Government investments, appropriate financial, technical and legal due diligence was undertaken before funding decisions were made. The Government does not routinely publish internal assurance or due diligence material, which is commercially sensitive.
The Government provided financial support to Orbex through a combination of direct investment and grant funding to support the development of its Prime launch vehicle.
From 1 April 2026, the key responsibilities of the UK Space Agency will continue to include growing the UK’s space sector, working closely with industry and driving successes and opportunities.
This will be a key element of the UK Space Agency’s overarching remit: to set the national direction on space, including cohering policy, strategy and delivery across the whole of government, and to lead delivery of innovation and world-class science programmes in partnership with the sector and international partners.
Ofcom has various powers under the Online Safety Act to obtain information from services, including information regarding their algorithms. However, Ofcom does not publish, or provide to DSIT, details of how many information notices it has issued.
In March 2026, Ofcom provided an update on its work on safer feeds for children. To inform its assessment of these systems, Ofcom issued legally binding information requests to large platforms and has said it will publicly report on the responses in May.
The Government recognises that the safe, reliable and accountable use of artificial intelligence is important to maintaining public trust in public services.
Departments deploying AI systems are expected to consider risks and impacts throughout the system lifecycle, including during design, development, deployment and operation. This includes compliance with safety, transparency, accountability and data protection rules and regulations.
The Government has published guidance to support this, including the Data and AI Ethics Framework, the AI Playbook for Government and the AI Knowledge Hub, which together provide advice on governance, risk management, testing and oversight.
In addition, the Department for Science, Innovation and Technology has published guidance on AI assurance, and a cross‑government AI Testing and Assurance Framework supports proportionate testing, evaluation and ongoing monitoring.
AI‑enabled services are also expected to meet the GOV.UK Service Standard, including demonstrating that they are safe, secure, reliable and well‑governed.
The Government prioritised the commencement of the Competition and Markets Authority’s (CMA) new powers in digital markets last year to boost competition and fairness in the digital tech sector. Although the CMA operates independently of Government, the Government gave a clear steer for the CMA to use these new powers collaboratively and proportionately.
In March, the CMA announced a package of actions to strengthen competition in business software and cloud services. This includes a Strategic Market Status investigation into Microsoft’s business software under the UK’s digital markets regime, alongside voluntary actions from Amazon and Microsoft that will improve interoperability, reduce data egress fees and make switching easier in cloud services. Taken together, these steps aim to address identified concerns and support a more competitive, resilient cloud market in the UK.
The ‘Childhood in the Age of AI’ summit will be attended by a diverse group of representatives from civil society, industry, government and representatives of young people. It will address the impacts of AI on children and young people across a wide range of domains, such as education, wellbeing, development and safety. The discussions will not be restricted to any age group.
This work forms part of the government’s work to hear directly from parents and young people across the UK through our National Conversation on children’s and young people’s wellbeing online.
Data centres are foundational infrastructure for a modern, competitive UK economy, enabling the digital services that underpin productivity across numerous sectors, from financial services and advanced manufacturing to public services and the creative industries. By enabling artificial intelligence, cloud computing and data-intensive services, data centres generate productivity gains across the wider economy and reinforce the UK’s attractiveness as a crucial destination for investment.
Tech UK has estimated that UK data centres contribute £4.7 billion in gross value added each year and support tens of thousands of high-quality jobs across construction, operations and specialist supply chains. Operational employment is generally highly skilled and well paid, with wider employment supported through demand for electrical engineering, cooling, digital infrastructure and maintenance services.
HMG’s AI Growth Zone programme will unlock significant private investment and secure compute to drive AI growth, supporting high‑value local jobs and skills. HMG will also invest up to £5 million per Growth Zone, working with local areas to design tailored schemes to realise local economic benefits and boost AI adoption in local communities.
On 11th November 2025 the government published “Replacing animals in science: A strategy to support the development, validation and uptake of alternative methods”, which outlines the steps we will take to achieve this. The strategy is available on GOV.UK.
Sepsis is a complex and multifaceted condition, and its study presents significant scientific challenges. We will consider sepsis during the development of our areas of research interest list to determine the best path forward for new model development that drives scientific innovation, supports improved therapy development, and reduces reliance on animals.
The Government Office for Science commissioned an evidence review by an external academic group to synthesise the latest published literature on misinformation. It is a pre-registered study looking at the existing published evidence and is not therefore seeking direct contributions from organisations. The review will be published in due course. The findings from this desktop exercise will inform government's thinking on identifying and tackling harmful misinformation.
All organisations processing personal data in the UK must comply with the UK’s data protection framework, including the UK GDPR, regardless of where they are headquartered. This includes requirements that apply when personal data is transferred overseas, and organisations must ensure that appropriate safeguards are in place where required.
The UK has world-leading investigation and enforcement capabilities to ensure that data is collected and handled responsibly and securely. The Information Commissioner’s Office has powers to investigate, issue fines and require corrective action where organisations fail to comply with the UK’s data protection framework, and individuals may seek redress if their data is misused.
As threats to UK data evolve, our response will be agile and proportionate. We actively monitor threats to UK data and will not hesitate to take further action if necessary to protect our national security.
AI has huge potential benefits, but can also bring new risks, including new opportunities for criminals. The OSA lists fraud as a priority offence and regulates AI-generated media in the same way as ‘real’ content, placing the same obligations on services to protect users.
The Online Safety Act (OSA) lists certain fraud offences as ‘priority offences’, meaning regulated services must prevent users encountering fraudulent content, swiftly remove it if it appears, and mitigate and manage the risk of their services facilitating fraud. This would include, where appropriate, the use of emerging technologies to stifle criminal abuse of networks. To support compliance, Ofcom issues Codes of Practice advising services on how to comply with their regulatory obligations. We expect these Codes to evolve over time to include new technologies.
The Department for Science, Innovation and Technology does not hold data relating to the number of fraudulent or scam adverts on social media or other regulated services.
There are mechanisms in the Online Safety Act that allow Ofcom to collect information from categorised services on the incidence and dissemination of illegal content, which would include fraudulent advertising content. Ofcom is required under the Act to publish annual transparency reports.
The Online Safety Act (OSA) lists certain fraud offences as ‘priority offences’, meaning regulated services must prevent users encountering fraudulent content, swiftly remove it if it appears, and mitigate and manage the risk of their services facilitating fraud. Ofcom, the independent regulator, has robust powers to act where services are failing in these responsibilities.
Measures under the OSA specifically to tackle fraudulent advertising are still being implemented. In the summer, Ofcom aims to publish a register of categorised services and to launch a consultation on additional duties for services designated as Category 1 or 2A to tackle paid-for fraudulent advertising.
The Government recognises that AI is transforming workplaces, demanding new skills and augmenting existing roles. We have launched the AI and the Future of Work Unit, a cross-government function dedicated to ensuring AI delivers positive outcomes for the economy, jobs and workers. We are preparing for a range of possible futures to ensure this transformation boosts productivity and opportunity, and the Government launched an assessment of AI impacts on the labour market in January 2026.
To build a digitally skilled workforce that supports long-term economic growth, drives innovation and expands individual opportunity, we are supporting AI Skills Boost to upskill 10 million workers in AI skills by 2030. More than 1 million AI training courses have already been delivered to workers across the UK.
The Government recognises the importance of memory chips to our economy and critical sectors. We regularly engage with industry to monitor supply chain vulnerabilities and understand potential risks across all chip types. Given the global nature of semiconductor supply chains, the UK is working closely with international partners, bilaterally and through multilateral fora such as the G7 and OECD, to strengthen collective resilience, improve information-sharing, and develop coordinated approaches to supply chain challenges.
The Government keeps the impacts of data protection legislation under review. As set out in the answer of 20 March 2026 to Question 120026, there is currently no definitive empirical study that isolates the specific, UK‑wide impact of the UK GDPR on productivity since its adoption.
The UK’s data protection framework has been updated through the Data (Use and Access) Act, which makes targeted changes to the UK GDPR and related legislation to make the regime clearer, more proportionate and better suited to supporting responsible data‑driven innovation, while maintaining high standards of protection for individuals. In this context, the Government’s focus is on evaluating the impacts of the UK’s data protection framework as it now operates, including the reforms introduced by the Data (Use and Access) Act.
We are committed to building the evidence base on how our data protection and wider data legislation affects businesses, consumers and the economy, including productivity, as part of our ongoing programme of monitoring and evaluation.
We are working with UKRI, universities, and other partners to ensure the safe and responsible adoption of AI tools while protecting research integrity.
Our AI for Science Strategy recognises that the integration of AI into research holds potential to be the single most impactful application of the technology, setting out 15 actions that will support UK researchers. These include the provision of compute through the AI Research Resource; the delivery of training and upskilling in AI methods; the creation, curation and scaling of AI-ready datasets; the development of access models for AI tools; the development of autonomous lab infrastructure; and support for research into the impacts of AI on the scientific process.
Additionally, the National Data Library will support the foundations for AI-enabled research by improving access to high-quality public sector data, alongside recently published guidance to help public bodies make datasets AI-ready.
The Government is committed to ensuring that any risks from the industry-led migration of the copper-based Public Switched Telephone Network (PSTN) to Voice over Internet Protocol (VoIP) are mitigated for everyone across the UK, including rural communities. In 2024/25, there were over 2,600 major incidents on the PSTN, each affecting 500 or more customers.
In November 2024, the Government secured additional safeguards from the telecoms industry. These include the provision of free battery back-ups for vulnerable and landline-dependent customers to ensure access to emergency services for at least one hour in a power outage. Many communication providers have gone further, providing battery back-ups of 4-7 hours.
In March 2026, the Government and industry agreed a new Fixed Telecoms Charter to extend these safeguards to all future fixed telecoms modernisation programmes.
The Department for Science, Innovation and Technology’s first set of accounts were for 2023/24 where the expenditure on special severance payments was £99,390. Expenditure in subsequent years can be found in the relevant annual report and accounts.
The Online Safety Act requires platforms to tackle illegal content and protect children from harmful content, including content that is hateful and abusive. For large user-to-user platforms, known as ‘Category 1’ services, the Act will also provide adult users with greater protection from hate speech by offering them more choice over the types of content they engage with, allowing them to filter content from non-verified accounts, and holding platforms to account for their terms of service. Ofcom has robust powers to enforce these duties.
No assessment has been made of the consistency between the number of beagles licensed for use in scientific experiments approved by the Home Office between January and December 2025 and the Government's Replacing Animals in Science strategy. The Labour Manifesto commits to partnering with scientists, industry and civil society as we work towards the phasing out of animal testing. It is not yet possible to replace all animal use, due to the complexity of biological systems and regulatory requirements for their use. Any work to phase out animal testing must be science-led and in lockstep with partners.
All organisations processing personal data in the UK must comply with the UK’s data protection framework.
The UK has strong safeguards to ensure that data is collected and handled responsibly and securely. Companies registered in the UK are subject to our legal framework and regulatory jurisdiction. Personal data transfers abroad are subject to a high level of legal protection. Failure to comply can result in enforcement action.
As threats to UK data evolve our response will be agile and proportionate. We actively monitor threats to UK data and will not hesitate to take further action if necessary to protect our national security.
The Online Safety Act lists fraud as a priority offence, meaning that in-scope services must now prevent and minimise user-generated fraud content from appearing on their platforms, and swiftly remove it if it does.
Services designated by Ofcom as Category 1 and 2A (large user-to-user and large search services respectively) will have additional duties to tackle paid-for fraudulent advertising. Ofcom aims to publish its categorisation register, and to consult on the additional duties for categorised services, including on fraudulent advertising, around July 2026.
The Department for Science, Innovation and Technology (DSIT) has committed a record £58.5 billion investment in R&D over the next 4 years. This includes £38.6 billion allocated to UKRI. The overall Government spend on R&D over the next 4 years is £86 billion.
The Science and Technology Facilities Council (STFC) within UKRI has a flat budget across this period and is currently working with the sector to model different spending scenarios for its overall portfolio including in particle physics, astronomy and nuclear physics (PPAN). The impacts of different modelled scenarios across the broad and diverse range of STFC-funded facilities and programmes will be considered alongside feedback from the sector when taking final decisions. The current level of post-doctoral researchers and flow of PhD students will be maintained across the SR period.
DSIT has asked UKRI to ensure that its specific investment decisions are informed by meaningful engagement with the scientific research community and a robust assessment of potential consequences for the UK’s scientific capability, research institutions and international standing.
The Information Commissioner’s Office has seen the average time to resolve or close an FOI complaint fall over the past five years, from 134 days in 2021/22 to 76 days in 2025/26, despite cases increasing from 5,932 to 8,337 over the same period. The ICO now publishes this information on a monthly basis on its website.
The Government recognises the importance of safeguarding the UK’s research and innovation ecosystem, including the university spinout sector, from risks associated with foreign ownership, influence or investment. The Government will not hesitate to use its powers to protect national security wherever concerns are identified, and has a range of effective measures in place to do so.
The Government is actively protecting the UK’s research and spinout ecosystem from national security risks. The National Protective Security Authority (NPSA), working with the National Cyber Security Centre (NCSC), supports universities and spinouts through the Secure Innovation programme, providing advice on due diligence, investment screening and managing security risks. Targeted Secure Innovation Security Reviews further help early‑stage firms identify and mitigate vulnerabilities linked to foreign engagement.
The Government has powers under the National Security and Investment (NSI) Act 2021 to review and, where required, intervene in investments that may pose a risk to national security. The Government also monitors the market at all times to identify acquisitions of potential national security interest.