Driving innovation that will deliver improved public services, create new better-paid jobs and grow the economy.
Oral Answers to Questions is a regularly scheduled appearance where the Secretary of State and junior ministers answer questions from backbench MPs at the Dispatch Box.
Other Commons Chamber appearances include Westminster Hall debates, which are held in response to backbench MPs or e-petitions asking for a Minister to address a detailed issue.
Written Statements are made when a current event is not sufficiently significant to require an Oral Statement, but the House is required to be informed.
The Department for Science, Innovation & Technology does not currently have any Bills before Parliament.
A bill to make provision about access to customer data and business data; to make provision about services consisting of the use of information to ascertain and verify facts about individuals; to make provision about the recording and sharing, and keeping of registers, of information relating to apparatus in streets; to make provision about the keeping and maintenance of registers of births and deaths; to make provision for the regulation of the processing of information relating to identified or identifiable living individuals; to make provision about privacy and electronic communications; to establish the Information Commission; to make provision about information standards for health and social care; to make provision about the grant of smart meter communication licences; to make provision about the disclosure of information to improve public service delivery; to make provision about the retention of information by providers of internet services in connection with investigations into child deaths; to make provision about providing information for purposes related to the carrying out of independent research into online safety matters; to make provision about the retention of biometric data; to make provision about services for the provision of electronic signatures, electronic seals and other trust services; to make provision about the creation and solicitation of purported intimate images and for connected purposes.
This Bill received Royal Assent on 19th June 2025 and was enacted into law.
e-Petitions are administered by Parliament and allow members of the public to express support for a particular issue.
If an e-petition reaches 10,000 signatures the Government will issue a written response.
If an e-petition reaches 100,000 signatures the petition becomes eligible for a Parliamentary debate (usually Monday 4.30pm in Westminster Hall).
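As an illustration only, the short Python sketch below (the function name and structure are hypothetical, not part of any Parliament system) maps a signature count onto the two thresholds described above.

```python
def petition_outcomes(signatures: int) -> list[str]:
    """Illustrative mapping of an e-petition signature count to the
    thresholds described above: 10,000 signatures triggers a written
    Government response, and 100,000 makes the petition eligible for a
    Parliamentary debate."""
    outcomes = []
    if signatures >= 10_000:
        outcomes.append("written Government response")
    if signatures >= 100_000:
        outcomes.append("eligible for Parliamentary debate")
    return outcomes


if __name__ == "__main__":
    for count in (9_999, 10_000, 150_000):
        print(count, "->", petition_outcomes(count) or ["below both thresholds"])
```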
Introduce 16 as the minimum age for children to have social media
Gov Responded - 17 Dec 2024. Debated on - 24 Feb 2025. We believe social media companies should be banned from letting children under 16 create social media accounts.
The Government Digital Service does not recommend specific suppliers of BSL assurance, accessibility audits, or technology development generally.
Service Owners will follow their department's own supplier and commercial strategies. A number of Deaf-led agencies and language service providers are available to engage through Crown Commercial Service's digital purchasing frameworks.
Deaf-led BSL suppliers play a crucial role in ensuring that BSL is used correctly and effectively in various settings.
Multiple suppliers exist in the market providing services to the Deaf community, ensuring that BSL is used effectively in all aspects of communication. The Department for Science, Innovation and Technology does not prescribe a particular supplier; it is for service owners to select one based on their service users' needs.
The Government Digital Service (GDS) provides service teams across the public sector with guidance on accessible design, use of AI and requirements under the WCAG regulations.
GDS does not provide BSL expertise. Service Owners must conduct research with disabled people, including Deaf users and, where appropriate to the service provision, those who use sign language or a sign language interpreter to interact with the service. Services must seek expertise where appropriate from the BSL community and specialist Deaf-led agencies to test their products.
The Government Digital Service does not record enquiries at this level of granularity.
Depending on the service type, it is likely the service team will both consult the Service Manual and the Technology Code of Practice - which cover the standards services need to meet - and go through a service assessment in order to receive a GOV.UK web address.
The assessment will check compliance with the Service Standard, including evidence that the service complies with accessibility regulations and avoids excluding any groups within the audience it is intended to serve.
The government is clear that no one should have to go through the ordeal of these horrendous images online.
Ofcom has confirmed that they have opened an investigation into X and have our full backing to take necessary enforcement action.
The commencement order for the offence of the creation, or requested creation, of intimate images will be signed this week, meaning that individuals commit a criminal offence if they create – or seek to create – such abhorrent content. This will also be made a priority offence, meaning platforms must take proactive action.
This is not about restricting freedom of speech but upholding the law.
As the independent regulator for telecommunications, Ofcom is responsible for making regulatory decisions in the fixed telecoms sector and is currently finalising its Telecoms Access Review.
DSIT officials regularly engage with Ofcom on these issues. In July, we published our draft updated Statement of Strategic Priorities for Ofcom, which sets out the government’s view on the importance of competition in promoting investment in broadband deployment across the UK, including in rural areas.
In non-commercially viable, often rural, areas, more than £2.4 billion of Project Gigabit contracts have already been signed to connect over one million premises with gigabit-capable broadband.
Under the Online Safety Act, sharing, or threatening to share, a deepfake intimate image without consent is a criminal offence. The government will also urgently bring into force a new offence which criminalises the creation of intimate images without consent.
We will also legislate to criminalise nudification apps. This new criminal offence will make it illegal for companies to supply tools specifically designed to create non-consensual intimate images.
Ofcom is the enforcement regulator for the Online Safety Act and it has confirmed it is opening a formal investigation into X due to concerns over non-consensual intimate images.
The Government recognises the importance of tackling AI-generated CSAM. Creating, possessing, or distributing CSAM, including AI Generated CSAM, is illegal. The Online Safety Act requires services to proactively identify and remove this content. We are taking further action in the Crime and Policing Bill to criminalise CSAM image generators, and to ensure AI developers can directly test for and address vulnerabilities in their models which enable the production of CSAM.
The joint AISI/Thorn guidance (Recommended Practice for AI-G CSEA Prevention) sets out practical steps that AI developers, model hosting services and others in the AI ecosystem can take to reduce the risk that their systems are misused to generate CSAM. This guidance is informed by input from industry and child protection organisations, and many of the world’s leading AI developers (including OpenAI, Anthropic, Google and Meta) have signed up to the principles of earlier forms of this guidance.
The Government is clear: no option is off the table when it comes to protecting the online safety of users in the UK, and we will not hesitate to act where evidence suggests that further action is necessary.
This year, the government will be supporting a summit at Wilton Park on the impact of AI on childhood. This will bring together experts, technology companies, civil society and young people to explore how AI can benefit children without exposing them to harm.
To keep our world-leading universities globally competitive, the Post-16 Education and Skills White Paper sets out a joint DfE–DSIT vision for a financially sustainable higher education sector that delivers better value for students, supports local growth, and meets labour market needs. This includes record DSIT R&D investment of £58.5 billion between 2026/27 and 2029/30.
The UK’s immigration offer attracts research talent through visa routes such as the fast-track Global Talent visa, complemented by funding via UKRI and National Academies fellowships and professorships, our association to Horizon Europe, and the Global Talent Fund to retain world-class researchers.
UKRI allows visa costs, including the Immigration Health Surcharge, to be claimed on grants, and many other organisations also allow these costs on their grants. Visa costs are also allowable costs for researchers on Horizon Europe grants.
Ofcom’s effectiveness is kept under regular review and there is ongoing engagement with the regulator on key issues.
We monitor Ofcom’s effectiveness against its key performance indicators and objectives using reporting in Ofcom’s Annual Report and Accounts. These are laid before Parliament, to whom Ofcom is accountable. Ofcom’s leadership also appears regularly before Select Committees to give evidence and to be scrutinised on Ofcom’s work. In addition, DSIT ministers will meet with Ofcom to discuss overall performance twice a year as part of the Regulation Reform Programme.
The government directs Ofcom on parts of its remit through statements of strategic priorities, to which Ofcom must have regard when exercising relevant functions. The online safety statement was designated in July 2025, and the updated telecoms, spectrum and post statement will be designated in the coming months. Ofcom must publish a review every 12 months of what it has done in consequence of the statements.
The Intellectual Property Office (IPO) is an executive agency of the Department for Science, Innovation and Technology (DSIT), with delegated responsibility for operational matters including salaries. Salary costs have increased since 2017 for two main reasons. First, headcount has grown over this period, driven both by a sustained increase in demand for IP services and by investment in a transformation programme aimed at delivering better digital services to our customers and internal frontline staff. Second, annual pay awards have been applied. The IPO complies fully with the Cabinet Office annual pay remit guidance, and annual pay cases are approved by HM Treasury through a rigorous business case process.
Ofcom is responsible for reporting on mobile network coverage across the UK and its data provides Government with information on mobile coverage gaps.
Ofcom does not publish mobile coverage data on a regional basis, such as for East Sussex and West Sussex. However, in its Connected Nations Annual Report 2025, published on 19 November 2025, it is reported that 1% of the East Grinstead and Uckfield constituency has no 4G geographic coverage from any operator.
10% of premises in the constituency have no outdoor 5G coverage (combined standalone and non-standalone) from any mobile operator, and 68% do not have outdoor standalone 5G coverage from any mobile operator.
Our ambition is for all populated areas to have access to higher quality standalone 5G by 2030. This ambition includes villages and rural communities as well as towns and cities.
Standalone 5G is a more reliable and secure generation of technology which has the potential to deliver significant benefits to communities across the UK.
Government continues to work closely with the mobile network operators to ensure that continued investment into the expansion and improvement of mobile networks translates into benefits for communities right across the UK. We are also addressing barriers to deployment where they exist and recently launched a call for evidence to help determine where planning rules could be relaxed to support the deployment of digital infrastructure.
There are statutory obligations on communications providers to take appropriate and proportionate steps to ensure their networks and services remain available, and Ofcom has powers to investigate, rectify and penalise communications providers for any infringement of their duties. Ofcom is undertaking a review of the resilience of mobile services to power cuts and considering whether to update the expectations on mobile operators on the level of power back up required.
Ofcom is the independent regulator for online safety and is responsible for scrutinising platforms’ risk assessments, requiring safety mitigations, and enforcing safety duties. Ofcom has our full backing in using all available powers to protect users.
On 4 December 2025, Ofcom released a summary of the tech sector's response to the UK's new online safety rules. While there has been notable progress, further action is needed, including by major services.
Government meets Ofcom regularly to discuss online safety, including ensuring the swift implementation of the outstanding duties under the Act, and we continue to monitor outcomes through our joint evaluation programme.
The US-UK Technology Prosperity Deal was signed as a Memorandum of Understanding (MoU), which does not constitute or create any legally binding obligations. The MoU represents a political and policy-level understanding between the two governments.
Performing Right Society Limited (PRS) is a collective management organisation (CMO) and a private commercial entity and the Government does not regulate its commercial affairs. Consequently, the Department has not made an impact assessment in relation to PRS's commercial licensing fees.
Licence fees are usually the outcome of negotiation between a CMO and a trade body representing potential licensees in a sector. Prospective licensees have recourse to the Copyright Tribunal if dissatisfied with the terms of a licence, and the Tribunal’s decisions can be appealed in the High Court or the Court of Session in Scotland.
Due to commercial and security sensitivities, the Government does not give a running commentary on which models are being tested or which models we have been granted access to. Where possible, given these sensitivities, the AI Security Institute aims to publish results.
Government is working with the industry to deliver high quality digital connectivity right across the UK, whether fixed or mobile. Our ambition is for all populated areas to have access to higher-quality standalone 5G by 2030, and we expect this to be delivered through the mobile operators' commercial network rollout plans. Ofcom, as the telecommunications regulator, is responsible for reporting on coverage.
For both non-standalone and standalone 5G, high and very high confidence thresholds are used. These thresholds are explained in detail in the methodology annex published alongside the Connected Nations 2025 report. This states that high confidence is associated with a probability of at least 80% of coverage being present in the predicted location and a 95% probability for very high confidence.
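As a hedged sketch only (the function and band names are illustrative, not Ofcom's published methodology code), the thresholds described above amount to classifying a predicted coverage probability into confidence bands.

```python
def confidence_band(coverage_probability: float) -> str:
    """Classify a predicted coverage probability into the bands described
    above: at least 0.95 for very high confidence, at least 0.80 for high
    confidence. Illustrative only."""
    if not 0.0 <= coverage_probability <= 1.0:
        raise ValueError("probability must be between 0 and 1")
    if coverage_probability >= 0.95:
        return "very high confidence"
    if coverage_probability >= 0.80:
        return "high confidence"
    return "below the high-confidence threshold"


if __name__ == "__main__":
    for p in (0.75, 0.85, 0.97):
        print(p, "->", confidence_band(p))
```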
The Department for Science, Innovation and Technology (DSIT) has policy responsibility for promoting responsible AI innovation and uptake. Risks related to chemical, biological, radiological, and nuclear weapons (and other dangerous weapons), including defining thresholds for harm in these domains, are managed by a combination of the Home Office, Foreign, Commonwealth and Development Office, Cabinet Office, and the Ministry of Defence. DSIT does not set thresholds for dangerous capabilities in risk domains owned by other departments.
The AI Security Institute (AISI), as part of DSIT, focuses on researching emerging AI risks with serious security implications, such as the potential for AI to help users develop chemical and biological weapons. AISI works with a broad range of experts and leading AI companies to understand the capabilities of advanced AI and advise on technical mitigations. AISI’s research supports other government departments in taking evidence-based action to mitigate risks whilst ensuring AI delivers on its potential for growth. AISI’s Frontier AI Trends Report, published in December 2025, outlines how frontier AI risks are expected to develop in the future.
Protecting children from harm online is a top priority for this government.
This year, the government will be supporting an NSPCC summit at Wilton Park on the impact of AI on childhood. This will bring together experts, technology companies, civil society and young people to explore how AI can benefit children without exposing them to harm.
Media literacy is also a key part of our approach, helping children and adults develop critical thinking skills to navigate the growing presence of AI-generated content. DSIT is working with the Department for Education to develop an online ‘parent hub’ providing guidance on media literacy and online safety.
A range of existing rules already apply to AI systems, such as data protection, competition, equality legislation, and online safety. In response to the AI Action Plan, the government committed to work with regulators to boost their capabilities.
This is complemented by the work of the AI Security Institute, which has deepened our understanding of the critical security risks posed by frontier AI. The government remains committed to ensuring our rule book is up to date and future-proofed so the UK is prepared for the changes AI will bring.
The government is also supporting the UK AI assurance market, which will provide ways to measure, evaluate and communicate the trustworthiness and safety of AI systems.
I refer the hon. Member to the answer given on 1 December 2025 to Question UIN 94115.
Ministers in DSIT and DfE are working closely together to ensure adult education keeps pace with the rapid take-up of AI.
As AI is increasingly adopted across the workplace, it will create high demand for workers with the skills to deploy AI. This will require adult education and upskilling to evolve for the AI age, which is why we’re jointly reviewing AI skills needs, expanding lifelong learning, and rolling out new scholarships and traineeships so adults can upskill and reskill for the jobs AI is creating.
DSIT has also formed a partnership with 11 major technology companies and leading UK businesses to upskill 7.5 million workers in AI by 2030. This will ensure that UK workers benefit from the transformational impact AI will have in the workplace, including those working for small businesses and in all parts of the country.
We are optimistic about how AI will transform the lives of British people for the better, but advanced AI could lead to serious security risks. The capabilities of AI models continue to increase; this may exacerbate existing risks and present new risks for which the UK needs to be prepared.
The role of the AI Security Institute (AISI) is to build an evidence base on these risks, so the government is equipped to understand their security implications. It recently published on GOV.UK an evidence-based assessment of how the world’s most advanced AI systems are evolving, bringing together results from two years of AISI's frontier model testing.
AISI works with a broad range of experts and companies to assess the potential risks these could pose as the technology continues to develop.
We are optimistic about how AI will transform the lives of British people for the better, but advanced AI could also lead to serious security risks.
The Government believes that AI should be regulated at the point of use, and takes a context-based approach. Sectoral laws give powers to take steps where there are serious risks - for example, the Procurement Act 2023 can prevent risky suppliers (including AI suppliers) from being used in public sector contexts, whilst a range of legislation offers protections against high-risk chemical and biological incidents.
This approach is complemented by the work of the AI Security Institute, which works in partnership with AI labs to understand the capabilities and impacts of advanced AI, and develop and test risk mitigations.
Everyone should be able to benefit from the digital world — from saving on everyday bills to finding better jobs and accessing vital services like the NHS.
Last February we published the Digital Inclusion Action Plan, setting out the government's first steps to ensure everyone in the UK – no matter their background – can fully participate in our digital society.
Since then, we have launched the £11.9 million Digital Inclusion Innovation Fund, with £764,020 in Scotland, £440,368 in Wales and £267,249 in Northern Ireland, helping more people across the UK get the access, skills and confidence to get online.
The fund is currently supporting 85 community-led projects across England, designed to support locally delivered, highly tailored and targeted interventions that meet the needs of digitally excluded people, including older and disabled users.
We remain committed to ensuring all public services are accessible and inclusive – with published best practice to make sure websites and apps work for everyone, alongside providing alternative routes – like in-person and telephone support – for those that need them.
The Department for Science, Innovation and Technology (DSIT) is taking significant steps to expand skills and ethical training in AI.
Last year, we formed a partnership with 11 major technology companies and leading UK businesses to upskill 7.5 million workers in AI by 2030. This will ensure that UK workers right across the country benefit from the transformational impact AI will have in the workplace, by giving them access to high quality skills provision, free at the point of use. This will cover a range of skills, including responsible and ethical use of AI.
Last year we also launched the AI Skills Hub – an online learning platform where learners can access training courses on topics ranging from foundational AI literacy, to ethics and responsible AI use, to more advanced skills to develop and deploy AI.
The updated Statement of Strategic Priorities for telecoms, the management of radio spectrum and postal services will be laid before Parliament in the coming weeks and will be designated 40 days later.
Everyone should be able to benefit from the digital world — from saving on everyday bills to finding better jobs and accessing vital services like the NHS.
That is why – as part of the First Steps confirmed in the Digital Inclusion Action Plan – we launched the £11.9 million Digital Inclusion Innovation Fund to help more people across the UK get the access, skills and confidence to get online. This included 85 community-led projects in England.
Numerous projects are supporting people to build AI skills, such as the Age UK Westminster project improving AI literacy for older people, and Aston University and FutureDotNow delivering projects that support youth employability through digital inclusion.
This Fund will conclude by 31 March 2026.
More broadly, reducing the AI skills gap is critical for increasing the UK’s productivity and delivering long-term growth. That is why we are working with DfE and Skills England to assess the AI skills gap and map pathways to fill it, and last year announced a joint commitment with industry to upskill 7.5 million workers with vital AI skills.
Alongside this, DSIT is delivering the £187 million TechFirst programme that will support over 4,000 domestic graduates, researchers and innovators and engage 1 million students in digital skills and AI learning.
Lastly, following the independent Curriculum and Assessment Review’s final report last year, the national curriculum will be updated to prepare young people for life and work in a changing world. The Government will embed digital, media and AI literacy across the curriculum, introduce a refreshed, broader computing GCSE, and integrate digital content into other subjects.
Our ambition is for all populated areas to have higher quality standalone 5G by 2030 and we have a target to deliver nationwide (99%) gigabit broadband coverage by 2032.
The Government continues to work closely with the mobile network operators to ensure that their continued investment in the expansion and improvement of mobile networks translates into benefits for communities right across the UK.
To improve broadband coverage in the area, CityFibre is delivering a Project Gigabit contract across East and West Sussex, which includes premises in the East Grinstead and Uckfield constituency.
We are also working to identify and address barriers to deployment of both mobile and broadband infrastructure. This includes recently launching a call for evidence to help determine where planning rules could be relaxed to support the deployment of digital infrastructure.
Government’s ambition is for all populated areas to have access to higher quality standalone 5G by 2030. This is a UK-wide ambition; coverage improvements in the East Grinstead and Uckfield constituency will therefore contribute to its achievement.
Government wants to see high quality digital infrastructure available right across the UK, whether this is fixed or mobile, allowing people to participate in the modern digital economy.
Everyone should be able to benefit from the digital world — from saving on everyday bills to finding better jobs and accessing vital services like the NHS.
AI has the potential to transform the economy, how public services are delivered and people's lives, and the government is committed to ensuring people understand and benefit from this.
Last February we published the Digital Inclusion Action Plan, setting out the government's first steps to ensure everyone in the UK – no matter their background – can fully participate in our digital society.
Since then, we have launched the £11.7 million Digital Inclusion Innovation Fund, helping more people across the UK get the access, skills and confidence to get online. This supported a number of projects specifically focused on older and disabled people, and AI training and awareness.
We recognise that some people, including older or disabled people, may face barriers to building AI skills. DSIT is working with DfE and Skills England to assess the AI skills gap and map pathways to fill it. Last year we announced a joint commitment with industry to upskill 7.5 million workers with vital AI skills. We also announced the TechFirst programme, a £187m initiative to bring digital skills and AI learning into classrooms and communities to train people of all ages and backgrounds for future tech careers.
My Department recognises the importance of the .io country code top level domain (ccTLD) and the need for its continuity and stability. We are engaging closely with the Foreign, Commonwealth and Development Office on the potential impact of a change in sovereignty over the British Indian Ocean Territory on the status of the .io ccTLD.
AI is transforming the world of work. The UK must act now to ensure this transformation boosts growth, productivity and opportunity—rather than deepening inequality or eroding job quality. The government is not standing still: we’re investing in skills, monitoring impacts, and working with employers and experts to make sure AI benefits everyone—not just a few.
AI can help give local businesses better insights and improve business efficiency. This government has introduced the Small Business Plan to help all businesses, regardless of their AI capability, with new tools to unlock access to finance, action to address late payments and regulatory costs, improved digital adoption and easier pathways to business support through the Business Growth Service.
The UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018 (DPA) provide individuals with the right to access their personal data, subject to relevant exemptions. A subject access request must be responded to within one month of receiving the request. The response time may be extended by a further two months if the request is complex, or if the individual has submitted a number of requests, provided the organisation informs the requestor within the one-month period and provides reasons for the delay.
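As a minimal sketch of the timescales described above (illustrative only; the helper names are hypothetical and this is not legal advice or ICO guidance), the standard and extended response deadlines can be computed from the date a request is received.

```python
from datetime import date
import calendar


def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping to the last day of the target month."""
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)


def sar_deadlines(received: date) -> dict[str, date]:
    """Illustrative deadlines for a subject access request: one month as
    standard, extendable by a further two months for complex or numerous
    requests, with the requester informed within the first month."""
    return {
        "standard_deadline": add_months(received, 1),
        "must_notify_extension_by": add_months(received, 1),
        "extended_deadline": add_months(received, 3),  # one month plus up to two more
    }


if __name__ == "__main__":
    print(sar_deadlines(date(2025, 1, 31)))
```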
The Information Commissioner’s Office (ICO) is responsible for monitoring and enforcing the data protection legislation independently of government, and is accountable to Parliament.