Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what steps they are taking to support UK companies to improve cybersecurity.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
Improving the cyber security of UK companies is critical to the resilience of our wider economy and is a priority for the government.
The Cyber Security and Resilience Bill will improve UK cyber defences and help protect our essential services. Our product security legislation and cyber security codes of practice are helping to ensure the technology people and businesses use is secure by design. We are also developing and growing the cyber security industrial base and skills pipeline to ensure companies have access to the services and capabilities they need. Together these system-wide measures aim to drive a step change in supporting companies across the economy to improve their cyber resilience.
In addition, the government wrote to the Chairs and CEOs of leading UK companies and asked them to better identify and protect themselves from cyber threats by making cyber security a board-level priority using the Cyber Governance Code of Practice, signing up to the National Cyber Security Centre (NCSC) Early Warning service, and requiring Cyber Essentials in their supply chains.
These actions are relevant to all businesses. To support them further, the government has developed a wide range of free resources, including the Cyber Action Toolkit offering tailored advice for small businesses, and NCSC-certified Cyber Advisors who provide advice and guidance on commercial terms, with SMEs eligible for a free 30-minute consultation.
Asked by: Lord Nash (Conservative - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government how many investigations Ofcom has commenced under the Online Safety Act 2023 in relation to regulated user-to-user services as defined in that Act; how many penalties have so far been imposed as a result; and how many of the fines imposed have been paid.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
Ofcom, as the independent regulator for online safety, publishes information on its website about the enforcement action it takes, including details of the investigations it has opened into potential breaches of online safety duties. As a result of this work, Ofcom has exercised its powers to issue financial sanctions in several cases, with at least one regulated service having already paid its fine.
Asked by: Al Pinkerton (Liberal Democrat - Surrey Heath)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what assessment she has made of the potential impact of AI on trends in the level of employment in (a) Surrey and (b) Surrey Heath constituency.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
This Government recognises that AI is transforming workplaces, demanding new skills and augmenting existing roles; however, the future scale of change remains uncertain.
We are planning against a range of plausible outcomes to ensure workers continue to have access to good, meaningful employment.
To support this, we have established the AI and the Future of Work Unit, which will provide robust analysis and evidence on the impact of AI on the labour market. The Unit will coordinate action across government, ensuring our principles are delivered through practical help and support for workers and employers.
As is the case with all new technologies, AI also presents significant opportunities for the labour market. For example, our AI Growth Zones are creating over 15,000 jobs. We are also providing free AI training for all and will provide 10 million workers with essential AI skills by 2030, with the aim of making the UK the fastest-adopting AI country in the G7.
Through these initiatives and others, we will drive economic opportunity and support workers to adapt and thrive in workplaces across the UK, including Surrey and Surrey Heath.
Asked by: Al Pinkerton (Liberal Democrat - Surrey Heath)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what progress her Department has made on the rollout of gigabit broadband in Surrey Heath constituency.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
According to Ofcom’s Connected Nations 2025 report, more than 84% of premises in Surrey Heath constituency have access to a gigabit-capable connection, slightly below the national average of 86%.
As part of Project Gigabit, Openreach is delivering a contract across Surrey that will bring gigabit-capable broadband to premises not included in suppliers’ commercial rollout plans. Of the approximately 1,950 premises in Surrey Heath constituency included within this contract, approximately 610 now have access to a gigabit-capable connection.
Asked by: Charlotte Nichols (Labour - Warrington North)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what steps she is taking to ensure that job applicants with protected characteristics are not discriminated against when AI is used to assess applications.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Government is committed to removing barriers to AI adoption, unlocking new opportunities, and ensuring that equality is embedded at the heart of every mission. We want AI to work for everyone, and that means supporting innovation while ensuring technologies are fair, inclusive and accessible.
We have published Responsible AI in Recruitment guidance, which sets out good practice for procuring and deploying AI systems for HR and recruitment. This guidance highlights the mechanisms that can be used to ensure the safe and trustworthy use of AI in recruitment.
As highlighted in the AI Opportunities Action Plan: One Year On, we have taken steps to build the AI assurance ecosystem that underpins safe and responsible use of AI. This includes establishing a new Centre for AI Measurement at the National Physical Laboratory, designed to accelerate the development of secure, transparent and trustworthy AI.
Asked by: Julia Lopez (Conservative - Hornchurch and Upminster)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what discussions she has had with Cabinet colleagues on the impact of (a) VAT and (b) other taxation on the viability of the life sciences sector.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Secretary of State has regular engagement with relevant colleagues on the UK business environment for the life sciences sector, to drive the growth of the sector and support the delivery of the Life Sciences Sector Plan.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what assessment they have made of the expansion of agentic AI tools in UK legal and professional services; and how this is informing policy on (1) innovation, (2) professional standards and (3) ethical AI use.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
We remain committed to ensuring the trusted and fair use of AI and to facilitating impactful AI adoption across the UK, so that British workers - including those in legal services - can seize the benefits this technology offers.
To support this, the AI Growth Lab will act as a cross‑economy AI sandbox, enabling responsible AI products and services to be deployed under close supervision in live markets. This will drive cross‑economy growth, build trust in new technologies, and create a mechanism for dynamic, evidence‑led regulatory reform.
Alongside this, the Roadmap to Trusted Third‑Party AI Assurance sets out the Government’s ambitions for the UK’s AI assurance market and the immediate actions we are taking to help the sector mature. This includes establishing the £11 million AI Assurance Innovation Fund and convening a national consortium of expert stakeholders to support the quality and growth of the assurance market.
In addition, the Government has established the cross‑government AI and Future of Work Unit to monitor how advanced AI tools are reshaping professional work, ensure innovation is supported responsibly, and coordinate policy so that workers and businesses can adopt these technologies safely and effectively.
Asked by: Nick Timothy (Conservative - West Suffolk)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what steps Ofcom is taking to help tackle websites that provide instructions on committing suicide.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Online Safety Act requires in-scope services to prevent all users from encountering illegal suicide and self-harm content, and to prevent children from encountering legal content that encourages, promotes, or provides instructions for suicide or self-harm.
The independent regulator Ofcom enforces compliance with the Act. Ofcom’s first investigation under the Act targeted a pro-suicide forum. On 6 January, Ofcom confirmed it has informed the forum provider that it is working towards issuing a provisional notice of contravention in relation to breaches of the Act.
Ofcom has also established a dedicated ‘small but risky’ supervision taskforce, focusing on small services that pose the most severe risk of harm.
Asked by: Mike Reader (Labour - Northampton South)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what assessment her Department has made of the effectiveness of the Online Safety Act 2023 in protecting children from harm on online gaming platforms, including Roblox; and whether she plans to undertake a review of the Act’s application to such platforms.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
Gaming platforms that allow users to post or interact, such as Roblox, are in scope of the Online Safety Act. They are required to protect children from illegal and harmful content on their service, including using highly effective age assurance to prevent children encountering the most harmful types of content.
Ofcom is the independent regulator for the Act and has powers to take robust enforcement action, which it has already exercised against non-compliant services.
We will continue to monitor the effectiveness of the Act. On 20 January, the government announced a short, swift consultation on further measures to enhance children's wellbeing and ensure they have a healthy relationship with social media, accompanied by a national conversation.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what assessment they have made of reported growth in demand for ethical AI and technology skills in UK financial services; and how this is informing (1) workforce policy, and (2) regulatory policy.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
The Government is taking significant steps to expand skills and training in ethical and responsible AI. In January, further public and private sector partners joined the AI Skills Boost, increasing our ambition to upskill 10 million workers by 2030. More than 1 million AI upskilling courses have already been delivered since last summer, helping to ensure UK workers - including those in financial services - have access to high‑quality training in the safe and ethical use of AI.
To complement this, the Government has established the cross‑government AI and Future of Work Unit to monitor how advanced AI tools are reshaping professional work, ensure innovation is supported responsibly, and coordinate policy so that workers and businesses can adopt these technologies safely.
We have also concluded a Call for Evidence on proposals for the AI Growth Lab, a cross‑economy AI sandbox that would allow responsible AI products and services to be tested under close supervision in live markets, building trust and supporting economic growth. Alongside this, the FCA’s Supercharged Sandbox and AI Live Testing service provide firms with enhanced access to computing, data and safe real‑world testing environments, enabling the responsible use of AI across UK financial markets.