Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the HM Treasury:
To ask His Majesty's Government what assessment they have made of the impact of workplace financial wellbeing companies on financial inclusion.
Answered by Lord Livermore - Financial Secretary (HM Treasury)
The Government recognises that employers can play an important role in supporting the financial wellbeing of their employees. The Financial Inclusion Strategy seeks to support employers who want to build the financial resilience of their workforce.
Payroll savings schemes are identified in the Strategy as a specific, impactful step employers can take to achieve this goal. The Strategy outlines the Government’s work with the Financial Conduct Authority to provide greater regulatory clarity to employers, so they can offer these schemes with confidence. The Money and Pensions Service is also working with Nest Insight and The Investing and Savings Alliance on the launch of a National Coalition of Employers to encourage uptake among firms.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what assessment they have made of the regulatory and consumer protection implications of the use of AI as financial guidance tools; and what safeguards they are putting in place to protect consumers.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
The Government aims for the UK to be a global leader in AI, using our strengths in financial services and AI to boost growth, productivity and consumer benefits. Safe adoption is central to this.
Organisations must handle personal data fairly, lawfully, transparently and securely, with individuals retaining rights such as access, correction and deletion.
The Financial Conduct Authority is also acting in this space, including publishing guidance for consumers on using AI tools for investment research and highlighting risks like inaccurate or outdated information.
The FCA’s Supercharged Sandbox and AI Live Testing service give firms access to computing, data and safe real‑world environments to support responsible AI use in UK financial markets.
More broadly, the Government recognises that people often lack the support they need when making financial decisions. To improve this, we are introducing a new targeted support regime enabling trusted firms to suggest suitable products or actions based on a customer’s circumstances. Targeted Support will launch in April 2026.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what assessment they have made of the impact of AI adoption in the UK on the labour market; and what plans they have to support workers affected by technological change.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
We recognise that AI is transforming workplaces, demanding new skills and augmenting existing roles, although the future scale of change remains uncertain. This Government is planning against a range of plausible outcomes to ensure workers continue to have access to good, meaningful employment.
To support this work, the Government has established a new Future of Work Unit in DSIT. The Unit will provide robust analysis and evidence on the impact of AI on the labour market and will coordinate action across government, ensuring our principles are delivered through practical help and support for workers and employers.
At the same time, AI presents significant opportunities for the labour market. Around 35% of UK jobs are predicted to benefit from productivity gains through AI adoption. Through the AI Opportunities Action Plan, we have committed to upskilling 10 million workers in essential AI skills by 2030. This will support workers to adapt and thrive in workplaces where AI tools are increasingly widespread.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department of Health and Social Care:
To ask His Majesty's Government what assessment they have made of the quality of surgical outcome data collected by NHS trusts; and what steps they are taking to support NHS trusts to use that data to improve patient safety.
Answered by Baroness Merron - Parliamentary Under-Secretary (Department of Health and Social Care)
The National Clinical Audit and Patient Outcomes Programme (NCAPOP) is commissioned, managed, and developed by the Healthcare Quality Improvement Partnership on behalf of NHS England, the Welsh Government, and other devolved administrations.
The programme currently consists of over 30 national clinical audits, registries, and databases as well as five clinical outcome review programmes.
The audit and registry topics include, for example, the national vascular registry, the national emergency laparotomy audit, and multiple cancer topics, all of which monitor a variety of clinical metrics including surgical outcomes.
The role of the NCAPOP is to detect unwarranted clinical variation and to feed this back to National Health Service trusts in an agile manner. Timely feedback to trusts enables them to make quick improvements to clinical practice. The NCAPOP work programme achieves this by making trust data available in near real-time dynamic dashboards. The NCAPOP audits also operate a statistically rigorous outlier process with the aim of detecting negative trust outcomes. Outlier information is provided to the trust concerned, NHS England, and the Care Quality Commission.
The dashboard and outlier data can be used by trusts to influence quality governance, improve patient safety and reduce patient harm, and enable tailored clinical quality improvement programmes.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what steps they are taking to ensure that partnerships with AI companies to develop pilot tools for Gov.uk services deliver benefits for users while protecting data privacy.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
His Majesty’s Government is implementing artificial intelligence partnerships through a phased, test-and-learn approach that embeds data protection from the outset while testing transformational capabilities before committing significant public funds.
Working with the Commercial Innovation Hub, the Government has developed procurement approaches tailored to AI’s unique characteristics, including the National AI Tender for GOV.UK and the Planning Transformation Accelerator for AI-assisted decision-making, ensuring procurement methods are appropriate for evaluating frontier technologies.
All procurement frameworks require GDPR compliance as a mandatory qualification criterion, with partnerships operating under the UK Data Protection Act 2018 and comprehensive privacy-by-design principles. Pilots are deployed in controlled environments with oversight from departmental information security teams and data protection officers, with government retaining intellectual property ownership to prevent vendor lock-in.
Decisions to scale are contingent on pilots demonstrating measurable user benefits and full compliance with data protection standards through defined evaluation criteria and contractual break clauses, ensuring AI capabilities can be advanced while maintaining robust privacy safeguards.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what steps they are taking to support UK companies to improve cybersecurity.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
Improving the cyber security of UK companies is critical to the resilience of our wider economy and is a priority for the government.
The Cyber Security and Resilience Bill will improve UK cyber defences and help protect our essential services. Our product security legislation and cyber security codes of practice are helping to ensure the technology people and businesses use is secure by design. We are also developing and growing the cyber security industrial base and skills pipeline to ensure companies have access to the services and capabilities they need. Together these system-wide measures aim to drive a step change in supporting companies across the economy to improve their cyber resilience.
In addition, the government wrote to the Chairs and CEOs of leading UK companies and asked them to better identify and protect themselves from cyber threats by making cyber a board-level priority, using the Cyber Governance Code of Practice, signing up to the National Cyber Security Centre (NCSC) Early Warning service, and requiring Cyber Essentials in their supply chains.
These actions are relevant to all businesses. To support them further, the government has developed a wide range of free resources, including the Cyber Action Toolkit offering tailored advice for small businesses, and NCSC-certified Cyber Advisors who provide advice and guidance on commercial terms, with SMEs eligible for a free 30-minute consultation.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what assessment they have made of the expansion of agentic AI tools in UK legal and professional services; and how this is informing policy on (1) innovation, (2) professional standards and (3) ethical AI use.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
We remain committed to ensuring the trusted and fair use of AI and to facilitating impactful AI adoption across the UK, so that British workers - including those in legal services - can seize the benefits this technology offers.
To support this, the AI Growth Lab will act as a cross‑economy AI sandbox, enabling responsible AI products and services to be deployed under close supervision in live markets. This will drive cross‑economy growth, build trust in new technologies, and create a mechanism for dynamic, evidence‑led regulatory reform.
Alongside this, the Roadmap to Trusted Third‑Party AI Assurance sets out the Government’s ambitions for the UK’s AI assurance market and the immediate actions we are taking to help the sector mature. This includes establishing the £11 million AI Assurance Innovation Fund and convening a national consortium of expert stakeholders to support the quality and growth of the assurance market.
In addition, the Government has established the cross‑government AI and Future of Work Unit to monitor how advanced AI tools are reshaping professional work, ensure innovation is supported responsibly, and coordinate policy so that workers and businesses can adopt these technologies safely and effectively.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what assessment they have made of reported growth in demand for ethical AI and technology skills in UK financial services; and how this is informing (1) workforce policy, and (2) regulatory policy.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
The Government is taking significant steps to expand skills and training in ethical and responsible AI. In January, further public and private sector partners joined the AI Skills Boost, increasing our ambition to upskill 10 million workers by 2030. More than 1 million AI upskilling courses have already been delivered since last summer, helping ensure UK workers - including those in financial services - have access to high‑quality training in the safe and ethical use of AI.
To complement this, the Government has established the cross‑government AI and Future of Work Unit to monitor how advanced AI tools are reshaping professional work, ensure innovation is supported responsibly, and coordinate policy so that workers and businesses can adopt these technologies safely.
We have also concluded a Call for Evidence on proposals for the AI Growth Lab, a cross‑economy AI sandbox that would allow responsible AI products and services to be tested under close supervision in live markets, building trust and supporting economic growth. Alongside this, the FCA’s Supercharged Sandbox and AI Live Testing service provide firms with enhanced access to computing, data and safe real‑world testing environments, enabling the responsible use of AI across UK financial markets.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what steps they are taking to make nationally owned public-sector data available for ethical and secure use in AI development to support innovation and public service delivery.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
As set out in the AI Opportunities Action Plan and the Modern Industrial Strategy, the Government is committed to treating public sector data as a strategic national asset and unlocking high-impact public datasets for AI use.
The Government recently published an update outlining the significant progress made on the Action Plan with 38 of its 50 commitments delivered against in 12 months.
This update demonstrated that six of the seven data recommendations have been delivered, including the publication of best practice guidance on how to make public sector datasets ready for AI (R09) and details on the delivery of ‘kickstarter’ projects making high-impact datasets available to AI researchers and innovators (R07).
DSIT has also launched an open call to understand the opportunities for public sector data among businesses, researchers and public bodies.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what assessment they have made of recent high-value acquisitions involving UK AI companies; and what policies they are pursuing to support domestic AI innovation, investment and the scaling of high-growth firms.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
The UK has a strong track record of high‑growth AI firms. This Government is taking a comprehensive approach to supporting our thriving AI ecosystem – ensuring that we back innovators with the data, compute, and talent they need to succeed.
Our AI Research Resource is providing free compute to British researchers and startups so that they can train new AI models and deliver scientific breakthroughs. We have established five new AI Growth Zones across the UK to deliver large, cutting-edge datacentre capacity. We are training the next generation of experts through Spärck AI Scholarships and the Global Talent taskforce, and we are upskilling 10 million workers in essential AI skills by 2030. Our Sovereign AI Unit, backed by £500 million, will support high-potential start-ups to start and scale, and our £100 million Advance Market Commitments will help UK AI hardware start‑ups compete globally.