Cyber Security and Resilience (Network and Information Systems) Bill

Victoria Collins (Harpenden and Berkhamsted) (LD)

I wish you and everyone else in the Chamber a happy new year, Madam Deputy Speaker.

It is a pleasure finally to address the long-awaited Cyber Security and Resilience (Network and Information Systems) Bill. As has been pointed out today, it is significant. The National Cyber Security Centre reported that nationally significant cyber-incidents had more than doubled since the previous year. The past year’s surge in cyber-attacks on targets ranging from supply chains to hospitals to critical infrastructure has made one fact clear: there is no economic or societal security without cyber-security. Cyber-attacks cost the UK economy £14.7 billion annually. There have been attacks on companies such as Jaguar Land Rover and Marks & Spencer. More important, however, is the impact on the real economy. Thousands of jobs and businesses are hanging in the balance, and our public services and our private data are also at risk. As the Minister mentioned this morning, the NHS Synnovis ransomware attack resulted in more than 11,000 postponed appointments and procedures. It has even been linked to one patient’s death, attributed to the delay that the attack caused. This matters. We must do all that we can to upgrade our protection and our security, because jobs, the economy and lives depend on it.

Our economy—imagine it, if you will, as a house—is under attack. The Liberal Democrats welcome the Bill’s intent to upgrade our home security; the addition of data centres, managed service providers and large load controllers means that we are building stronger fences, and that companies with a master key to all our doors have stronger security. The wiring has also been upgraded, and the alarm system has been strengthened through increased incident reporting. However, the Bill leaves the back door wide open by leaving out key sectors. Our alarm system is not sure when it is supposed to ring, and the companies that have the keys to our doors, and are using our house, are asking for simplicity, clarity and support, so that they can do their job properly. While no single piece of cyber-security legislation can act as a silver bullet, those are gaps that we must address.

We are failing to take the whole-economy approach mentioned by the hon. Member for Warwick and Leamington (Matt Western). We are leaving out the public sector and economically significant sectors, such as retail and manufacturing. The Bill’s stated aim is to protect organisations

“that are so essential that their disruption would affect our daily lives.”—[Official Report, 12 November 2025; Vol. 775, c. 26WS.]

However, the Government apparently do not consider their own public services, provided by local authorities, to be essential enough for protection. The £10 million Redcar council incident proves that voluntary schemes are failing local authorities, but after the Bill is passed, Government institutions and councils will still lack statutory protections and ringfenced funding—and all the while, council budgets are getting tighter. I have no doubt that members of the public whose data, be it from the electoral roll or from social care records, sits in these systems would object to the public sector’s exclusion from the Bill.

As has been mentioned, we are also talking about a potential mandatory digital ID system for the whole country. The Government have already said that it would be built with home-made technology. Where will the cyber-protection be in that? What is more, leaving out sectors such as retail and manufacturing would mean that the JLR and M&S cyber-attacks remained out of scope. These are significant sectors. They involve major employers and major parts of our supply chains, and they handle significant amounts of personal data.

The Bill marks a failure of ambition. The Government claimed in response to a letter that we sent on this topic that they

“do not need to wait for or rely on legislation”

to implement cyber-security requirements in the public sector, and will instead use the Government action plan to ensure that the very same requirements in the Bill will be applied to the public sector. Why must we have this two-tier approach? Why leave out economically and socially significant sectors, such as the public sector? Does the Minister agree that we need mandatory cyber-security standards for those absent sectors of our society, governance and economy? If we are serious about national resilience, about protecting citizens’ data and about aligning with our European partners, let us vote on the issue in primary legislation in this Chamber, so that the issue has the full transparency and accountability that it demands.

A further critical gap in the Bill is the failure to embed security by design, and a lack of clear accountability. This should be board-led, to ensure that each lock, door and window of our house is built securely. In 2019, the NCSC published design principles, and last October the Government launched a secure-by-design framework, which was seen as core to their cyber-security standard. However, the Bill not only excludes Government from critical national infrastructure but abandons that key principle and fails to include the words “by design”. That matters, particularly as ISC2 research suggests that skills shortages are the No. 1 challenge for compliance with cyber regulation in the UK, with 88% of respondents experiencing at least one cyber-security breach as a result of skills shortages. This is also a missed opportunity for our economy and our cyber-security sector. Prioritising security by design would provide the baseline protection that our critical infrastructure so desperately needs. What consideration have the Government given to ensuring security by design?

Effective regulation does not just mean future-proofing; it must be workable. While we welcome expanded incident reporting, the current definitions risk creating a significant regulatory burden. Over-reporting will overwhelm, rather than strengthen, our cyber-security systems. Those who are coming to upgrade our security systems are not being given clear directions. The definition of a “reportable incident” is so broad that it could extend to every phishing email. How will the NCSC feasibly manage the administrative burden when the alarm may be ringing non-stop? Other critical terms lack clarity for industry, including “managed service provider” and the criteria for “digital critical suppliers”, as has been highlighted by techUK and others. These are not just technical details to be ironed out later; they are the difference between a Bill that works and one that does not, and industry needs clarity on how to comply. Will the Minister work with us and with industry to tighten those definitions, so that the Bill is workable, and will he consider the best way to ensure simplicity and effectiveness in incident reporting?

What is being done to support home-grown cyber-security in the UK? What is being done to defend us from hostile foreign interference? With one of the latest defence contracts going to Palantir, what is being done to support UK tech? Would the Government support a digital sovereignty strategy, as suggested by Open Rights Group? The Bill is yet another missed opportunity to support our domestic tech sector, at a time when we should be building UK cyber-security capabilities and creating highly skilled jobs here at home. How can we claim to be serious about national resilience when the very infrastructure protecting our critical systems could be entirely outsourced abroad?

Supporting UK tech and businesses is not just about the providers in the Bill; it is about the thousands of small and medium-sized enterprises that form the backbone of our economy. For the few SMEs and start-ups that are directly affected by it, the Bill creates a regulatory thicket of overlapping rules, different timelines and multiple bodies. Cyber-security is complicated, and for this legislation to work, it must be simple and easily implementable for UK SMEs. What support will there be for those SMEs and start-ups?

It would be remiss of me not to mention the wider cyber-crime landscape. SMEs make up 99.8% of UK businesses, and are often the most vulnerable link in cyber supply chains. The NCC Group confirms that manufacturing, retail and leisure, dominated by SMEs, were the sectors most targeted for ransomware in 2024. That is why the Liberal Democrats are calling on the Government to establish a digital safety net for SMEs—a nationwide first responder service that would provide free-at-the-point-of-use support for small businesses that have been victims of a cyber-attack. Australia is already doing that, providing person-to-person support during and after attacks. If Australia can do it, why can’t we?

On top of all that, the biggest threat is actually fraud, which costs the economy hundreds of billions a year. Two thirds of all fraud begins online, much of it through social media companies with no liability. That is why the Liberal Democrats are calling for social media platforms to be made financially liable for fraud on their sites, which would create a clear line of accountability for criminal activity. Moreover, fraud is a cyber-security issue; it exploits weak systems and inadequate protections. Families lose life savings, elderly people fall victim to sophisticated phishing, and small businesses shut down. The Bill protects infrastructure, but by leaving the back door open, it ignores the billions of pounds of savings lost and the livelihoods upended through online fraud. The Government must address that in their long-awaited fraud strategy. We cannot protect systems but abandon our businesses and our people.

The Bill is progress, but it is not the finish line. The cyber-threat is real, evolving and urgent. The Liberal Democrats will work constructively to strengthen the Bill through amendments, but we must ensure that we do not leave the back door open, and that we future-proof our security. We owe it to our businesses, our families and our national security to get this right.

Online Safety Act 2023: Repeal

Monday 15th December 2025

Westminster Hall

Victoria Collins (Harpenden and Berkhamsted) (LD)

It is a pleasure to serve under your chairmanship, Sir John. I congratulate the hon. Member for Sunderland Central (Lewis Atkinson), who made a very eloquent opening speech, and Members from across the Chamber, who have touched on really important matters.

As the hon. Member mentioned, the online space gives us great opportunities for connection and knowledge gathering, but also opportunities for greater harms. What has come across today is that we have addictive algorithms that are pushed in furtherance of commercial and malevolent interests—security interests, for example, although not the security of Great Britain—with no regard for the harm or impact they have on individuals or society.

When it comes to the Online Safety Act, we must get the balance right. Its protections for children and the vulnerable are vital. Of course, it is important to maintain freedom of speech and access to information. The Act is a step in the right direction in protecting children from extreme content, and we have seen changes in pornographic content. However, there are areas where it has not gone far enough, and it is not ready for the changes that are coming at a fast pace. There are websites that serve a public good that are age-gated, and forums for hobbies and communities that are being blocked. As the Liberal Democrats have said, we have to get the balance right. We also have to look at introducing something like a digital Bill of Rights with agile standards in the face of fast-paced changes, to embed safety by design at the base.

The harms that we need to protect children and vulnerable people from online are real. The contributions to this debate from hon. Members from across the House have been, as always, eye-opening and a reminder of how important this issue is. On pornographic content, we heard from the hon. Members for Morecambe and Lunesdale (Lizzi Collinge) and for Milton Keynes Central (Emily Darlington) sickening reminders of the horrific content online that young people see—and not by choice. We must never forget that, as has also been said, people are often not seeking this content, but it comes through, whether on X, which was Twitter, or other platforms. The Molly Rose Foundation highlighted that

“children using TikTok and X were more than twice as likely to have encountered…high risk content compared to users of other platforms.”

The online world coming to life has been mentioned in this debate. One of my constituents in Harpenden wrote to me, horrified that her daughter had been strangled on a dancefloor; it showed how violent, graphic content is becoming normalised. That struck me to my core. Other content has also been mentioned: suicidal content, violent content and eating disorder misinformation, which the hon. Member for Worcester (Tom Collins) talked about so eloquently. The Molly Rose Foundation also highlighted that one in 10 harmful videos on TikTok has been viewed more than 1 million times, so we have young people seeing that extreme content.

Even beyond extreme content, we are starting to see the addictive nature of social media, and the insidious way that this short-form content is becoming such a normalised part of many of our lives. Recent polling by the Liberal Democrats revealed that 80% of parents reported negative behaviours in their child due to excess phone usage, including skipping meals, having difficulty sleeping, or suffering physical discomfort such as eye strain or headaches. Parents and teachers know the real harms that are coming through, but so do young people themselves. I carried out a safer screens tour in my constituency in which I spoke to young people. Many of them said that they are seeing extreme content that they do not want to see, and that, although they have blocked the content, it comes back. The Online Safety Act is helping to change that, but it has not gone far enough. The addictive element of social media is important. In our surveys, two quotes from young people stood out. One sixth-former said that social media is

“as addictive as a drug”,

and that they felt its negative effects every day. Another young person simply wrote, “Help, I can’t stop.” Young people are asking for help and protection; we need to hold social media giants and online spaces to account.

It is welcome that some of those harms have been tackled by the Online Safety Act. On pornography, Pornhub has seen a 77% reduction in visitors to its website; Ofcom has launched 76 investigations into pornography providers and issued one fine of £50,000 for failing to introduce age checks, but we need to ask whether that goes far enough. It has come across loud and clear in this debate that the Online Safety Act has not gone far enough. Analysis has shown that Instagram and TikTok have started to introduce new design features that comply with the Online Safety Act, but game the system to still put forward content that is in those companies’ commercial interests, and not in the interests of young people.

Other extremely important harms include the new harms from AI. Many more people are turning to AI for mental health support. Generative AI is creating graphic content, and the Internet Watch Foundation found that

“reports of AI-generated child sexual abuse material have more than doubled in the past year”

and the IWF says it has reached the point where it can no longer tell the difference between AI-generated and real material. It is horrific.

Jim McMahon

The hon. Lady is making a very important point. It really concerns me to see just how desensitised young people or adults can become when they see that type of content, and that inhumane content is directly linked to misogyny and racism. While I know no Member of this House would say such a thing, outside this place I could imagine an argument being made that harm depicted in AI-generated content is not real harm, because the content in itself is not real and no real abuse has been carried out. However, does the hon. Lady share my concern that such content is incredibly harmful, and that there is a real danger that it entraps even more people down the very dark route to what is essentially child abuse and to further types of harm, which will then present in the real world in a way that I do not think even Parliament has yet registered? In a sense, this problem is becoming more and more of a public health crisis.

Victoria Collins

Absolutely. The insidious part of this issue is the normalisation of such harmful content. In a debate on Lords amendments to the then Data (Use and Access) Bill, on creatives and AI, I mentioned the fact that, in the two weeks since the previous vote, we had seen the release of Google Veo 3—the all-singing, all-dancing video creation software. We are moving so quickly that we do not see how good AI-generated content is becoming. Some content that we see online is probably AI-generated, but we do not realise it. On top of that, as the hon. Gentleman said, AI normalises extreme content and produces content that people think is real, but is not. That is very dangerous for society.

My next point concerns deepfakes, which are undermining trust. Some deepfakes are obvious; some Members of Parliament and news presenters have been targeted through deepfakes. Just as important, however, is the fact that much deepfake content seems normal, but is undermining trust in what we see—we do not know what is real and what is not any more. That is going to be very dangerous not only in terms of extreme content, but for our democracy, and that argument has been made by other Members in this debate.

It is also worrying that social media platforms do not seem to see that problem. To produce its risk assessment report, Ofcom analysed 104 platforms and asked them to put in submissions: not a single social media platform classified itself as high risk for suicide, eating disorders or depression, yet much of what we have heard during this debate, including statistics and anecdotal stories, shows that that is just not true.

On the other hand, while there are areas where the Online Safety Act has not gone far enough, in other areas it has overstepped the mark. When the children’s code came into force, Lord Clement-Jones and I wrote to the Secretary of State to outline some of our concerns, including political content being age-gated, educational sites such as Wikipedia being designated as category 1, and important forums about LGBTQ+ rights, sexual health or potentially sensitive topics being age-gated, despite being important for many who are learning about the world.

Jamie from Harpenden, a young person who relies heavily on the internet for education, found when he was looking for resources that many of them were flagged as threatening to children and blocked, and he felt that that hindered his education. Age assurance systems also pose a problem for data protection and privacy. The intention behind this legislation was never to limit access to political or educational content, and it is important that we support access to the content that many rely on, but we must protect our children and vulnerable people online, and we must get that balance right.

I have a few questions for the Minister. Does he agree with the Liberal Democrats that we should have a cross-party Committee of both Houses of Parliament to review the Online Safety Act? Will he confirm what resources Ofcom has been given? Has analysis been conducted to ensure that Ofcom has enough resources to tackle these issues? What are the Government doing about AI labelling and watermarking? What are they doing to tackle deepfakes? Does the Minister agree that it is time to support the wellbeing of our children, rather than the pockets of big tech? Will the Minister support Liberal Democrat calls to increase the age of data consent and ban social media giants from collecting children’s data to power the addictive algorithms against them? We are calling for public health warnings on addictive social media for under-18s and for a doomscroll cap. Most important of all is a digital bill of rights, with standards that, in the light of the fast pace of change, need to be agile.

Our young people deserve better. We need to put children, young people and vulnerable people before the profits of big tech. We will not stop fighting until that change is made.

--- Later in debate ---
Ian Murray

My hon. Friend makes a good point; let me come back to him in detail on the VPN issue, as his question relates to what we are planning to do in our review of the Online Safety Act, including both what was written into the legislation and what was not.

My hon. Friend the Member for Darlington (Lola McEvoy), who is no longer in her place, highlighted the really important issue of chatbots, which has also been mentioned by a number of other Members. Generative AI services, including chatbots that allow users to share content with one another or that search live websites to provide search results, are already regulated under the Online Safety Act. Those services must protect users from illegal content and children from harmful and age-inappropriate content.

Victoria Collins

Ofcom has said, and my understanding is, that in certain circumstances AI chatbots are covered, but certain new harms—such as emotional dependence—are not. That is an area where the House and many people are asking for clarity.

Ian Murray

I do not disagree with the hon. Lady. There are a whole host of issues around porn bots and AI-generated bots that have now also sprung up. We are committed to the Online Safety Act and to reviewing it as it is implemented. As technology moves on quickly, we have to keep pace with what the harms are and how we are able to deal with them. I thank the hon. Lady for raising those particular issues.

We will act on the evidence that comes forward. It is clear that if the evidence shows us that we have to act in various areas, including chatbots, we will do so. The Secretary of State announced plans to support a child safety summit in 2026, which will bring together tech companies, civil society and young people to shape how AI can benefit children, and to look at online harms and how they are developing.

--- Later in debate ---
Ian Murray

Sir John, you are indeed very kind. My hon. Friend gave two examples during his speech. First, he mentioned brakes that were available only for high-end and expensive cars, and are now on all cars. Secondly, he mentioned building regulations, and how we would not build a balcony without a barrier. Those examples seem fairly obvious and almost flippant, but it seems strange that we would regulate heavily to make sure that people are safe physically—nobody would ever argue that it would be a complete disregard of people’s freedom to have a barrier on an 18th-floor balcony—but not online. We do that to keep people safe, and particularly to keep children safe. As my hon. Friend said, if we are keeping adults safe, we are ultimately keeping children safe too.

We have to continue to monitor and evaluate. I was just about to come on to the post-implementation review of the Act, which I am sure my hon. Friend will be very keen to have an input into. The Secretary of State must complete a review of the online safety regime two to five years after part 3 of the Act, which is about duties of care, fully comes into force. The review will therefore be completed no sooner than 2029. These are long timescales, of course, and technology is moving, so I understand the point that he is making. I recall that in the Parliament from 2010 to 2015, we regulated for the telephone, so we move slowly, although we understand that we also have to be nimble in how we legislate.

The Lib Dem spokesperson, the hon. Member for Harpenden and Berkhamsted, asked whether the Act has gone far enough. Ofcom, the regulator, is taking an iterative approach and will strengthen codes of practice as online harms, technology and the evidence evolve. We are already making improvements, for example strengthening the law to tackle self-harm, cyber-flashing and strangulation. The hon. Lady also asked whether Ofcom has received an increase in resources. It has—Ofcom spending has increased by nearly 30% in the past year, in recognition of its increased responsibilities. She also asked about a digital age of consent. As I mentioned, we have signed a memorandum of understanding with Australia and will engage with Australia to understand its approach. Any action will be based, of course, on robust evidence.

Victoria Collins

I would just like to clarify that I made a call for an age of data consent. We put that forward earlier this year as an amendment to the Act. The very first step is to stop social media companies harvesting data and using it to power these addictive algorithms against young people. It is about raising the age of data consent to 16. Then, of course, there is the wider discussion about what is happening with social media in general, but that age of data consent is our first call to action.

Mandatory Digital ID

Tuesday 21st October 2025

Westminster Hall

Victoria Collins (Harpenden and Berkhamsted) (LD)

It is a pleasure to serve under your chairmanship, Mr Turner. I congratulate the hon. Member for Perth and Kinross-shire (Pete Wishart) on securing this lively debate and eloquently sharing his views and concerns.

As Liberal Democrats, we hold a fundamental principle: freedoms belong to citizens by right. The Secretary of State spoke repeatedly about

“giving people power and control”.—[Official Report, 13 October 2025; Vol. 773, c. 87.]

But I ask the Minister—control over what and whom? This essentially mandatory digital ID for every person with the right to work in this country does not leave much choice or control.

My constituent Julie, from Harpenden, does not have a phone; she does not want one. That is her choice, but she has written to me deeply concerned that she will be excluded from society because of this digital ID policy. The Chief Secretary to the Prime Minister, the right hon. Member for Bristol North West (Darren Jones), cited that 93% of the population have a smartphone, as if that justified digital ID. That statistic means that approximately 4.5 million people—just like Julie—will not gain control but lose it.

As my hon. Friend the Member for North Norfolk (Steff Aquarone) rightly pushed for during debate on the data Bill, people must have the right to non-digital identification. That includes the right to work with non-digital ID. Where is the fairness for people such as Julie in this mandatory system? That is before we consider the 8.5 million people working in the UK who lack even the most basic digital skills. Leon, who works in IT in Tring, sees this reality every day. He has written to me saying that many of his colleagues struggle with basic smartphone tasks; a digital ID will force them to navigate an entirely new system on top of that. What are the Government’s plans to upskill millions of workers, or will this be yet another burden dumped on businesses?

Speaking of cost, experts are clear that the proposal will cost taxpayers billions, on top of a trail of failed Government IT projects. Ask European citizens in the UK who have been plagued by the e-visa app’s failures, which have resulted in people being wrongly denied work, housing, education and welfare. Analysis commissioned by the Liberal Democrats shows that, of 24 major Whitehall schemes currently under way, two are already rated as undeliverable and 16 are facing significant issues. From NHS patient records to digital tax systems, the total cost of those failed or delayed projects already stands at more than £31 billion.

Ben Maguire (North Cornwall) (LD)

I have just come from a meeting with WASPI women here in Parliament who are asking for £3 billion in compensation, which they are rightly owed. The Government have said that they do not have that money—they have actually taken that group to court—yet here we are: they have pulled £2 billion out of the hat. Does my hon. Friend agree that the priorities are really wrong here?

Victoria Collins

Absolutely, and I was about to say that while frontline services are crumbling and people need those billions of pounds, what we are seeing here is billions being spent, millions of people being excluded and freedoms being eroded. And for what? How much taxpayer money are the Government prepared to waste on this scheme, for which they have no mandate and no public support? While those frontline services are bursting at the seams, the Government have squandered the opportunity to use technology to improve services, instead undermining trust by seemingly flip-flopping on this patchwork policy.

On 26 September, the Prime Minister announced digital ID with promises to control borders and tackle irregular migration. Last week, that narrative had all but vanished, with a shift to talking about anything from handling daycare to buying a drink. The Secretary of State herself admitted that digital ID would not be the “silver bullet” to end migration as initially promised; as the hon. Member for York Central (Rachael Maskell) highlighted, as did many others, we know it will not solve the problem. Meanwhile, the Foreign Secretary defended extending digital ID to 13-year-olds—something that the Government have still not ruled out.

Why are this Government so determined to press ahead? I support improving digital services on a voluntary basis, but we can modernise without mandating and must leave room for non-digital choice. Allegedly, this is about easier access to Government services, but surely we should be working on improving what we already have.

The gov.uk One Login, the voluntary gateway to digital Government, needs much improvement. As the right hon. Member for Goole and Pocklington (David Davis) highlighted, there are many concerns about security as well. Should we not fix those services, rather than create new ones?

--- Later in debate ---
Ian Murray

Of course it is a consultation. It is about how we get this right, what it looks like, how it is built, how federated data is secured, how we deal with digital inclusion and how we deal with the issues in Northern Ireland. That is what the consultation is about. It is about the Government learning from that. [Interruption.] Liberal Democrat Members are heckling from a sedentary position, but their own leader, the right hon. Member for Kingston and Surbiton (Ed Davey), said on 21 September that “times have changed”, and that he had been impressed by a visit to Estonia, where a liberal Government had brought in digital ID. He said that if a system was

“giving individuals power to access public services”,

he could be in favour. Four days later, he said that

“the Liberal Democrats will fight against it tooth and nail”.

It is the same hypocrisy as the Scottish National party; it was their policy five days before they came out against it.

Victoria Collins

I would just highlight that what was stated was about the system being voluntary and about choice. We are saying that a mandatory system is a problem. Do this Government want to grow this economy or not? Do they want to give people who want to work a real choice? I do not see that at all.

Ian Murray

This is about reconnecting citizens with Government. Everyone will have constituents coming to every one of their surgeries with a form they cannot fill out, a piece of maladministration in public services, something they cannot access or a difficulty in getting access to benefits. There are still people in this country who are entitled to huge parts of the benefit system but do not claim. There are people who will need this for verification of their identity and age when buying alcohol—all things that are a big inconvenience for people. This is about reconnecting citizens with Government—modernising government, as we have heard from the Opposition spokesperson, the hon. Member for Runnymede and Weybridge (Dr Spencer). It is about making sure that the Government can be effective and can operate in the digital age with a digital population. This happens in many other countries around the world. I do not have time to run through all of them now, but hon. Members can look them up.

Let me take on two issues before I finish. The first is data and security. This is a federated data system, so I say to the hon. Member for Dewsbury and Batley (Iqbal Mohamed) that his idea of bringing it all together in one database is the wrong option. The data does not move; it sits with the Government Department, and the digital ID system, or whatever system is used, goes into those datasets and brings out affirmative or otherwise—

Employer National Insurance Contributions: Charities

Tuesday 7th January 2025

Westminster Hall

Victoria Collins (Harpenden and Berkhamsted) (LD)

It is a pleasure to serve under your chairmanship, Ms Vaz, and I congratulate the hon. Member for Isle of Wight East (Joe Robertson) on securing this debate. The Government’s proposed changes pose a severe threat to vital charities at the heart of our communities; we have heard that again and again. Age UK in Hertfordshire has calculated that the rises will impose an additional cost of £85,000 per annum; combined with unexpected increases in operational costs, that has pushed its total cost increases close to £250,000.

Higher national insurance contributions mean increased costs, reduced capacity to hire and retain staff, and ultimately fewer resources to deliver the services our communities rely on. The wonderful team at the Hospice of Saint Francis in Berkhamsted shared with me the heartbreaking experience of having to turn people away from its health and wellbeing service, nursing support and at-home support.

The situation will only get worse. Time and again, charities have spoken to me about how the Government’s snap decision undervalues their essential work, such as supporting the covid-19 vaccine roll-out, picking up the pieces after the winter fuel allowance was cut and filling the gaps left by the last Conservative Government. With our NHS and public services in crisis, I urge the Government to reconsider these national insurance rises for charities.