Cyber Security and Resilience (Network and Information Systems) Bill (First sitting) Debate
Public Bill Committees
The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Kanishka Narayan)
Q
Jen Ellis: Again, that is a hugely complex question to cover in a short amount of time. One of the challenges that we face in the UK is that our economy is 99% small and medium-sized businesses. It is hard to think about how to place more burdens on small and medium businesses, what they can reasonably get done and what resources are available. That said, that is the problem that we have to deal with; we have to figure out how to make progress.
There is also a challenge here, in that we tend to focus a lot on the behaviour of the victim. It is understandable why—that is the side that we can control—but we are missing the middle piece. There are the bad guys, who we cannot control but who we can try to prosecute and bring to task; and there are the victims, who we can control, and we focus a lot on that—the CSRB focuses on that side. Then there is the middle ground of enablers. They are not intending to be enablers, but they are the people who are creating the platforms, mediums and technology. I am not sure that we are where we could be in thinking about how to set a baseline for them. We have a lot of voluntary codes, which is fantastic—that is a really good starting point—but the question is how much value a voluntary approach delivers and how much behavioural change it actually drives. What you see is that the organisations that are already doing well and taking security seriously are following the voluntary codes because they were already investing, but there is a really long tail of organisations that are not.
Any policy approach, legislation or otherwise, comes down to the fact that you can build the best thing in the world, but you need a plan for adoption or the engagement piece—what it looks like to go into communities and see how people are wrestling with this stuff and the challenges that are blocking adoption. You also need to think about how to address and remove those challenges, and, where necessary, how to ensure appropriate enforcement, accountability and transparency. That is critical, and I am not sure that we see a huge amount of that at the moment. That is an area where there is potential for growth.
With CSRB, the piece around enforcement is going to be critical, and not just for the covered entities. We are also giving new authorities to the regulators, so what are we doing to say to them, “We expect you to use them, to be accountable for using them and to demonstrate that your sector is improving”? There need to be stronger conversations about what it looks like not to meet the requirements. We should be looking more broadly, beyond just telling small companies to do more. If we are going to tell small companies to do more, how do we make it something that they can prioritise, care about and take seriously, in the same way that health and safety is taken seriously?
David Cook: To achieve the outcome in question, which is about the practicalities of a supply chain where smaller entities are relying on it, I can see the benefit of bringing those small entities in scope, but there could be something rather more forthright in the legislation on how the supply chain is dealt with on a contractual basis. In reality, we see that when a smaller entity tries to contract with a much larger entity—an IT outsourced provider, for example—it may find pushback if the contractual terms that it asks for would help it but are not required under legislation.
Where an organisation can rely on the GDPR, which has very specific requirements as to what contracts should contain, or the Digital Operational Resilience Act, which is a European financial services law and is very prescriptive as to what a contract must contain, any kind of entity doing deals and entering into a contract cannot really push back, because the requirements are set out in stone. The Bill does not have a similar requirement as to what a contract with providers might look like.
Pushing that requirement into the negotiation between, for example, a massive global IT outsourced provider and a much smaller entity means either that we will see piecemeal clauses that do not always achieve the outcomes you are after, or that we will not see those clauses in place at all because of the commercial reality. Having a similarly prescriptive set of requirements for what that contract would contain means that anybody negotiating could point to the law and say, “We have to have this in place, and there’s no wriggle room.” That would achieve the outcome you are after: those small entities would all have identical contracts, at least as a baseline.
Emily Darlington (Milton Keynes Central) (Lab)
Q
David Cook: The original NIS regulations came out of a directive from 2016, so this is 10 years old now, and the world changes quickly, especially when it comes to technology. Not only is this supply chain vulnerability systemic, but it causes a significant risk to UK and global businesses. Ransomware groups, threat actors or cyber-criminals—however you want to badge that—are looking for a one-to-many model. Rather than going after each organisation piecemeal, if they can find a route through one organisation that leads to millions, they will always follow it. At the moment, they are out of scope.
The reality is that those organisations, which are global in nature, often do not pay due regard to UK law because they are acting all over the world and we are one of many jurisdictions. They are the threat vector that is allowing an attack into an organisation, but it then sits with the organisations that are attacked to deal with the fallout. Often, although they do not get away scot-free, they are outside legislative scrutiny and can carry on operating as they did before. That causes a vulnerability. The one-to-many attack route is a vulnerability, and at the moment the law is lacking in how it is equipped to deal with the fallout.
Jen Ellis: In terms of what the landscape looks like, our dialogue often has a huge focus on cyber-crime and we look a lot at data protection and that kind of thing. Last year, we saw the impact of disruptive attacks, but in the past few years we have also heard a lot more about state-sponsored attacks.
I do not know how familiar everyone in the room is with Volt Typhoon and Salt Typhoon; they were widespread nation-state attacks that were uncovered in the US. We are not immune to such attacks; we could just as easily fall victim to them. We should take the discovery of Volt Typhoon as a massive wake-up call to the fact that although we are aware of the challenge, we are not moving fast enough to address it. Volt Typhoon particularly targeted US critical infrastructure, with a view to being able to massively disrupt it at scale should a reason to do so arise. We cannot have that level of disruption across our society; the impacts would be catastrophic.
Part of what the CSRB is looking to do is to take NIS and update it to make sure that it is covering the relevant things, but I also hope that we will see a new level of urgency and an understanding that the risks are very prevalent and are coming from different sources with all sorts of different motivations. There is huge complexity, which David has spoken to, around the supply chain. We really need to see the critical infrastructure and the core service providers becoming hugely more vigilant and taking their role as providers of a critical service very seriously when it comes to security. They need to think about what they are doing to be part of the solution and to harden and protect the UK against outside interference.
David Cook: By way of example, NIS1 talks about reporting to the regulator if there is a significant impact. What we are seeing with some of the attacks that Jen has spoken about is pre-positioning, whereby a criminal or a threat actor sits on the network and the environment and waits for the day when they are going to push the big red button and cause an attack. That is outside NIS1: if that sort of issue were identified, it would not be reportable to the regulator. The regulator would therefore not have any visibility of it.
NIS2 and the Bill talk about something being identified that is caused by or is capable of causing severe operational disruption. It widens the ambit of visibility and allows the UK state, as well as regulators, to understand what is going on in the environment more broadly, because if there are trends—if a number of organisations report to a regulator that they have found that pre-positioning—they know that a malicious actor is planning something. The footprints are there.
Freddie van Mierlo (Henley and Thame) (LD)
Q
Jen Ellis: You have covered a lot of territory there; I will try to break it down. If you look at the attacks last year, all the companies you mentioned were investing in cyber-security. There is a difficulty here, because there is no such thing as being bullet-proof or secure. You are always trying to raise the barriers as high as you can and make it harder for attackers to be successful. The three attacks you mentioned were highly targeted attacks. The example of Volt Typhoon in the US was also highly targeted. These are attackers who are highly motivated to go after specific entities and who will keep going until they get somewhere. It is really hard to defend against stuff like that. What you are trying to do is remove the chances of all the opportunistic stuff happening.
So, first, we are not going to become secure as such, but we are trying to minimise the risk as much as possible. Secondly, it is really complex to do it; we saw last year the examples of companies that, even though they had invested, still missed some things. Even in the discussions that they had had around cyber-insurance, they had massively underestimated the cost of the level of disruption that they experienced. Part of it is that we are still trying to figure out how things will happen, what the impacts will be and what that will look like in the long term.
There is also a long tail of companies that are not investing, or not investing enough. Hopefully, this legislation will help with that, but more importantly, you want to see regulators engaging on the issue, talking to the entities they cover and going on a journey with them to understand what the risks are and where they need to get to. If you are talking about critical providers and essential services, it is really hard for an organisation—in its own mind or in being answerable to its board or investors—to justify spend on cyber-security. If you are a hospital saying that you are putting money towards security programmes rather than beds or diagnostics, that is an incredibly difficult conversation to have. One of the good things about CSRB, hopefully, is that it will legitimise choices and conversations in which people say, “Investing time and resources into cyber-security is investing time and resources into providing a critical, essential service, and it is okay to make those pay-off choices—they have to be made.”
Part of it is that when you are running an organisation, it is so hard to think about all the different elements. The difference with cyber-security—we need to be clear about this—is that with a lot of the things we ask organisations to do, you say, “You have to make this investment to get to this point,” and then you move on. So they might take a loan, the Government might help them in some way, or they might deprioritise other spending for a set period so that they can go and invest in something, get up to date on something or build out something; then they are done, and they can move back to a normal operating state.
Security is not that. It is expensive, complex and multifaceted. We are asking organisations of all sizes in the UK, many of which are not large, to invest in perpetuity. We are asking them to increase investment over time and build maturity. That is not a small ask, so we need to understand that there are very reasonable dynamics at play here that mean that we are not where we need to be. At the same time, we need a lot more urgency and focus. It is really important to get the regulators engaged; get them to prioritise this; have them work with their sectors, bring their sectors along and build that maturity; and legitimise the investment of time and resources for critical infrastructure.
Chris Vince
Q
Matt Houlihan: I am very happy to. Two main comparators come to mind. One is the EU, and we have talked quite a bit about NIS2 and the progress that has been made. NIS2 does take a slightly different approach to that of the UK Government, in that it outlines, I think, 18 different sectors, up from seven under NIS1. There is that wide scope in terms of NIS2.
Although NIS2 is an effective piece of legislation, its implementation remains patchy across the EU. Something like 19 of the 27 EU member states have implemented it to date in their national laws. There is clearly a bit of work still to do there. There is also some variation in how NIS2 is being implemented, which we feel as an international company operating right across the European Union. As has been touched on briefly, there is now a move, through what are called omnibus proposals, to simplify the reporting requirements and other elements of cyber-security and privacy laws across the EU, which is a welcome step.
I mentioned in a previous answer the work that Australia has been doing, and the Security of Critical Infrastructure Act 2018—SOCI—was genuinely a good standard and has set a good bar for expectations around the world. The Act has rigorous reporting requirements and caveats and guardrails for Government step-in powers. It also covers things like ransomware, which we know the UK Home Office is looking at, and Internet of Things security, which the UK Government recently looked at. Those are probably the two comparators. We hope that the CSRB will take the UK a big step towards that, but as a lot of my colleagues have said, there is a lot of work to do in terms of seeing the guidance and ensuring that it is implemented effectively.
Chris Anley: On the point about where we are perhaps falling behind: on streamlining of reporting, we have already mentioned Australia and the EU, where that work is in progress. On the protection of defenders, other territories are already benefiting from those protections—the EU, the US and, as I mentioned, Portugal especially. As a third and final point, Australia is an interesting one, as it provides a cyber-safety net to small and medium-sized enterprises, offering cyber expertise from the Government to enable smaller entities to get up to code and achieve resilience where they lack the personnel and funding.
Emily Darlington
Q
Dr Ian Levy: The previous set of witnesses talked about board responsibility around cyber-security. In my experience, whether a board is engaged or not is a proxy indicator for whether they are looking at risk management properly, and you cannot change corporate culture through regulation—not quickly. There is something to be done around incentives to ensure that companies are really looking at their responsibilities across cyber-security. As the previous panellists have said, this is not just a technical thing.
One of the things that is difficult to reconcile in my head—and always has been—is trying to levy national security requirements on companies that are not set up to do that. In this case I am not talking about Amazon Web Services, because AWS invests hugely in security. We have a default design principle around ensuring that the services are secure and private by design. But something to consider for the Bill is not accidentally putting national security requirements on those entities that cannot possibly meet them.
When I was in government, in the past we accidentally required tiny entities, which could not possibly do so, to defend themselves against the Russians in cyber-space. If you translate that to any other domain—for example, saying that a 10-person company should defend itself against Russian missiles—it is insane, yet we do it in cyber-space. Part of the flow-down requirements that we see for contracting, when there is a Bill like this one, ends up putting those national security requirements on inappropriate entities. I really think we need to be careful how we manage that.
Matt Houlihan: Can I make two very quick points?
The Chair
Very briefly—yes.
Matt Houlihan: My first point is on the scale of the challenge. In Cisco’s own research, we released a cyber-security readiness index, a survey of 8,000 companies around the world, including in the UK, in which we graded companies by their cyber maturity. In the UK, only 8% of companies—and these are large companies—were in the mature bracket, which shows the scale of the challenge.
The other point I want to make relates to its being a cyber-security and resilience Bill, and the “resilience” bit is really important. We need to focus on what that means in practice. There are a lot of cyber measures that we need to put in place, but resilience is about the robustness of the technology being used, as well as the cyber-security measures, the people and everything else that goes with it. Looking at legacy technology, for example—obsolete technology, which is more at risk—should also be part of the standards and, perhaps, the regulatory guidance that is coming through. I know that the public sector is not part of the Bill, but I mention the following to highlight the challenge: over a year ago, DSIT published a report that showed, I think, that 28% of Government systems were in the legacy, unsupported, obsolete bracket. That highlights the nature of the challenge in this space.