Cyber Security and Resilience (Network and Information Systems) Bill (Second sitting) Debate
Public Bill Committees
The Chair
Good afternoon. We will now hear oral evidence from Ian Hulme, the interim executive director of regulatory supervision and director of regulatory assurance for the Information Commissioner’s Office; Natalie Black, group director for infrastructure and connectivity for Ofcom; and Stuart Okin, director of cyber regulation and artificial intelligence for Ofgem. We need to stick to the timings in our programme order, so we have until 2.40 pm for this session. Could the witnesses please introduce themselves briefly before we hand over for questions?
Ian Hulme: Good afternoon. My name is Ian Hulme, and I am interim executive director of regulatory supervision at the ICO.
Natalie Black: Good afternoon. I am Natalie Black, and I am group director for infrastructure and connectivity at Ofcom.
Stuart Okin: My name is Stuart Okin; good afternoon. I am the director for cyber regulation and artificial intelligence at Ofgem.
Q
My second question is jointly for Ian and Stuart, from the ICO and Ofgem. Some industry stakeholders have expressed concern about low levels of incident reporting and enforcement under the NIS1—network and information systems—regs. How will your respective approaches to regulation change as a result of this Bill, to ensure that it is implemented and that cyber-resilience is improved across the sectors you are responsible for regulating?
Natalie Black: I will kick off. We have some additional responsibilities, building on the NIS requirements, but the data centre aspect of the Bill is quite a substantial increase in responsibilities for us. It is worth emphasising that we see that as a natural evolution of our responsibilities in the sector. Communications infrastructure is evolving incredibly quickly, as you will be well aware, and data centres are the next big focus. In terms of preparations, we are spending this time getting to know the sector and making sure we have the right relationships in place, so that we do not have a standing start. I have done a number of visits, for example, to hear at first hand from industry representatives about their concerns and how they want to work with us.
We are also focusing on skills and recruitment. We already have substantial cyber-security responsibilities in the communications infrastructure sector. We are building on the credibility of the team, but we are focused on making sure we continue to invest in them. About 60% of the team already come from the private sector. We want that to continue going forward, but we are not naive to how challenging it is to recruit in the cyber-security sector. For example, we are working with colleagues from the National Cyber Security Centre, and looking at universities it is accrediting, to see how we can recruit directly using those kinds of opportunities.
Ian Hulme: On incident reporting, the thresholds in the existing regulations mean that levels are very low. Certainly, the reports we see from digital service providers do not meet those thresholds. I anticipate that we will see more incidents reported to us. With our enhanced regulatory powers and the expanded scope of organisations we will be responsible for, I anticipate that our oversight will deepen and we will have more ability to undertake enforcement activity. Certainly from our perspective, we welcome the enhanced reporting requirements.
Stuart Okin: To pick up on the incident side of things, I agree with Ian. The thresholds will change. With the new legislation, any type of incident that could potentially cause an issue will obviously be reported, whereas that does not happen today under the NIS requirements.
On enforcement, in seven years we have used all the enforcement regimes available to us, including penalties, and we will continue to do so. We absolutely welcome the changes in the Bill to simplify the levels and to bring them up, similar to the sectoral powers that we have today.
Chris Vince (Harlow) (Lab/Co-op)
Q
Stuart Okin: In the energy sector, we tend to use operational technology rather than IT systems. That might mean technology without a screen, so an embedded system. It is therefore important to be able to customise our guidance. We do that today. We use the cyber assessment framework as a baseline, and we have a 335-page overlay on our website to explain how that applies to operational technology in our particular space. It is important to be able to customise accordingly; indeed, we have added physical elements to the cyber assessment framework, which is incredibly important. We welcome that flexibility being maintained in the Bill.
Ian Hulme: Just to contrast with colleagues from Ofcom and Ofgem, the ICO’s sector is the whole economy, so it is important that we are able to produce guidance that speaks to all the operators in that sector. Because our sector is much bigger, we currently have something like 550 digital service providers registered, and that will grow significantly with the inclusion of managed service providers. So guidance will be really important to set expectations from a regulatory perspective.
Natalie Black: To round this off, at the end of the day we always have to come back to the problem we are trying to solve, which is ensuring cyber-security and resilience. As you will have heard from many others today, cyber is a threat that is always evolving. The idea that we can have a stagnant approach is for the birds. We need to be flexible as regulators. We need to evolve and adapt to the threat, and to the different operators we will engage with over the next couple of years. Collectively, we all appreciate that flexibility.
Tim Roca
Q
Ian Hulme: There is a balance to be struck. When something is written on the face of the Bill and things change—and we know that this is a fast-moving sector—it makes it incredibly difficult to change things. There is a balance to be struck between primary and secondary, but what we are hearing and saying is that more precision around some of the definitions will be critical.
Natalie Black: I strongly agree with Ian. A regulator is only as good as the rules that it enforces. If you want us to hold the companies to account, we need to be absolutely clear on what you are asking us to do. The balance is just about right in terms of primary and secondary, particularly because the secondary vehicle gives us the opportunity to ensure that there is a lot of consultation. The Committee will have heard throughout the day—as we do all the time from industry—that that is what industry is looking for. They are looking for periods of business adjustment—we hear that loud and clear—and they really want to be involved in the consultation period. We also want to be involved in looking at what we need to take from the secondary legislation into codes of practice and guidance.
Q
Natalie Black: That is a great question, and I am not at all surprised that you have asked it, given everything that is going on at the moment. As well as being group director for infrastructure and connectivity, I am also the executive member of the board, sitting alongside our chief executive officer, so from first-hand experience I can say that Ofcom really recognises how fast technology is changing. I do not think there is another sector that is really at the forefront of change in this way, apart from the communications sector. There are a lot of benefits to being able to sit across all that, because many of the stakeholders and issues are the same, and our organisation is learning to evolve and adapt very quickly with the pace of change. That is why the Bill feels very much like a natural evolution of our responsibility in the security and resilience space.
We already have substantial responsibilities under NIS and the Telecommunications (Security) Act 2021. We are taking on these additional responsibilities, particularly over data centres, but we already know some of the actors and issues. We are using our international team to understand the dynamics that are affecting the Online Safety Act, which will potentially materialise in the security and resilience world. As a collective leadership team, we look across these issues together. The real value comes from joining the dots. In the current environment, that is where you can make a real difference.
Q
Natalie Black: That is definitely not what I am saying. You can cut the cake in many different ways. From where I sit—from my experience to date—you need specific sector regulators because you need regulators that understand the business dynamics, the commercial dynamics, the people dynamics and the issues on a day-to-day basis.
We have many people who have worked at Ofcom for a very long time, and who know the history and have seen these issues before. When it comes to threats, which is ultimately what we are dealing with—cyber-security is a threat—it is cross-cutting. It adapts, evolves and impacts in different ways. The knack is having a sector regulator that really understands what is going on. That means that when you are dealing with cyber-incidents, you understand the impact on real people and businesses, and ultimately you can do something more quickly about it.
Q
Stuart Okin: We have a clear understanding of the responsibilities within Ofgem. We are the joint competent authority with the Department for Energy Security and Net Zero. The Department does the designation and incident handling, and we do all the rest of the operations, including monitoring, enforcement and inspections. We understand our remit with the NCSC, which is part of GCHQ; as the cyber-security incident response team, it is ultimately responsible there.
Going back to your main concern, we are part of an ecosystem. We have to understand where our lines are drawn, where NCSC’s responsibilities are and what the jobs are. To go back to us specifically, we can talk about engineering aspects, electrical engineering, gas engineering and the cyber elements that affect that, including technology resilience—not cyber. As long as we have clear gateways and communication between each other—and I think that the Bill provides those gateways—that will also assist, but there are clear lines of responsibilities.
Natalie Black: It is clear that there is work to do to get to the same place for the Bill. Exactly as Stuart said, the information gateways will make a massive difference. It is too hard, at the moment, to share information between us and with the National Cyber Security Centre. The fact that companies will have to report within 24 hours not only to us but to the NCSC is very welcome.
To return to my earlier point, we think that there is a bit of work for DSIT to do to help to co-ordinate this quite complicated landscape, and I think that industry would really welcome that.
Ian Hulme: I agree with colleagues. From an ICO perspective, we see our responsibilities as a NIS competent authority as complementary to our role as a data protection regulator. If you want secure data, you have to have secure and resilient networks, which are obviously used to process data. We see it as a complementary set of regulations to our function as a data protection regulator.
David Chadwick (Brecon, Radnor and Cwm Tawe) (LD)
Q
It strikes me that, if one of the things that this legislation is to guard against is pre-positioning, and there are 14 parallel reporting systems in place, it could be the case that those pre-positioning attacks are not picked up as co-ordinated attacks from another nation state or organisation, because they are not pulled together in time.
Natalie Black: I point to my earlier remarks about information sharing. You are right: that is one of the great benefits of the Bill. To be able to do more, particularly when it comes to pre-positioning attacks, is really important. You will have heard from the NCSC, among others, that that is certainly a threat that we are seeing more and more of.
At the moment, it is too difficult to share information between us. The requirement to have an annual report to the NCSC is a good mechanism for consolidating what we are all seeing, and then for the NCSC to play the role of drawing conclusions. It is worth emphasising that Ofcom is not an operational organisation; we are a regulator. We look to the NCSC to provide threat leadership for what is going on across the piece. I think that that answers your question about where it all comes together.
Stuart Okin: I fully support that. The NCSC will be the hub for that type of threat intel and communications, in terms of risks such as pre-positioning and other areas. The gateways will help us to communicate.
Ian Hulme: Bringing it back to the practicalities of incident reporting, you said that there are potentially 14 lines of incident reporting because there are 14 competent authorities. How that can be consolidated is something to be explored. Put yourself in the position of an organisation that is having to make a report: there needs to be clarity on where it has to make it to and what it needs to report.
Q
Secondly, it has been reported recently that communications of senior Government aides were hacked by Chinese state affiliates between 2021 and 2024. In view of that threat to telecoms networks, what are the potential cyber-risks to communications infrastructure that you see arising from the intended location of China’s super-embassy in the City of London?
Chung Ching Kwong: On the first question, about what can be done to help sectors understand the risks, education is paramount. At this point, we do not have a comprehensive understanding of what kind of risks state actors like China pose. We are very used to the idea that private entities are private entities, because that is how the UK system works; we do not recognise that organisations, entities or companies associated with China or the Chinese state are not the independent actors we would expect, or want, them to be.
There is a lot of awareness-raising to be done and guidance to be issued around how to deal with these actors. There is a lot of scholarly work that says that every part of Chinese society—overseas companies and so on—is a node of intelligence collection within the system of the CCP. Those things are very important when it comes to educating.
Also, the burden of identifying what is a national security risk and what is not should not be put on small and medium-sized businesses, or even big companies, because they are not trained to understand what the risks are. If you are not someone specialising in the PLA and a lot of other things academically, it would be very difficult to have to deal with those things on a day-to-day basis and identify, “That’s a threat, and that’s a threat.”
Sorry, what was the second question?
Q
Chung Ching Kwong: There is not a lot of publicly available information on the sensitive cabling that is around the area, so I cannot confidently say what is really going to happen if they start to build the embassy and have such close contact with those cables. The limit of this Bill when it comes to the Chinese embassy is that it cannot mitigate the risks that are posed by this mega-embassy in the centre of London, because it regulates operators and not neighbours or any random building in the City. If the embassy uses passive interception technology to harvest data from local wi-fi or cellular networks, no UK water or energy company is breached. There is no breach if they are only pre-positioning there to collect information, instead of actually cutting off the cables, so when they do cut off the cables, it will be too late. There will be no report filed under the Bill, even if it is under the scope of the Bill when it comes to regulation. The threat in this case is environmental and really bypasses the Bill’s regulatory scope.
Dave Robertson (Lichfield) (Lab)
Q
Chung Ching Kwong: I think that to a certain extent they will. For hackers or malicious actors aiming for financial gain with more traditional hacking methods, it will definitely do a job in protecting our national security. But the Bill currently views resilience through an IT lens. It is viewing this kind of regulatory framework as a market regulatory tool, instead of something designed to address threats posed by state-sponsored actors. It works for cyber-criminals, but it does not work for state actors such as China, which possess structural leverage over our infrastructure.
As I said before, we have to understand that Chinese vendors are legally obliged to compromise once they are required to. The fine under the Bill is scary, but not as scary as having your existence threatened in China—whether you still have access to that market or you can still exist as a business there. It is not doing the job to address state-sponsored hackers, but it really does help when it comes to traditional hacking, such as phishing attempts, malware and those kinds of things.
The Chair
We will now hear evidence from Professor John Child, professor of criminal law at the University of Birmingham and co-founding director of the Criminal Law Reform Now Network. For this session, we have until 3.20 pm.
Q
Professor John Child: My specialism is in criminal law, so this is a bit of a side-step from a number of the pieces of evidence you have heard so far. Indeed, when it comes to the Bill, I will focus on—and the group I work for focuses on—the potential in complementary pieces of legislation, and particularly the Computer Misuse Act 1990, for criminalisation and the role of criminalisation in this field.
I think that speaks directly to the first question, on effective collaboration. It is important to recognise in this field, where you have hostile actors and threats, that you have a process of potential criminalisation, which is obviously designed to be effective as a barrier. But the reality is that, where you have threats that are difficult to identify and mostly originating overseas, the actual potential for criminalisation and criminal prosecution is slight, and that is borne out in the statistics. The best way of protecting against threats is therefore very much through the use of our cyber-security expertise within the jurisdiction.
When we think about pure numbers, and the 70,000-odd cyber-security private experts, compared with a matter of hundreds in the public sector, police and others, better collaboration is absolutely vital for effective resilience in the system. Yet what you have at the moment is a piece of legislation, the Computer Misuse Act, that—perfectly sensibly for 1990—went with a protective criminalisation across-the-board approach, whereby any unauthorised access becomes a criminal offence, without mechanisms to recognise a role for the private sector, because essentially there was not a private sector doing this kind of work at the time.
When we think about potential collaboration, first and foremost for me—from a criminal law perspective—we should make sure we are not criminalising effective cyber-security. The reality is that, when we look at the current system, if any unauthorised access of any kind becomes a criminal offence, you are routinely criminalising engagement in legitimate cyber-security, which happens as a matter of course across the board. If you are encouraging those cyber-security experts to step back from those kinds of practices—which may make good sense—you are also lessening that level of protection and/or outsourcing to other jurisdictions or other cyber-security firms, with which you do not necessarily have that effective co-operation, reporting and so on. That is my perspective. Yes, you are absolutely right, but we now have mechanisms in place that actively disincentivise that close collaboration and professionalisation.
Sarah Russell
Q
Professor John Child: Yes. It is not the easiest criminal law tale, if you like. If there were a problem of overcriminalisation in the sense of prosecutions, penalisation, high sentences and so on, the solution would be to look at a whole range of options, including prosecutorial discretion, sentencing or whatever it might be, to try to solve that problem. That is not the problem under the status quo. The current problem is purely the original point of criminalisation. Think of an industry carrying out potentially criminalised activity. Even if no one is going to be prosecuted, the chilling effect is that either the work is not done or it is done under the veil of potential criminalisation, which leads to pretty obvious problems in terms of insurance for that kind of industry, the professionalisation of the industry and making sure that reporting mechanisms are accurate.
We have sat through many meetings with the CPS and those within the cyber-security industry who say that the channels of communication—that back and forth of reporting—are vital. However, a necessary step before that communication can happen is the decriminalisation of basic practices. No industry can effectively be told on the one hand, “What you are doing is vital,” but on the other, “It is a criminal offence, and we would like you to document it and report it to us in an itemised fashion over a period of time.” It is just not a realistic relationship to engender.
The cyber-security industry has evolved in a fragmented way both nationally and internationally, and the only way to get those professionalisation and cyber-resilience pay-offs is by recognising that the criminal law is a barrier—not because it is prosecuting or sentencing, but because of its very existence. It does not allow individuals to say, “If, heaven forbid, I were prosecuted, I can explain that what I was doing was nationally important. That is the basis on which I should not be convicted, not because of the good will of a prosecutor.”
Dave Robertson
Q
Professor John Child: Yes. As I understand it, it does. This is part of the reason, incidentally, why my organisation, which focuses very much on criminal law aspects, ended up doing some collaborative work with the CyberUp campaign. That is because, from the industry perspective, they can do that kind of business modelling in a way that we do not. Whereas we can make the case for sensible criminal law reform, they can talk about how that reform translates into both the security environment and the commercial environment. Their perspective on this is, first, that we can see that there is already outsourcing of these kinds of services, particularly to the US, Israel and other more permissive jurisdictions. That is simply because, if you are a cyber-security expert in one of those jurisdictions, you are freer to do the work companies would like you to do to make sure their systems are safe here.
There are also the sectoral surveys and so on, and the predictions about what it is likely to do to the profession if you allow it to do these kinds of services in this jurisdiction. That is about the security benefits, but they are also talking about something like a 10% increase in the likely projection of what cyber-security looks like in this jurisdiction—personnel, GDP and so on.
Q
Professor John Child: There are obviously a number. It is always more comfortable when you have a beginning point of criminalisation. The argument to decriminalise in an environment where you want to protect against threats is sometimes a slightly unintuitive sell. Is the criminalisation that we have doing the necessary work in terms of actually fighting the threats? To some extent, yes, but it is limited. Is it doing harms? There is an argument to say that it is doing harms.
This comes back to the point that was made earlier, which was perfectly sensible. When you speak to the CPS and others, their position as prosecutors is to say, “Very few people are being prosecuted, and we certainly don’t want to be prosecuting legitimate cyber-security experts, so there is no problem.” Admittedly, that means there is no problem in terms of actual criminalisation and prosecution, but that is the wrong problem. If you focus on the problem being the chilling effect of the existence of the criminalisation in the first place, you simply cannot solve that through prosecutorial discretion, and nor should you, when it comes to identifying what a wrong is that deserves to be criminalised. You certainly cannot resolve it through sentencing provisions.
The only way that you can sensibly resolve this is either by changing the offence—that is very difficult, not least because, from a position of criminalisation, it might be where other civil jurisdictions begin—or by way of defence, which realistically is the best solution from the point we are at now. If you have a defence that can be specifically tailored for cyber-security and legitimate actors, you can build in reverse burdens of proof. You can build in objective standards of what is required in terms of public interest.
The point here is that the worry is one of bad actors taking advantage. The reality is that that is very unlikely. The idea that the bad actors we identify within the system would be able to demonstrate that they were acting in the public interest is almost ridiculous. Indeed, the prospect of better threat intelligence, better security and so on provides more information and better information-sharing to the NCSC and others, and actually leads to more potential for prosecution of nefarious actors rather than less.
It is a more complicated story than we might like in terms of a standard case for changing the criminal law, but it is nevertheless an important one.
The Chair
That brings us to the end of the time allotted to ask questions. On behalf of the Committee, I thank our witness for his evidence. We move on to our next panel.
Examination of witness
Detective Chief Superintendent Andrew Gould gave evidence.
The Chair
We will now hear oral evidence from Detective Chief Superintendent Andrew Gould, programme lead for the National Police Chiefs’ Council cyber-crime programme. For this session, we have until 3.40 pm. I call Dr Ben Spencer.
Q
Secondly, on ransomware attacks, you will know that the Government review states that ransomware is
“the greatest of all serious and organised cybercrime threats”.
In your view, what is the scale of that threat and what sectors and businesses are the primary targets?
DCS Andrew Gould: To take the actors first, they are probably quite well known, in terms of the general groupings. Yes, we have our state actors—the traditional adversaries that we regularly talk about—and they generally offer very much a higher-end capability, as you will all be aware.
The next biggest threat group is organised crime groups. You see a real diversity of capability within that. You will see some that are highly capable, often from foreign jurisdictions—Russian jurisdictions or Russian-speaking. The malware developers often offer the more sophisticated as-a-service-type offerings. We see more and more ransomware and other crime types almost operating as franchises—“Here is the capability, off you go, give us a cut.” Then they have less control over how those capabilities are used, so we are seeing a real diversification of the threat, particularly when it comes to ransomware.
Then, where you have that proximity to the state-directed, if not quite state-controlled—that crossover between some of those high-end crime groups and the state; I am thinking primarily of Russia—it is a lot harder to attribute the intent behind an attack. There is a blurring of who it was and what purpose it was done for, and there is that element of deniability because it is that one further step away.
Moving back down the levels of the organised crime groups, you have a real profusion of less capable actors within that space, from all around the world, driving huge volumes, often using quite sophisticated tools but not really understanding how they work.
What we have seen is almost like a fragmentation in the criminal marketplace. The barrier to criminal entry is probably lower than it has ever been. You can download these capabilities quite readily—you can watch a tutorial on YouTube or anywhere else on how to use them, and off you go, even if you do not necessarily understand the impact. We certainly saw a real shift post pandemic from traditional criminals and crime groups into more online crime, because it was easier and less risky.
You look more broadly at hacktivists, terrorists—who are probably a lot less capable; they might have the intent but not so much the capability—and then the group that are sometimes slightly patronisingly described as script kiddies. These are young individuals with a real interest in developing their skills. They have an understanding that what they are doing is wrong, but they are probably not financially or criminally motivated. If they were not engaging in that kind of cyber-crime, they probably would not be engaging in other forms of criminality, but they can still do a lot of damage with the tools they can get their hands on, given that so many organisations seem to struggle to deliver even a basic level of cyber-resilience and cyber-security.
One of the things that we have really noticed changing over the last 18 months is the diversification of UK threats. Your traditional UK cyber-criminal, if there is such a thing, is primarily focused on hacking for personal benefit, ransomware and other activity. Now we are seeing a diversification, and more of a hybrid, cross-organised crime threat. There are often two factors to that. We often hear it described in the media, or publicly by us within law enforcement, as “the Com” threat—this emerging community online—otherwise known as Scattered Spider.
There, we are seeing two elements to those sorts of groups. You see an element of maybe more traditional cyber-skills engaged in hacking, or using those skills for fraud, but we also see those skills being used for Computer Misuse Act offences in order to enable other offences. One of the big areas we see for that at the moment is intimate image abuse. We see more and more UK-based criminals hacking individuals’ devices to access, they hope, intimate images. They then identify the subject of those intimate images, predominantly women, and engage in acts of extortion, bullying or harassment. We have seen some instances of real-world contact away from that online contact.
Think of the scale of that and the challenge that presents to policing. I can think of cases in cyber-crime unit investigations across the country where you have got a handful of individuals who have victimised thousands of women in the UK and abroad. You have got these small cyber-crime units of a handful of people trying to manage 4,000 or 10,000 victims.
It is very difficult and very challenging, but the flipside of that is that, if they are UK-based, we have a much better chance of getting hold of them, so we are seeing a lot more arrests for those cross-hybrid threats, which is a positive. There is definitely an emerging cohort that then starts to blend in with threats like Southport and violence-fixated individuals. There seems to be a real mishmash of online threat coming together and then separating apart in a way that we have never seen historically. That is a real change in the UK threat that is driving a lot of policing activity.
Turning to your ransomware question, what is interesting, in terms of the kinds of organisations that are impacted by ransomware, is that a lot of the ransomware actors do not want to come to notice for hitting critical national infrastructure. They do not want to do a Colonial Pipeline. They do not want to be taking out hospitals and the NHS. They know they will not get paid if they hit UK critical national infrastructure, for starters, so there is a disincentive, but they also do not want that level of Government or law enforcement attention.
Think of the disruptive effect that the UK NCA and policing had on LockBit the year before last. LockBit went from being the No. 1 ransomware strain globally to being out of the top 10 and struggling to come back. We saw a real fragmentation of the ransomware market after that. No dominant strain or group has emerged to fill that space. A lot of the groups that are coming into that space may be a bit less skilled, sophisticated and successful.
The overall threat to organisations is pretty much the same. The volume is the volume, but it is probably less CNI and more smaller organisations because they are more vulnerable and it is less likely to play out very publicly than if there is a big impact on the economy or critical national infrastructure. As such, there is probably not the level of impact in the areas that people would expect, notwithstanding some of the really high-profile incidents we had last year.
David Chadwick
Q
DCS Andrew Gould: That is a really good question. The international jurisdiction challenge for us is huge. We know that is where most of the volumes are driven from, and obviously we do not have the powers to just go over and get hold of the people we would necessarily want to. You will not be surprised to hear that it really varies between jurisdictions. Some are a lot more keen to address some of the threats emanating from their countries than others. More countries are starting to treat this as more of a priority, but it can take years to investigate an organised crime group or a network, and it takes them seconds to commit the crime. It is a huge challenge.
There are two things that we could do more of, and do better—both are in train already. If you think about the wealth of cyber-crime, online fraud and so on, all the data and a lot of the skills and expertise to tackle it sit within the private sector, whereas in law enforcement we have the powers to take action to address some of it.
With a recent pilot in the City funded by the Home Office, we have started to move beyond our traditional private sector partnerships. We are working with key existing partners—blockchain analytics companies or open-source intelligence companies—and we are effectively in an openly commercial relationship; we are paying them to undertake operational activity on our behalf. We are saying, “Company A, B or C, we want you to identify UK-based cyber-criminals, online fraudsters, money laundering and opportunities for crypto-seizure under the Proceeds of Crime Act 2002”. They have the global datasets and the bigger picture; we have only a small piece of the puzzle. By working with them jointly on operations, they might bring a number of targets for us, and we can then develop that into operational activity using some of the other tools and techniques that we have.
It is quite early days with that pilot, but the first investigation we did down in the south-east resulted in a seizure of about £40 million-worth of cryptocurrency. That is off a commercial contract that cost us a couple of hundred grand. There is potential for return on investment and impact as we scale it up. It is a capability that you can point at any area of online threat, not just cyber-crime and fraud, so there are some huge opportunities for it to really start to impact at scale.
One of the other things we do in a much more automated and technical way—again funded by the Home Office—is the replacement of the Action Fraud system with the new Report Fraud system. That will, over the next year or so, start to ingest a lot of private sector datasets from financial institutions, open-source intelligence companies and the like, so we will have a much broader understanding of all those threats and we will also be able to engage in takedowns and disruptions in an automated way at scale, working with a lot of the communication service providers, banks and others.
Instead of the traditional manual way we have always been doing a lot of that protection, we can, through partnerships, start doing it in a much more automated and effective way at scale. Over time, we will be able to design out and remove a lot of the volume you see impacting the UK public now. That is certainly the plan.
The Chair
We will now hear oral evidence from Richard Starnes, chair of the information security panel for the Worshipful Company of Information Technologists. We have until 4 pm for this session.
Q
Richard Starnes: The question about effectiveness is difficult to answer. There is the apparent effectiveness and the actual effectiveness. The reason I answer in that way is that you have regulators that are operating in environments where they may choose to not publicly disclose how they are regulating; it may be classified due to the nature of the company that was compromised, or who compromised the company. There may not necessarily be a public view of how much of that regulation is actually going on. That is understandable, but it has the natural downside of creating instances where somebody is being taken to task for not doing it correctly, but that is not exposed to the rest of the world. You do not know that it is happening, so the deterrent effect is not there.
Information sharing and analysis centres started in the United States 20 or 25 years ago, when different companies were in the same boat. The first one that I was aware of was the Financial Services ISAC, which comprises large entities—banks, clearing houses and so on—that share intelligence about the types of attacks that they are receiving internationally. They may be competing with one another in their chosen businesses, but they are all in the same boat with regard to being attacked by whatever entities are attacking them. Those have been relatively good at helping develop defences for those industries.
Q
Richard Starnes: Yes. We have FS-ISAC operating in the United Kingdom and in Europe, with all the major banks, but if you took this and replicated it on an industry-by-industry basis, particularly ones in CNI, that would be helpful. It would also help with information sharing with entities like NCSC and GCHQ.
David Chadwick
Q
Richard Starnes: On what you say about the 18-month tenure, one of the problems is stress. A lot of CISOs are burning out and moving to companies that they consider to have boards that are more receptive to what they do for a living. Some companies get it. Some companies support their CISOs, and maybe have them reporting in parallel to the CIO, or chief information officer. A big discussion among CISOs is that having a CISO reporting to a CIO is a conflict of interest. A CISO is essentially a governance position, so you wind up having to govern your boss, which I would submit is a bit of a challenge.
How do we help CISOs? First, with stringent application of regulatory instruments. We should also look at or discuss the idea of having C-level or board-level executives specifically liable for not doing proper risk governance of cyber-security—that is something that I think needs to be discussed. Section 172 of the Companies Act 2006 states that you must act in the best interests of your company. In this day and age, I would submit that not addressing cyber-risk is a direct attack on your bottom line.
The Chair
We will now hear oral evidence from Brian Miller, head of IT security and compliance, and Stewart Whyte, data protection officer, both from NHS Greater Glasgow and Clyde and joining us online. For this session we have until 4.20 pm. Will the witnesses please introduce themselves for the record?
Brian Miller: Good afternoon, Chair. It is nice to see you all. I am Brian Miller and I head up IT security and compliance at NHS Greater Glasgow and Clyde. It is a privilege to be here, albeit remotely. I have worked at NHS Greater Glasgow and Clyde for four years. Prior to that, I was infrastructure manager at a local authority for 16 years and I spent 10 years at the Ministry of Defence in infrastructure management. I look at the Bill not only through the lens of working with a large health board, but from a personal perspective with a philosophy of “defenders win” across the entire public sector.
Stewart Whyte: Good afternoon, Chair, and everyone. My name is Stewart Whyte and I am the data protection officer at NHS Greater Glasgow and Clyde. I am by no means a cyber-security expert, but hopefully I can provide some insight into the data protection side and how things fit together.
Q
Brian Miller: That is a good question. Some of our colleagues mentioned the follow-up secondary legislation that will help us to identify those kinds of things. I suppose there is no difference from where we are at now. We would look at any provision of services from a risk management perspective and say what security controls apply. For example, would they be critical suppliers in terms of infrastructure and cyber-security? Does a cleaning service hold identifiable data? What are the links? Is it intrinsically linked from a technological perspective?
I mentioned looking at this through a “defenders win” lens. Yes, some of these technologies are covered. I saw some of the conversations earlier about local authorities not being in scope, but services are so intrinsically linked that they can well come into scope. It might well be that some of the suppliers you mentioned fall under the category of critical suppliers, but that might be the case just now. There might be provision of a new service for medical devices, which are a good example because they are unique and different compliance standards apply to them. For anything like that, where we stand just now—outside the Bill—we risk assess it. There is such an intrinsic link. A colleague on another panel mentioned data across the services; that is why Stewart is here alongside me. I look after the IT security element and Stewart looks after the data protection element.
Q
Brian Miller: Sometimes, but sometimes not. I do not think we had any physical links with Synnovis, but it did work on our behalf. Emails might have been going back and forward, so although there were no physical connections, it was still important in terms of business email compromise and stuff like that—there was a kind of ancillary risk. Again, when things like that come up, we would look at it: do we have connections with a third party, a trusted partner or a local authority? If we do, what information do we send them and what information do we receive?
Chris Vince
Q
Stewart Whyte: Anything that increases or improves our processes in the NHS for a lot of the procured services that we take in, and anything that is going to strengthen the framework between the health board or health service and the suppliers, is welcome for me. One of our problems in the NHS is that the systems we put in are becoming more and more complex. Being able to risk assess them against a particular framework would certainly help from our perspective. A lot of our suppliers, and a lot of our systems and processes, are procured from elsewhere, so we are looking for anything at all within the health service that will improve the process and the links with third party service providers.
Lincoln Jopp
Q
Brian Miller: I think we would work with the regulator, but we are looking for more detail in any secondary legislation that comes along. We have read what the designation of critical suppliers would be. I would look to work with the Scottish Health Competent Authority and colleagues in National Services Scotland on what that would look like.
Stewart Whyte: On how we would make that decision, from our perspective we are looking at what the supplier is providing and what sort of data they are processing on our behalf. From the NHS perspective, 90% of the data that we process will be special category, very sensitive information. It could be that, from our side, a lot of the people in the supply chain would fall into that designation, but for some other sectors it might not be so critical. We have a unique challenge in the NHS because of the service we provide, the effect that cyber-crime would have on our organisations, and the sensitivity of the data we process.
Q
Stewart Whyte: For me, it would be a slightly different assessment from Brian’s. We would be looking at anything where there is no processing of personal data. For me, that would not be a critical supplier from a data protection perspective. But there might be some other integration with NHS board systems that Brian might have concerns about. There is a crossover in terms of what we do, but my role is to look at how we manage data within the NHS. If there are suppliers where there is no involvement with identifiable data of either staff or patients, I would not see them as a critical supplier under this piece of legislation.
Lincoln Jopp
Q
Brian Miller: I do not want to step out of my lane. There will be clinical stuff that absolutely would be essential. I would not be able to speak in any depth on that part of it; I purely look at the cyber element of it. As an organisation, we would be identifying those kinds of aspects.
In terms of suppliers, you are absolutely right. We have suppliers that supply some sort of IT services to us. If we are procuring anything, we will do a risk assessment—that might be a basic risk assessment because it is relatively low risk, it might be a rapid risk assessment, or it may be a really in-depth assessment for someone that would be a critical supplier or that we could deem essential—but there are absolutely suppliers that would not fall under any of those criteria for the board. The board is large in scale, with 40,000 users. It is the largest health board in the country.
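To illustrate the kind of triage described in this evidence, here is a minimal sketch of how a board might route a supplier to a basic, rapid or in-depth assessment. It is an illustration only: the attribute names and decision rules are assumptions drawn loosely from the questions the witnesses mention (does the supplier hold identifiable data, is it technically integrated, does it support an essential service), not the board's actual criteria or anything defined in the Bill.

```python
from dataclasses import dataclass

@dataclass
class Supplier:
    # Illustrative attributes only; a real assessment would use far richer criteria.
    provides_it_services: bool          # supplies IT or infrastructure services to the board
    processes_identifiable_data: bool   # handles staff or patient data on the board's behalf
    technically_integrated: bool        # has network or system links into board infrastructure
    supports_essential_service: bool    # underpins delivery of an essential clinical service

def assessment_level(s: Supplier) -> str:
    """Choose the depth of risk assessment for a supplier.

    Sketch only: the labels ("basic", "rapid", "in-depth") follow the witness's
    description, but the thresholds here are assumptions for illustration.
    """
    if s.supports_essential_service or (s.processes_identifiable_data and s.technically_integrated):
        return "in-depth"   # potential critical or essential supplier
    if s.provides_it_services or s.processes_identifiable_data or s.technically_integrated:
        return "rapid"
    return "basic"          # e.g. a contract with no data or system links

# A cleaning contract with no data or system links sits at the basic end.
print(assessment_level(Supplier(False, False, False, False)))  # basic
# An integrated diagnostics partner processing patient data warrants the in-depth treatment.
print(assessment_level(Supplier(True, True, True, True)))      # in-depth
```

In this framing, the suppliers the witnesses describe as candidates for designation as critical suppliers are those that would consistently land in the in-depth category.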
Q
Stewart Whyte: Yes. There is a lot of information sharing between acute services and primary care via integrated systems. We send discharge letters and information directly to GP practices that then goes straight into the patient record with the GP. There is a lot of integration there, yes.
Q
Stewart Whyte: Yes, there is integration between ourselves and the local authorities.
The Chair
If there are no further questions from Members, I thank witnesses for their evidence. We will move on to the next panel.
Examination of Witnesses
Chris Parker MBE and Carla Baker gave evidence.
The Chair
We will now hear oral evidence from Chris Parker, director of government strategy at Fortinet and co-chair of the UK cyber resilience committee at techUK, and Carla Baker, senior director of government affairs in the UK and Ireland at Palo Alto Networks. For this session, we have until 4.50 pm.
Q
Carla, from the Palo Alto Networks perspective, what are your views on the changes to the incident reporting regime under the Bill? Will the approach help or hinder regulators in identifying and responding to the most serious threats quickly?
Chris Parker: I should point out that Carla is also co-chair of the cyber resilience committee, so you have both co-chairs here today.
As large cyber companies, we are very proud of one thing that is pertinent to the sector and may not be clear to everybody outside. I have worked in many sectors, and this is the most collaborative—most of it unseen—and sharing sector in the world. It has to be, because cyber does not respect borders. When it comes to the most vulnerable organisations, such as SMEs—which one would expect cannot afford as much, so price must be a factor—that is very dear to us; I was an SME owner in a previous life. With the technology that is available, what is really good news is that when people buy cyber-security for their small business—in the UK or anywhere in the world—they are actually buying the same technology; it is effectively just a different engine size in most cases. There are different phases of technology. There is the latest stuff that is coming in, which they may not be getting into yet. However, the first thing to say is that it is a very fair system, and pricing-wise, it is a very fair system indeed for SMEs.
The second point is about making sure we are aware of the amount of free training going on across the world, and most of the vendors—the manufacturers—do that. Fortinet has a huge system of free training available for all people. What does that give? It is not just technical training for cyber-security staff; it is for ordinary people, including administrative workers and the people who are sometimes the ones who let the bad actor in. There are a lot of efforts. There is a human factor, as well as technological and commercial factors.
The other thing I would like to mention is that the cyber resilience committee, which Carla and I are lucky to co-chair, is elected. We have elected quite a large proportion of SME members. There is also a separate committee run by techUK. You heard from Stuart McKean earlier today, and he is one of the co-chairs, or the vice chair, of that committee.
Carla Baker: On incident reporting, as I am sure you are aware, the Bill states that organisations must report an incident if it is
“likely to have an impact”.
Our view, and I think that of techUK, is that the definition is far too broad. Something that is likely to have an impact could be as simple as a phishing email that an organisation has received. Organisations receive lots and lots of spoof emails.
I will give an example. Palo Alto Networks is one of the largest pure-play cyber-security companies. Our security operations centre—the hub of our organisation—processes something like 90 billion alerts a day. That is just our organisation. Through analysis and automation, the number is whittled down to just over 20,000. Then, through technology and capabilities, it is further whittled down, so that we are analysing about 75 alerts.
You can equate it to a car, for example. If you are driving and see a flashing yellow light, something is wrong. That is like 20,000 alerts. It is then whittled down to about 75, so we would potentially have to report up to 75 incidents per day, and that is just one organisation. There are a lot more. The burden on the regulator would be massive because there would be a lot of noise. It would struggle to ascertain what is the real problem—the high-risk incidents that impact the UK as a whole—and the noise would get in the way of that.
We have come up with a suggestion, an amendment to the legislation, that would involve a more tiered approach. There would be a more measurable and proportionate reporting threshold, with three tiers. The first is an incident that causes material service disruption, affecting a core service, a critical customer or a significant portion of users. The second is unauthorised, persistent access to a system. The third is an incident that has compromised core security controls—that is, security systems. Having a threshold that is measurable and proportionate is easier for organisations to understand than referring to an incident that is
“likely to have an impact”,
because, as I said, a phishing email is likely to cause an impact if an organisation does not have the right security measures in place.
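To make the proposed threshold concrete, here is a minimal sketch of how the three-tier test described in this evidence might be expressed, in contrast with the Bill's broader “likely to have an impact” wording. The field names, classification order and examples are assumptions for illustration only, not definitions from the Bill or from any tabled amendment.

```python
from dataclasses import dataclass
from enum import Enum

class Tier(Enum):
    NOT_REPORTABLE = 0               # e.g. a routine phishing email that was blocked
    SERVICE_DISRUPTION = 1           # material disruption to a core service, critical customer or many users
    PERSISTENT_ACCESS = 2            # unauthorised, persistent access to a system
    CONTROLS_COMPROMISED = 3         # core security controls compromised

@dataclass
class Incident:
    # Illustrative fields; the legislation would define these terms precisely.
    material_service_disruption: bool
    persistent_unauthorised_access: bool
    core_controls_compromised: bool

def classify(incident: Incident) -> Tier:
    """Map an incident to the tiered reporting threshold described in evidence.

    This sketches the *proposed* three-tier test, not the Bill's current
    "likely to have an impact" wording, which would catch far more events.
    """
    if incident.material_service_disruption:
        return Tier.SERVICE_DISRUPTION
    if incident.persistent_unauthorised_access:
        return Tier.PERSISTENT_ACCESS
    if incident.core_controls_compromised:
        return Tier.CONTROLS_COMPROMISED
    return Tier.NOT_REPORTABLE

# A blocked phishing email: no disruption, no persistent access, no compromised controls.
print(classify(Incident(False, False, False)))  # Tier.NOT_REPORTABLE
# A quiet foothold left by an attacker (pre-positioning): persistent access, so still reportable.
print(classify(Incident(False, True, False)))   # Tier.PERSISTENT_ACCESS
```

On this framing, a dormant foothold would still be caught under the second tier, which is the point the witness returns to later when asked about activity that has occurred but has not yet had an impact.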
David Chadwick
Q
Chris Parker: That is an excellent question. The good news is that a lot is happening already. An enormous amount of collaborative effort is going on at the moment. We must also give grace to the fact that it is a very new sector and a new problem, so everybody is going at it. That leads me on to the fact that the UK has a critical role in this, but it is a global problem, and therefore the amount of international collaboration is significant—not only from law enforcement and cyber-security agencies, but from businesses. Of course, our footprints, as big businesses, mean that we are always collaborating and talking to our teams around the world.
In terms of what the UK can do more of, a lot of the things that have to change are a function of two words: culture and harmonisation—harmonisation of standards. It is about trying not to be too concerned about getting everything absolutely right scientifically, which is quite tempting, but making sure we can harmonise around international cyber-standards. It is about going after some commonality and those sorts of things.
I think the UK could have a unique role in driving that, as we have done with other organisations based out of London, such as the International Maritime Organisation for shipping standards. That is an aspiration, but we should all drive towards it. I think it is something the UK could definitely do because of our unique position in looking at multiple jurisdictions. We also have our own responsibilities, not only with the Commonwealth but with other bodies that we are part of, such as the United Nations.
It is not all good news. The challenge is that, as much as we know harmonisation is the aim, unfortunately everyone is already moving. Things have started, and everyone is running hot. An important point to make is that it is one of the busiest sectors in the world right now, and everybody is very busy. This comes back to the UK keeping a particular eye on regulatory load, versus what other elements of our society want, which is growth and economic prosperity. We talked earlier about SMEs. They do not have the capability to cover compliance and regulatory load easily, and we would probably all accept that. We have to be careful when talking about things such as designating critical suppliers.
All of this wraps up into increasing collaboration through public-private partnerships and building trust, so that when the Government and hard-working civil servants want to see which boundaries are right to push and which are not, bodies such as the UK cyber resilience committee, which Carla and I are on, can use those collaborative examples as much as possible.
There is quite a lot there, but something the UK certainly should be pushing to do is culture change, which we know has to be part of it—things have been talked about today by various speakers—as well as the harmonisation of standards.
Carla Baker: I think we are in a really interesting and exciting part of policy development: we have the Bill, and we have recently had the Government cyber action plan, which you may have heard about; and the national cyber action plan is coming in a few months’ time. The Government cyber action plan is internally facing, looking at what the Government need to do to address their resilience. The national cyber action plan is wider and looks at what the UK must do. We are at a really exciting point, with lots of focus and attention on cyber-security.
To address your point, I think there are three overarching things that we should be looking at. First is incentivising organisations, which is part of the Bill and will hopefully be a big part of the national cyber action plan. We must incentivise organisations to do more around cyber-security to improve their security posture. We heard from previous panellists about the threats that are arising, so organisations have to take a step forward.
Secondly, I think the Government should use their purchasing power and their position to start supporting organisations that are doing the right thing and are championing good cyber-security. There is more that the Government can do there. They could use procurement processes to mandate certain security requirements. We know that Cyber Essentials is nearly always on procurement tenders and all those types of things, but more can be done here to embed the need for enhanced security requirements.
Thirdly, I think a previous witness talked about information sharing. There is a bit of a void at the moment around information sharing. The cyber security information sharing partnership was set up, I think, 10 years ago—
Chris Parker: Yes, 10 years ago.
Carla Baker: It was disbanded a couple of months ago, and that has left a massive void. How does industry share intelligence and information about the threats they are seeing? Likewise, how can they receive information about the threat landscape? We have sector-specific things, but there isn’t a global pool, and there is a slight void at the moment.
Andrew Cooper (Mid Cheshire) (Lab)
Q
I can imagine that the legislation has been worded as it is to try to capture that situation where activity might occur, but not have an impact. Would you accept that that is important, and how would that fit in with the tiered approach that you described?
Carla Baker: I completely get your point. We have looked at that; my legal colleagues have looked at things such as spyware, where you have malware in the system that is not doing anything but is living there, for example, or pre-positioned, where they are waiting to launch an attack, and we think this amendment would still cover those scenarios. It is not necessarily about cause and impact: the lights have not gone out, but if there is, for example, a nation state actor in your network, we think the amendment would still cover that.
Q
Chris Parker: Yes, absolutely.
Carla Baker: Yes, completely. That is similar to my point, which was probably not explained well enough: how you are deemed critical should be more about your criticality to the entire ecosystem, not just to one organisation.
Q
Carla Baker: I think that is part of the issue of not having clear criteria for how regulators will designate. That also means that different regulators will take different approaches, so we would welcome more clarity and early consultation around the criteria that will be used for regulators to designate a critical dependency. That would prevent different regulatory approaches across the 12 different regulators, which we obviously do not want, and would give greater harmonisation and greater clarity, so that organisations know, “Okay, I might be brought in, because those are the clear criteria the Government will be using.”
David Chadwick
Q
Chris Parker: The consultation has been a best effort and I think it is a best effort as a function of three things. First, we have a new sector, a new Bill—something very new, it is not repeating something. Secondly, we are doing something at pace, it is a moving target, we have to get on with this, and so there is some compulsion involved. Thirdly, there are already some collaborative areas set up, such as techUK, that have been used. Would I personally have liked to have seen more? Yes—but I am realistic about how much time is needed; when you only have a certain resource, some people have got to do some writing and crafting as well as discussing.
One thing that we could look at, if we did the process again, would be more modelling, exercising and testing of the Bill until it shakes out a bit more. With the Telecommunications (Security) Act 2021, that was done at length and collaboratively with industry, on a nearly fortnightly basis, for some time. Beyond that, I think that we are realistic in industry because we understand the pressures on the people trying to bring legislation in. A second point to remember is that we are all volunteers. Carla and I, and all those on the cyber resilience committee, volunteer away from our day jobs—which are busy—to do all this. There is a realistic expectation, if you like—but I would say there has been a best effort.
Carla Baker: I would like to look to the future. We have all the secondary legislation that is coming—and there will be a lot—so we recommend early insight, and time to review and consult, in order to provide the industry insight that we are happy to provide. Let us look to the secondary legislation and hope that there is good consultation there.
The Chair
We will now hear oral evidence from the Minister for AI and Online Safety, Kanishka Narayan. For this session, we have until 5.10 pm.
Q
Kanishka Narayan: Thank you for the question on definitions. I have two things to say on that. First, observing the evidence today, it is interesting that there are views in both directions on pretty much every definitional question. For example, on the definition of “incident thresholds”, I heard an expert witness at the outset of the day say that it is in exactly the right place, precisely because it adds incidents that have the capability to have an impact, even if not a directness of impact, to cover pre-positioning threats. A subsequent witness said that they felt that that precise definitional point made it not a fitting definition. The starting point is that there is a particular intent behind the definitions used in the Bill, and I am looking forward to going through it clause by clause, but I am glad that some of those tensions have been surfaced.
Secondly, in answer to your question on consultation, a number of the particular priority measures in the Bill were also consulted on under the previous Government. We have been engaging with industry and, in the course of implementation, the team has started setting up engagement with regulators and a whole programme of engagement with industry as well.
Q
Kanishka Narayan: I have met a number of companies, but the relevant Minister has also had extensive engagement with both companies and regulators, including on the question of definitions. I do not have a record of her meetings, but if that is of interest, I would be very happy to follow up on it.
Q
Kanishka Narayan: I am referring to the Minister for Digital Economy, who is in the other place.
Q
Kanishka Narayan: I have had some meetings but, as the Minister in charge of this Bill, she has been very engaged with businesses, so I think that is fitting. We have obviously worked very closely together, as we normally do, in the course of co-ordinating across the two Chambers.
Q
Kanishka Narayan: I have spoken to the Secretary of State about the Bill, including the reserve powers, and we have agreed that the policy objective is very clear. I do not think I am in a position to divulge particular details of policy discussions that we have had; I do not think that would be either appropriate or a fitting test of my memory.
Q
Kanishka Narayan: I think the guardrails in the Bill are very important, absolutely. The Bill provides that, where there is an impact on organisations or regulators, there is an appropriate requirement for both deep consultation and an affirmative motion of the House. I think that is exactly where it ought to be, and I do not think anything short of that would be acceptable.
Chris Vince
Q
Kanishka Narayan: The primary thing to say is that the range of organisations—commercial ones as well as those from the cyber-security world more generally—coming out to welcome the Bill is testament to the fact that it is deeply needed. I pay tribute to the fact that some of the provisions were engaged on and consulted on by the prior Government, and there is widespread consensus across industry and in the regulatory and enforcement contexts about the necessity and the quality of the Bill. On that front, I feel we are in a good place.
On specific questions, of course, there is debate—we have heard some of that today—but I am very much looking forward to going through clause by clause to explain why the intent of the Bill is reflected in the particular definitions.