Public Bill Committees
The Chair
Good afternoon. We will now hear oral evidence from Ian Hulme, the interim executive director of regulatory supervision and director of regulatory assurance for the Information Commissioner’s Office; Natalie Black, group director for infrastructure and connectivity for Ofcom; and Stuart Okin, director of cyber regulation and artificial intelligence for Ofgem. We need to stick to the timings in our programme order, so we have until 2.40 pm for this session. Could the witnesses please introduce themselves briefly before we hand over for questions?
Ian Hulme: Good afternoon. My name is Ian Hulme, and I am interim executive director of regulatory supervision at the ICO.
Natalie Black: Good afternoon. I am Natalie Black, and I am group director for infrastructure and connectivity at Ofcom.
Stuart Okin: My name is Stuart Okin; good afternoon. I am the director for cyber regulation and artificial intelligence at Ofgem.
Q
My second question is jointly for Ian and Stuart, from the ICO and Ofgem. Some industry stakeholders have expressed concern about low levels of incident reporting and enforcement under the NIS1—network and information systems—regs. How will your respective approaches to regulation change as a result of this Bill, to ensure that it is implemented and that cyber-resilience is improved across the sectors you are responsible for regulating?
Natalie Black: I will kick off. We have some additional responsibilities, building on the NIS requirements, but the data centre aspect of the Bill is quite a substantial increase in responsibilities for us. It is worth emphasising that we see that as a natural evolution of our responsibilities in the sector. Communications infrastructure is evolving incredibly quickly, as you will be well aware, and data centres are the next big focus. In terms of preparations, we are spending this time getting to know the sector and making sure we have the right relationships in place, so that we do not have a standing start. I have done a number of visits, for example, to hear at first hand from industry representatives about their concerns and how they want to work with us.
We are also focusing on skills and recruitment. We already have substantial cyber-security responsibilities in the communications infrastructure sector. We are building on the credibility of the team, but we are focused on making sure we continue to invest in them. About 60% of the team already come from the private sector. We want that to continue going forward, but we are not naive to how challenging it is to recruit in the cyber-security sector. For example, we are working with colleagues from the National Cyber Security Centre, and looking at universities it is accrediting, to see how we can recruit directly using those kinds of opportunities.
Ian Hulme: On incident reporting, the thresholds in the existing regulations mean that levels are very low. Certainly, the reports we see from identity service providers do not meet those thresholds. I anticipate that we will see more incidents reported to us. With our enhanced regulatory powers and the expanded scope of organisations we will be responsible for, I anticipate that our oversight will deepen and we will have more ability to undertake enforcement activity. Certainly from our perspective, we welcome the enhanced reporting requirements.
Stuart Okin: To pick up on the incident side of things, I agree with Ian. The thresholds will change. With the new legislation, any type of incident that could potentially cause an issue will obviously be reported, whereas that does not happen today under the NIS requirements.
On enforcement, in seven years we have used all the enforcement regimes available to us, including penalties, and we will continue to do so. We absolutely welcome the changes in the Bill to simplify the levels and to bring them up, similar to the sectoral powers that we have today.
Chris Vince (Harlow) (Lab/Co-op)
Q
Stuart Okin: In the energy sector, we tend to use operational technology rather than IT systems. That might mean technology without a screen, so an embedded system. It is therefore important to be able to customise our guidance. We do that today. We use the cyber assessment framework as a baseline, and we have a 335-page overlay on our website to explain how that applies to operational technology in our particular space. It is important to be able to customise accordingly; indeed, we have added physical elements to the cyber assessment framework, which is incredibly important. We welcome that flexibility being maintained in the Bill.
Ian Hulme: Just to contrast with colleagues from Ofcom and Ofgem, ICO’s sector is the whole economy, so it is important that we are able to produce guidance that speaks to all the operators in that sector. Because our sector is much bigger, we currently have something like 550 trust service providers registered, and that will grow significantly with the inclusion of managed service providers. So guidance will be really important to set expectations from a regulatory perspective.
Natalie Black: To round this off, at the end of the day we always have to come back to the problem we are trying to solve, which is ensuring cyber-security and resilience. As you will have heard from many others today, cyber is a threat that is always evolving. The idea that we can have a stagnant approach is for the birds. We need to be flexible as regulators. We need to evolve and adapt to the threat, and to the different operators we will engage with over the next couple of years. Collectively, we all appreciate that flexibility.
Dr Allison Gardner (Stoke-on-Trent South) (Lab)
Q
The ICO is a horizontal regulator working across all sectors. In your experience, would a single cyber regulator be a good idea? What would be the benefits and the challenges? I will allow Ofcom and Ofgem to jump in and defend themselves.
Ian Hulme: I suppose the challenge with having a single regulator is that—like ourselves, as a whole-economy regulator—it will have to prioritise and direct its resources at the issues of highest harm and risk. One benefit of a sectoral approach is that we understand our sectors at a deeper level; we certainly work together quite closely on a whole range of issues, and my teams have been working with Natalie and Stuart’s teams on the Bill over the last 18 months, and thinking about how we can collaborate better and co-ordinate our activities. It is really pleasing to see that that has been recognised in the Bill with the provisions for information sharing. That is going to be key, because the lack of information-sharing provisions in the current regs has been a bit of a hindrance. There are pros and cons, but a single regulator will need to prioritise its resources, so you may not get the coverage you might with a sectoral approach.
Natalie Black: Having worked in this area for quite some time, I would add that the challenge with a single regulator is that you end up with a race to the bottom, and minimum standards you can apply everywhere. However, with a tailored approach, you can recognise the complexity of the cyber risk and the opportunity to target specific issues—for example, prepositioning and ransomware. That said, we absolutely recognise the challenge for operators and companies in having to bounce between regulators. We hear it all the time, and you will see a real commitment from us to do something about it.
Some of that needs to sit with the Department for Science, Innovation and Technology, which is getting a lot of feedback from all of us about how we need it to co-ordinate and make things as easy as possible for companies—many of which are important investors in our economy, and we absolutely recognise that. We are also doing our bit through the UK Regulators Network and the Digital Regulation Cooperation Forum to find the low-hanging fruit where we can make a difference. To give a tangible example, we think there should be a way to do single reporting of incidents. We do not have the answer for that yet, but that is something we are exploring to try and make companies’ lives easier. To be honest, it will make our lives easier as well, because it wastes our time having to co-ordinate across multiple operators.
Bradley Thomas (Bromsgrove) (Con)
Q
Ian Hulme: Again, to contrast the ICO’s position with that of other colleagues, we have a much larger sector, as it currently exists, and we will have a massively larger sector again in the future. We are also funded slightly differently. The ICO is grant in aid funded from Government, so we are dependent on Government support.
To move from a reactive footing, which is our position at the moment—that is the Government’s guidance to competent authorities and to the ICO specifically—to a proactive footing with a much-expanded sector will need a significant uplift in our skills and capability, as well as system development in order to register and ingest intelligence from MSPs and relevant digital service providers in the future.
From our perspective at the ICO, we need significant support from DSIT so that we can transition into the new regulatory regime. It will ultimately be self-funding—it is a sustainable model—but we need continued support during the transition period.
Bradley Thomas
Q
Ian Hulme: At the moment, to give you a few broad numbers, our teams are around 15 people, and we anticipate doubling that. In the future, with self-funding, we will be a bit more in control of our own destiny. It is a significant uplift from our perspective.
Natalie Black: The challenge is that the devil is in the detail. Until that detail has been worked through in secondary legislation, we will have to reserve our position, so that we give you accurate numbers in due course. From Ofcom’s point of view, it is about adding tens rather than significant numbers. I do not think we are that far off the ICO.
But I want to emphasise that this is about quality, not necessarily quantity. Companies want to work with expert regulators who really know what they are doing. Ofcom is building on the work we are already doing under the Telecommunications (Security) Act 2021. It will be a question of reinforcing that team, rather than setting up a separate one. We want to get the best, high-quality individuals who know how to talk to industry and really know cyber-security, to make sure people have a good experience when engaging with us.
Ian Hulme: To add to that, the one challenge we will face as a group is that we are all fishing in the same pond for skills. MSPs and others will also be fishing in that pond from the sector side. There needs to be recognition that there is going to be a skills challenge in this implementation.
Stuart Okin: To specifically pick up on the numbers, we have a headcount of 43 who are dedicated to cyber regulation. That also includes the investment side. We also have access to the engineering team—the engineering directorate—which is a separate team. There is also our enforcement directorate, as well as the legal side of things. The scope changes proposed in the Bill are just the large load controllers and supply chain, so we are not expecting a major uplift. These will be small numbers in comparison. Unlike my colleagues, we are not expecting a big uplift in resourcing.
Tim Roca (Macclesfield) (Lab)
Q
Ian Hulme: There are two angles to that. From a purely planning and preparation perspective, it is incredibly difficult, without having seen the detail, to know precisely what is expected of MSPs and IDSPs in the future, and therefore what the regulatory activity will be. That is why, when I am answering questions for colleagues, it is difficult to be precise about those numbers.
Equally, we are hearing from industry that it wants that precision as well. What is the expectation on it regarding incident reporting? What does “significant impact” mean? Similarly, with the designation of critical suppliers, precision is needed around the definitions. From a regulatory perspective, without that precision, we will probably find ourselves in a series of potential cases arguing about the definition of an issue. To give an example, if the definition of MSP is vague, and we are saying to an MSP that we think it is in scope, and it is saying, “No, we are not,” then a lot of our time and attention will be taken up with those types of arguments and disputes. Precision will be key for us.
Tim Roca
Q
Ian Hulme: There is a balance to be struck. When something is written on the face of the Bill and circumstances change—and we know that this is a fast-moving sector—it is incredibly difficult to amend. There is a balance to be struck between primary and secondary, but what we are hearing and saying is that more precision around some of the definitions will be critical.
Natalie Black: I strongly agree with Ian. A regulator is only as good as the rules that it enforces. If you want us to hold the companies to account, we need to be absolutely clear on what you are asking us to do. The balance is just about right in terms of primary and secondary, particularly because the secondary vehicle gives us the opportunity to ensure that there is a lot of consultation. The Committee will have heard throughout the day—as we do all the time from industry—that that is what industry is looking for. They are looking for periods of business adjustment—we hear that loud and clear—and they really want to be involved in the consultation period. We also want to be involved in looking at what we need to take from the secondary legislation into codes of practice and guidance.
Q
Natalie Black: That is a great question, and I am not at all surprised that you have asked it, given everything that is going on at the moment. As well as being group director for infrastructure and connectivity, I am also the executive member of the board, sitting alongside our chief executive officer, so from first-hand experience I can say that Ofcom really recognises how fast technology is changing. I do not think there is another sector that is really at the forefront of change in this way, apart from the communications sector. There are a lot of benefits to being able to sit across all that, because many of the stakeholders and issues are the same, and our organisation is learning to evolve and adapt very quickly with the pace of change. That is why the Bill feels very much like a natural evolution of our responsibility in the security and resilience space.
We already have substantial responsibilities under NIS and the Telecommunications (Security) Act 2021. We are taking on these additional responsibilities, particularly over data centres, but we already know some of the actors and issues. We are using our international team to understand the dynamics that are affecting the Online Safety Act, which will potentially materialise in the security and resilience world. As a collective leadership team, we look across these issues together. The real value comes from joining the dots. In the current environment, that is where you can make a real difference.
Q
Natalie Black: That is definitely not what I am saying. You can cut the cake in many different ways. From where I sit—from my experience to date—you need specific sector regulators because you need regulators that understand the business dynamics, the commercial dynamics, the people dynamics and the issues on a day-to-day basis.
We have many people who have worked at Ofcom for a very long time, and who know the history and have seen these issues before. When it comes to threats, which is ultimately what we are dealing with—cyber-security is a threat—it is cross-cutting. It adapts, evolves and impacts in different ways. The knack is having a sector regulator that really understands what is going on. That means that when you are dealing with cyber-incidents, you understand the impact on real people and businesses, and ultimately you can do something more quickly about it.
Q
Stuart Okin: We have a clear understanding of the responsibilities within Ofgem. We are the joint competent authority with the Department for Energy Security and Net Zero. The Department does the designation and incident handling, and we do all the rest of the operations, including monitoring, enforcement and inspections. We understand our remit with NCSC. GCHQ is part of the cyber-security incident response team; it is ultimately responsible there.
Going back to your main concern, we are part of an ecosystem. We have to understand where our lines are drawn, where NCSC’s responsibilities are and what the jobs are. To go back to us specifically, we can talk about engineering aspects, electrical engineering, gas engineering and the cyber elements that affect that, including technology resilience—not cyber. As long as we have clear gateways and communication between each other—and I think that the Bill provides those gateways—that will also assist, but there are clear lines of responsibility.
Natalie Black: It is clear that there is work to do to get in the same place for the Bill. Exactly as Stuart said, the information gateways will make a massive difference. It is too hard, at the moment, to share information between us and with the National Cyber Security Centre. The fact that companies will have to report within 24 hours not only to us but to the NCSC is very welcome.
To return to my earlier point, we think that there is a bit of work for DSIT to do to help to co-ordinate this quite complicated landscape, and I think that industry would really welcome that.
Ian Hulme: I agree with colleagues. From an ICO perspective, we see our responsibilities as a NIS competent authority as complementary to our role as a data protection regulator. If you want secure data, you have to have secure and resilient networks, which are obviously used to process data. We see it as a complementary set of regulations to our function as a data protection regulator.
David Chadwick (Brecon, Radnor and Cwm Tawe) (LD)
Q
It strikes me that, if one of the things that this legislation is to guard against is pre-positioning, and there are 14 parallel reporting systems in place, it could be the case that those pre-positioning attacks are not picked up as co-ordinated attacks from another nation state or organisation, because they are not pulled together in time.
Natalie Black: I point to my earlier remarks about information sharing. You are right: that is one of the great benefits of the Bill. To be able to do more, particularly when it comes to pre-positioning attacks, is really important. You will have heard from the NCSC, among others, that that is certainly a threat that we are seeing more and more of.
At the moment, it is too difficult to share information between us. The requirement to have an annual report to the NCSC is a good mechanism for consolidating what we are all seeing, and then for the NCSC to play the role of drawing conclusions. It is worth emphasising that Ofcom is not an operational organisation; we are a regulator. We look to the NCSC to provide threat leadership for what is going on across the piece. I think that that answers your question about where it all comes together.
Stuart Okin: I fully support that. The NCSC will be the hub for that type of threat intel and communications, in terms of risks such as pre-positioning and other areas. The gateways will help us to communicate.
Ian Hulme: Bringing it back to the practicalities of incident reporting, you said that there are potentially 14 lines of incident reporting because there are 14 competent authorities. How that can be consolidated is something to be explored. Put yourself in the position of an organisation that is having to make a report: there needs to be clarity on where it has to make it to and what it needs to report.
David Chadwick
Q
Ian Hulme: As we have already explained, the current regs do not allow us to share the information, which is a bit of a barrier for us. In the future, certainly, we will be working together to try to figure it out. I think that there is also a role for DSIT in that.
Natalie Black: First, we currently have a real problem in that information sharing is much harder than it should be. The Bill makes a big difference in addressing that point, not only among ourselves but with DSIT and NCSC. Secondly, we think that there is an opportunity to improve information reporting, particularly incident reporting, and we would welcome working with DSIT and others—I have mentioned the Digital Regulation Cooperation Forum—to help us find a way to make it easier for industry, because the pace at which we need to move means that we want to ensure that there is no unnecessary rub in the system.
Emily Darlington (Milton Keynes Central) (Lab)
Q
Ian Hulme: We need to think about this as essentially two different regimes. The requirements under data protection legislation to report a data breach are well established, and we have teams, systems and processes that manage all that. There are some notable cases that have been in the public domain in recent months where we have levied fines against organisations for data breaches.
The first thing to realise is that we are still talking about only quite a small sub-sector—digital service providers, including cloud computing service providers, online marketplaces, search engines and, when they are eventually brought into scope, MSPs. A lot of MSPs will provide services for a lot of data controllers so, as I explained, if you have the resilience and security of information networks, that should help to make data more secure in the future.
Lincoln Jopp (Spelthorne) (Con)
Q
I have dealt with the ICO before. Maybe it was the company that I worked in and led, but there was a culture there that, if you had a data breach, you told the ICO. There was no question about it. How are you going to develop your reactions and the behaviours you reward in order to encourage a set of behaviours and cultures of openness within the corporate sector, bearing in mind that, as was said this morning, by opening that door, companies could be opening themselves up to a hefty fine?
Stuart Okin: In the energy sector, we have that culture. It is one of safety and security, and the chief executives and the heads of security really lean into it and understand that particular space. There are many different forums where they communicate and share that type of information with each other and with us. Incident response is really the purview of DESNZ rather than us, but they will speak to us about that from a regulatory perspective.
Ian Hulme: From the ICO’s perspective, we receive hundreds of data-breach reports. The vast majority of those are dealt with through information and guidance to the impacted organisation. It is only a very small number that go through to enforcement activity, and it is in only the most egregious cases—where failures are so egregious that, from a regulatory perspective, it would be a failure on our part not to take action.
I anticipate that is the approach we will take in the future when dealing with the incident reporting regime that the Bill sets out. Our first instinct would be to collaborate with organisations. Only in the most egregious cases would I imagine that we would look to exercise the full range of our powers.
Natalie Black: From Ofcom’s point of view, we have a long history, particularly in the telecoms sector, of dealing with a whole range of incidents, but I certainly hear your point about the victim. When I have personally dealt with some of these incidents, often you are dealing with a chief executive who has woken up that morning to the fact that they might lose their job and they have very stressed-out teams around them. It is always hard to trust the initial information that is coming out because no one really knows what is going on, certainly for the first few hours, so it is the maturity and experience that we would want to bring to this expanded role when it comes to data centres.
Ultimately, the best regulatory relationships I have seen are where there is a lot of trust and openness that a regulator is not going to overreact. They are really going to understand what is going on and are very purposeful about what they are trying to achieve. From Ofcom’s point of view, it is always about protecting consumers and citizens, particularly with one eye on security, resilience and economic growth. The experience we have had over the years means that we can come to those conversations with a lot of history, a lot of perspective and, to be honest, a bit of sympathy, because sometimes those moments are very difficult for everyone involved.
The Chair
We have only five minutes left for this session, so if we can have concise questions and answers we might get everyone in.
Sarah Russell (Congleton) (Lab)
Q
Stuart Okin: Essentially, we would not go all the way down the supply chain. First, the operators of essential services are defined very much by the thresholds. Ultimately, they are the first point of responsibility. On the critical third party suppliers that have been brought in by the Bill, there will be a small number of those that, for energy, are systemically important for the entire UK system, not the smaller entities. So we will hold those to account. On the enforcement side of things, if and when it comes to that, they will be in the same situation as the current operators of essential services are today. We welcome the simplification in the Bill and bringing those into the same sectoral powers and the same types of fines that we see today. It will not go down to that level of minutiae. Again, the secondary legislation gives you the ability to define that.
Natalie Black: To keep it brief, we welcome the supply chain being brought into scope because we are all well aware that the most high-profile recent incidents often emanated from the supply chain. That said, we should be very honest about the complexity of entering this space, exactly for all the points that you have alluded to in terms of volume and scale and everything. We are already using this time to work through what our methodology will be. Engaging with the operators of essential services who are ultimately the customer of these suppliers has to be a starting point in terms of who they are most worried about in their supply chain. As Stuart says, you will see some commonality across all our sectors, so the numbers might not be as big as we might at first think, but this is what we need to work through over the coming months.
Ian Hulme: From an ICO perspective, one of the big tasks that we are going to have in understanding the MSP market is what their supply chains look like. We are perhaps a little behind colleagues in other regulators because of the difference in the regulatory regime, but that is one of the tasks that we will have to get to grips with.
Freddie van Mierlo (Henley and Thame) (LD)
Q
Ian Hulme: Certainly from an ICO perspective, many IDSPs that we currently regulate are operating across boundaries. From our perspective, the focus is on the outcome. If they have operations in other jurisdictions that are providing services into the UK, our focus is on the outcome and getting to understand the UK side of things more than anything else.
Natalie Black: This is a challenge for us every day. Many of the companies that we regulate have a footprint in the UK or multiple footprints around the world. The issue is in making sure that the UK requirements are as clear as possible to give them no excuse to argue exceptionalism. That is why we really welcome the opportunity to get into the detail through secondary legislation, which will be very important in holding all the companies to account that we think need to be held to account.
The Chair
That brings us to the end of the allotted time for the Committee to ask questions. On behalf of the Committee, I thank our witnesses for their evidence.
Examination of Witness
Chung Ching Kwong gave evidence.
The Chair
We will now hear oral evidence from Chung Ching Kwong, senior analyst for the Inter-Parliamentary Alliance on China. We have until 3 pm for this session.
Chris Vince
Q
Chung Ching Kwong: Just to give some background, I am a senior analyst for the Inter-Parliamentary Alliance on China, and a PhD candidate in law at the University of Hamburg, focusing on data protection and data transfer. My expertise is not entirely on critical infrastructure security, but I do a lot of analysis on China’s legal system and also how it works in general. That is how I can contribute to this evidence session.
The threat posed by the CCP to our critical national infrastructure, such as water, energy and transportation, has shifted from espionage—stealing secrets—to pre-positioning, or preparing for sabotage. We cannot understand the threat without understanding the civil-military fusion of the Chinese state. Chinese companies operating in our CNI are not independent per se, in the way we would normally think about that in our country—in other words, private entities that operate on their own and have their own decision-making mechanisms. They are legally obligated under at least article 7 of China’s national intelligence law to co-operate with the state, to provide information, to provide help with decryption and to gather information at the request of the Government.
As highlighted by the NCSC, groups such as Volt Typhoon are pre-positioning within utility networks in the States. They do not use malware; they live off the land, using legitimate administrative credentials to proceed undetected for years. That is not for financial gain; they do it until the time is right for them to pull the trigger and cause a crisis.
In the transportation sector, there are a lot of cellular IoT modules embedded in e-buses and EVs. These devices require constant communication with servers in China to function, so they are constantly feeding data back to China for maintenance, remote access of data and that kind of thing. It could all be innocent and a feature for operational and functional purposes, but if—and only if—Beijing orders that data to be handed over and actions to be taken, it will become a problem.
That is the context of the risk we are facing when it comes to China, especially in terms of state-sponsored attacks. All entities, be they foreign companies in China or local Chinese-founded companies, have an obligation under Chinese law.
Chris Vince
Q
Chung Ching Kwong: Gathering information and data is definitely one of the main goals, but it is not limited to data transfer. Right now, in the UK, they do not need to rely only on access to critical infrastructure; under the Data Protection Act here in the UK, it is legal to transfer personal data through contractual clauses, so they can have access to personal data as long as they have that.
Of course, gathering data gives them insight into what is happening in the UK; if they want transportation data or power grid data, they can gather those data by different means. But it is also very important to understand Xi Jinping’s comprehensive national security concept. I think this is the reason why they are so determined to collect information, not only in the UK but worldwide.
In that kind of comprehensive security concept, political security, defined as the survival of the regime, is paramount. It overrides everything else—not economic gain, not whether or not China’s GDP is going to grow in the next year; any information or action that they see as necessary to make sure that the CCP is in control takes priority. That means gathering data on dissidents overseas, gathering data on the power grid and gathering data on transportation—anything they might find useful for a different purpose, which is, ultimately, to serve the goal of the survival of the regime.
Q
Secondly, it has been reported recently that communications of senior Government aides were hacked by Chinese state affiliates between 2021 and 2024. In view of that threat to telecoms networks, what are the potential cyber-risks to communications infrastructure that you see arising from the intended location of China’s super-embassy in the City of London?
Chung Ching Kwong: On the first question, about what can be done to help sectors understand the risks, education is paramount. At this point, we do not have a comprehensive understanding of what kind of risks state actors like China pose. We are very used to the idea that private entities are private entities, because that is how the UK system works; we do not recognise that organisations, entities or companies associated with China or the Chinese state are not the independent actors we would expect, or want, them to be.
There is a lot of awareness-raising to be done and guidance to be issued around how to deal with these actors. There is a lot of scholarly work that says that every part of Chinese society—overseas companies and so on—is a node of intelligence collection within the system of the CCP. Those things are very important when it comes to educating.
Also, the burden of identifying what is a national security risk and what is not should not be put on small and medium-sized businesses, or even big companies, because they are not trained to understand what the risks are. If you are not someone specialising in the PLA and a lot of other things academically, it would be very difficult to have to deal with those things on a day-to-day basis and identify, “That’s a threat, and that’s a threat.”
Sorry, what was the second question?
Q
Chung Ching Kwong: There is not a lot of publicly available information on the sensitive cabling that is around the area, so I cannot confidently say what is really going to happen if they start to build the embassy and have such close contact with those cables. The limit of this Bill when it comes to the Chinese embassy is that it cannot mitigate the risks that are posed by this mega-embassy in the centre of London, because it regulates operators and not neighbours or any random building in the City. If the embassy uses passive interception technology to harvest data from local wi-fi or cellular networks, no UK water or energy company is breached. There is no breach if they are only pre-positioning there to collect information, instead of actually cutting off the cables, so when they do cut off the cables, it will be too late. There will be no report filed under the Bill, even if it is under the scope of the Bill when it comes to regulation. The threat in this case is environmental and really bypasses the Bill’s regulatory scope.
Dave Robertson (Lichfield) (Lab)
Q
Chung Ching Kwong: I think that to a certain extent they will. For hackers or malicious actors aiming for financial gain with more traditional hacking methods, it will definitely do a job in protecting our national security. But the Bill currently views resilience through an IT lens. It is viewing this kind of regulatory framework as a market regulatory tool, instead of something designed to address threats posed by state-sponsored actors. It works for cyber-criminals, but it does not work for state actors such as China, which possess structural leverage over our infrastructure.
As I said before, we have to understand that Chinese vendors are legally obliged to compromise once they are required to. The fine under the Bill is scary, but not as scary as having your existence threatened in China—whether you still have access to that market or you can still exist as a business there. It is not doing the job to address state-sponsored hackers, but it really does help when it comes to traditional hacking, such as phishing attempts, malware and those kinds of things.
Bradley Thomas
Q
Chung Ching Kwong: The US is probably a good example. It issued Executive Order 14028 in May 2021, which requires any software vendor selling to the US federal Government to provide something called a software bill of materials—SBOM. That is technically a table of ingredients, but for software, so you can see exactly what components the software is made of. A lot of the time, people who code are quite lazy; they will pull in different components that are available on databases online to form a piece of software that we use. By having vendors provide an SBOM, when anything happens, or whenever any kind of vulnerability is detected, you can very easily find out what happened.
That is due to a hack in 2021, in which a tiny, free piece of code called Log4j was found to have a critical vulnerability. It was buried inside thousands of commercial software products. Without that list of ingredients, it would be very difficult for people who had been using the software to find out, because, first, they may not have the technological capabilities and, secondly, they would not even know if their software had that component. This is one of the things the US is doing to mitigate the risks when it comes to software.
Something that is not entirely in the scope of the Bill but is also worth considering is the US’s Uyghur Forced Labour Prevention Act. That is designed to prevent goods made with forced labour from entering the supply chain. The logic of preventing forced labour is probably something that the UK can consider. Because the US realised that it could not inspect every factory in Xinjiang to prove forced labour, it flipped the script: the law creates a rebuttable presumption that all goods from that region are tainted, so the burden of proof is now on the importer to prove, with clear and convincing evidence, that their supply chain is clean.
A similar logic could be considered when it comes to this Bill to protect cyber-security. Any entities that are co-operating with the PLA—the People’s Liberation Army—for example, should be considered as compromised or non-trustworthy until proven otherwise. That way, you are not waiting until problems happen, when you realise, “Oh, this is actually tainted,” but you prevent it before it happens. That is the comparison that I would make.
Tim Roca
Q
Thank you for speaking to us today. May I turn the conversation a little on its head? We have been talking about national security and the threat from China and others. You were an activist in Hong Kong and made a great deal of effort to fight the Chinese Communist party’s invasion of privacy—privacy violations using the national security law—and other things. Do you see any risk in this legislation as regards civil liberties and privacy? We have had a bit of discussion about how much will go into secondary legislation and how broad the Secretary of State’s powers might be.
Chung Ching Kwong: The threat to privacy, especially to my community—the Hong Kong diaspora community in this country—will be in the fact that, under clause 9, we will be allowing remote access for maintenance, patches, updates and so on. If we are dealing with Chinese vendors and Chinese providers, we will have to allow, under the Bill, certain kinds of remote access for those firms to maintain the operation of the software of different infrastructures. As a Hongkonger, I would be worried, because I do not know what kind of tier 2 or tier 3 supplier will have access to all those data, and whether or not they will be transmitted back to China or get into the wrong hands. It will be a worry that our data might fall into the wrong hands. Even though we are not talking specifically about personal data, personal data is definitely in scope. Especially for people with bounties on their heads, I imagine that it will be a huge worry that there might be more legitimate access to data than there is right now under the Data Protection Act.
Tim Roca
Q
Chung Ching Kwong: It is always a double-edged sword when it comes to regulating against threats. The more that the Secretary of State or the Government are allowed to go into systems and hold powers to turn off, or take over, certain things, the more there is a risk that those powers will be abused, to a certain extent, or cause harm unintentionally. There is always a balance to be struck between giving more protection to privacy for ordinary users and giving power to the Government so that they can act. Obviously, for critical infrastructure like the power grid and water, the Government need control over those things, but for communications and so on, there is, to a certain extent, a question about what the Government can and cannot do. But personally I do not see a lot of concerns in the Bill.
Emily Darlington
Q
Chung Ching Kwong: It should definitely be covered by the Bill, because if we are not regulating to protect hardware as well, we will get hardware that is already embedded with, for example, an opcode attack. Examples in the context of China include the Lenovo Superfish scandal in 2015, in which pre-installed ad software hijacked the HTTPS certificate, which is there to protect your communication with a website so that nobody sees what activity is happening between you and the website. Having that Superfish injection made that communication transparent. That was done before the product even came out of the factory. This is not a problem that a software solution can fix. If you were sourcing a Lenovo laptop, for example, the laptop, upon arrival, would be a security breach, and a privacy breach in that sense. We should definitely take it a step further and regulate hardware as well, because a lot of the time that is what state-sponsored attacks target as an attack surface.
The Chair
That brings us nicely to the end of the time allotted for the Committee to ask questions. On behalf of the Committee, I thank our witness for her evidence.
Examination of Witness
Professor John Child gave evidence.
The Chair
We will now hear evidence from Professor John Child, professor of criminal law at the University of Birmingham and co-founding director of the Criminal Law Reform Now Network. For this session, we have until 3.20 pm.
Q
Professor John Child: My specialism is in criminal law, so this is a bit of a side-step from a number of the pieces of evidence you have heard so far. Indeed, when it comes to the Bill, I will focus on—and the group I work for focuses on—the potential in complementary pieces of legislation, and particularly the Computer Misuse Act 1990, for criminalisation and the role of criminalisation in this field.
I think that speaks directly to the first question, on effective collaboration. It is important to recognise in this field, where you have hostile actors and threats, that you have a process of potential criminalisation, which is obviously designed to be effective as a barrier. But the reality is that, where you have threats that are difficult to identify and mostly originating overseas, the actual potential for criminalisation and criminal prosecution is slight, and that is borne out in the statistics. The best way of protecting against threats is therefore very much through the use of our cyber-security expertise within the jurisdiction.
When we think about pure numbers, and the 70,000-odd private cyber-security experts compared with a matter of hundreds in the public sector, police and others, better collaboration is absolutely vital for effective resilience in the system. Yet what you have at the moment is a piece of legislation, the Computer Misuse Act, that—perfectly sensibly for 1990—went with a protective, across-the-board criminalisation approach, whereby any unauthorised access becomes a criminal offence, without mechanisms to recognise a role for the private sector, because essentially there was not a private sector doing this kind of work at the time.
When we think about potential collaboration, first and foremost for me—from a criminal law perspective—we should make sure we are not criminalising effective cyber-security. The reality is that, when we look at the current system, if any unauthorised access of any kind becomes a criminal offence, you are routinely criminalising engagement in legitimate cyber-security, which is a matter of course across the board. If you are encouraging those cyber-security experts to step back from those kinds of practices—which may make good sense—you are also lessening that level of protection and/or outsourcing to other jurisdictions or other cyber-security firms, with which you do not necessarily have that effective co-operation, reporting and so on. That is my perspective. Yes, you are absolutely right, but we now have mechanisms in place that actively disincentivise that close collaboration and professionalisation.
Sarah Russell
Q
Professor John Child: Yes. It is not the easiest criminal law tale, if you like. If there were a problem of overcriminalisation in the sense of prosecutions, penalisation, high sentences and so on, the solution would be to look at a whole range of options, including prosecutorial discretion, sentencing or whatever it might be, to try to solve that problem. That is not the problem under the status quo. The current problem is purely the original point of criminalisation. Think of an industry carrying out potentially criminalised activity. Even if no one is going to be prosecuted, the chilling effect is that either the work is not done or it is done under the veil of potential criminalisation, which leads to pretty obvious problems in terms of insurance for that kind of industry, the professionalisation of the industry and making sure that reporting mechanisms are accurate.
We have sat through many meetings with the CPS and those within the cyber-security industry who say that the channels of communication—that back and forth of reporting—are vital. However, a necessary step before that communication can happen is the decriminalisation of basic practices. No industry can effectively be told on the one hand, “What you are doing is vital,” but on the other, “It is a criminal offence, and we would like you to document it and report it to us in an itemised fashion over a period of time.” It is just not a realistic relationship to engender.
The cyber-security industry has evolved in a fragmented way both nationally and internationally, and the only way to get those professionalisation and cyber-resilience pay-offs is by recognising that the criminal law is a barrier—not because it is prosecuting or sentencing, but because of its very existence. It does not allow individuals to say, “If, heaven forbid, I were prosecuted, I can explain that what I was doing was nationally important. That is the basis on which I should not be convicted, not because of the good will of a prosecutor.”
Dr Gardner
Q
Professor John Child: I think the Bill does a lot of things quite effectively. It modernises in a sensible way and it allows for the recognition of change in the type of threat. This goes back to my criminalisation point. Crucially, it also allows modernisation and flexibility to move through into secondary legislation, rather than us relying purely on the maturation of primary legislation.
In terms of board-level responsibility, I cannot speak too authoritatively on the civil law aspects, but drawing on my criminal law background, there is something in that as well. At the moment, the potential for criminalisation applies very much to those making unauthorised access to another person’s system. That is the way the criminal law works. We also have potential for corporate liability that can lead all the way up to board rooms, but only if you have a directing mind—so only if a board member is directing that specific activity, which is unlikely, apart from in very small companies.
You can have a legal regime that says, whether through accreditation or simple public interest offences, that there are certain activities that involve unauthorised access to another person’s system, which may be legitimate or indeed necessary. However, we want a professional culture within that; we do not want that outsourced to individuals around the world. You can then build in sensible corporate liability based on consent or connivance, which goes to individuals in the boardroom, or a failure-to-prevent model of criminalisation, which is more popular when it comes to financial crimes. That is where you say, “If this exists in your sector, as an industry and as a company, you can be potentially liable as an entity if you do not make sure these powers are used responsibly, and if you essentially outsource to individuals in order to avoid personal liabilities”.
Dr Gardner
Q
Professor John Child: Again, I have to draw back to the criminal law aspects. I think the Bill does the things it needs to do well; certainly, from the conversations I have had with those in cyber-security and so on, these are welcome steps in the right direction.
However, when you look at critical national infrastructure, although you can create layers of civil responsibility and regulation—which is entirely sensible—most of that will filter down to individuals doing cyber-security and resilience work. It is about empowering those individuals; within a state apparatus, that is one thing, but even with regulators and in-house cyber-security experts, individuals are working only within the confines of what they are allowed to do under the criminal law, as well as the civil regulatory system.
The reason I have been asked here, and what a lot of my work has focused on, is this: if you filter responsibility down to individuals doing security work for national as well as commercial infrastructure, you need to empower them to do that work effectively. The current law does not do that; it creates the problem of either doing that work under the veil of criminalisation, or not doing it, with work being outsourced to places where you do not have the back-and-forth communication and reporting regime you would need.
Dr Gardner
I think you are touching on the old problem of where liability lies when you have this long supply chain of diffused responsibility, but thank you.
Dave Robertson
Q
Professor John Child: That is a good question. It is certainly fair to say that all jurisdictions are somewhat in flux about how to deal with cyber threats, which are mushrooming in ways people would not have expected—certainly not in 1990, but even many years after.
The various international conventions—the OECD, the Budapest convention and so on—require regulation and criminalisation, but those are not nearly as wide as the blanket approach that was taken in this country. Some comparative civil law jurisdictions in the rest of Europe start from a slightly different place, in that they did not necessarily take the maximalist approach to criminalisation we did.
In a number of jurisdictions, you do not have direct criminalisation of all activities, regardless of the intention of the actor, in the same way that we do. So we are starting from a slightly different position. Having said that, we do see a number of jurisdictions making positive strides in this direction, because they need to; indeed, we see that at European Union level as well, where directives are being created to target this area of concern.
There are a few examples. We wrote a comparative report, incidentally, which is openly available. In terms of some highlights from that, there is a provision in French law, for example, where, despite mandatory prosecution being the general model within French criminal law, there is a carve-out relating to cyber-security and legitimate actors, where there is not the same requirement to prosecute. In the Netherlands, there was a scandal around hacking of keycards for public transport. That was done for responsible reasons, and there was a backlash in relation to prosecution there. There were measures taken in terms of prosecutorial discretion. Most recently, in Portugal, we saw a specific cyber-security defence created within the criminal law just last year.
In the US, it varies between states. In a lot of states, you have quite an unhelpful debate between minimalist and maximalist positions, where they either want to have complete hack-back on the one hand or no action at all on the other, but you have a slightly more tolerant regime in terms of prosecution.
So there are varying degrees, but certainly that is the direction of travel. For sensible, criminal law reasons that I would speak to, as well as the commercial benefits that come with a sector that is allowed to do its work properly, and the security benefits, that is certainly the direction of travel.
Dave Robertson
Q
Professor John Child: Yes. As I understand it, it does. This is part of the reason, incidentally, why my organisation, which focuses very much on criminal law aspects, ended up doing some collaborative work with the CyberUp campaign. That is because, from the industry perspective, they can do that kind of business modelling in a way that we do not. Whereas we can make the case for sensible criminal law reform, they can talk about how that reform translates into both the security environment and the commercial environment. Their perspective on this is, first, that we can see that there is already outsourcing of these kinds of services, particularly to the US, Israel and other more permissive jurisdictions. That is simply because, if you are a cyber-security expert in one of those jurisdictions, you are freer to do the work companies would like you to do to make sure their systems are safe here.
There are also the sectoral surveys and so on, and the predictions about what it is likely to do to the profession if you allow it to do these kinds of services in this jurisdiction. That is about the security benefits, but they are also talking about something like a 10% increase in the likely projection of what cyber-security looks like in this jurisdiction—personnel, GDP and so on.
Q
Professor John Child: There are obviously a number. It is always more comfortable when you have a beginning point of criminalisation. The argument to decriminalise in an environment where you want to protect against threats is sometimes a slightly unintuitive sell. Is the criminalisation that we have doing the necessary work in terms of actually fighting the threats? To some extent, yes, but it is limited. Is it doing harms? There is an argument to say that it is doing harms.
This comes back to the point that was made earlier, which was perfectly sensible. When you speak to the CPS and others, their position as prosecutors is to say, “Very few people are being prosecuted, and we certainly don’t want to be prosecuting legitimate cyber-security experts, so there is no problem.” Admittedly, that means there is no problem in terms of actual criminalisation and prosecution, but that is the wrong problem. If you focus on the problem being the chilling effect of the existence of the criminalisation in the first place, you simply cannot solve that through prosecutorial discretion, and nor should you, when it comes to identifying what a wrong is that deserves to be criminalised. You certainly cannot resolve it through sentencing provisions.
The only way that you can sensibly resolve this is either by changing the offence—that is very difficult, not least because, from a position of criminalisation, it might be where other civil jurisdictions begin—or by way of defence, which realistically is the best solve from the point we are at now. If you have a defence that can be specifically tailored for cyber-security and legitimate actors, you can build in reverse burdens of proof. You can build in objective standards of what is required in terms of public interest.
The point here is that the worry is one of bad actors taking advantage. The reality is that that is very unlikely. The idea that the bad actors we identify within the system would be able to demonstrate how they are acting in the public interest is almost ridiculous. Indeed, the prospect of better threat intelligence, better security and so on provides more information and better information-sharing to the NCSC and others and actually leads to more potential for prosecution of nefarious actors rather than less.
It is a more complicated story than we might like in terms of a standard case for changing the criminal law, but it is nevertheless an important one.
The Chair
That brings us to the end of the time allotted to ask questions. On behalf of the Committee, I thank our witness for his evidence. We move on to our next panel.
Examination of Witness
Detective Chief Superintendent Andrew Gould gave evidence.
The Chair
We will now hear oral evidence from Detective Chief Superintendent Andrew Gould, programme lead for the National Police Chiefs’ Council cyber-crime programme. For this session, we have until 3.40 pm. I call Dr Ben Spencer.
Q
Secondly, on ransomware attacks, you will know that the Government review states that ransomware is
“the greatest of all serious and organised cybercrime threats”.
In your view, what is the scale of that threat and what sectors and businesses are the primary targets?
DCS Andrew Gould: To take the actors first, they are probably quite well known, in terms of the general groupings. Yes, we have our state actors—the traditional adversaries that we regularly talk about—and they generally offer very much a higher-end capability, as you will all be aware.
The next biggest threat group is organised crime groups. You see a real diversity of capability within that. You will see some that are highly capable, often from foreign jurisdictions—Russian jurisdictions or Russian-speaking. The malware developers often provide the more sophisticated as-a-service-type offerings. We see more and more ransomware and other crime types almost operating as franchises—"Here is the capability, off you go, give us a cut." Then they have less control over how those capabilities are used, so we are seeing a real diversification of the threat, particularly when it comes to ransomware.
Then, where you have that proximity to state-directed, if not quite state-controlled, activity—that crossover between some of those high-end crime groups and the state; I am thinking primarily of Russia—it is a lot harder to attribute the intent behind an attack. There is a blurring of who it was and for what purpose it was done, and there is that element of deniability because it is that one further step away.
Moving back down the levels of the organised crime groups, you have a real profusion of less capable actors within that space, from all around the world, driving huge volumes, often using quite sophisticated tools but not really understanding how they work.
What we have seen is almost like a fragmentation in the criminal marketplace. The barrier to criminal entry is probably lower than it has ever been. You can download these capabilities quite readily—you can watch a tutorial on YouTube or anywhere else on how to use them, and off you go, even if you do not necessarily understand the impact. We certainly saw a real shift post pandemic from traditional criminals and crime groups into more online crime, because it was easier and less risky.
You look more broadly at hacktivists, terrorists—who are probably a lot less capable; they might have the intent but not so much the capability—and then the group that are sometimes slightly patronisingly described as script kiddies. These are young individuals with a real interest in developing their skills. They have an understanding that what they are doing is wrong, but they are probably not financially or criminally motivated. If they were not engaging in that kind of cyber-crime, they probably would not be engaging in other forms of criminality, but they can still do a lot of damage with the tools they can get their hands on, given that so many organisations seem to struggle to deliver even a basic level of cyber-resilience and cyber-security.
One of the things that we have really noticed changing over the last 18 months is the diversification of UK threats. Your traditional UK cyber-criminal, if there is such a thing, is primarily focused on hacking for personal benefit, ransomware and other activity. Now we are seeing a diversification, and more of a hybrid, cross-organised crime threat. There are often two factors to that. We often hear it described in the media, or publicly by us within law enforcement, as the Com threat—this emerging online community—otherwise known as Scattered Spider.
There, we are seeing two elements to those sorts of groups. You see an element of maybe more traditional cyber-skills engaged in hacking or using those skills for fraud, but we also see those skills being used for Computer Misuse Act offences, in order to enable other offences. One of the big areas we see for that at the moment is intimate image abuse. We see more and more UK-based criminals hacking individuals' devices to access, they hope, intimate images. They then identify the subject of those intimate images, predominantly women, and engage in acts of extortion, bullying or harassment. We have seen some instances of real-world contact away from that online contact.
Think of the scale of that and the challenge that presents to policing. I can think of cases in cyber-crime unit investigations across the country where you have got a handful of individuals who have victimised thousands of women in the UK and abroad. You have got these small cyber-crime units of a handful of people trying to manage 4,000 or 10,000 victims.
It is very difficult and very challenging, but the flip side of that is that, if they are UK-based, we have a much better chance of getting hold of them, so we are seeing a lot more arrests for those cross-hybrid threats, which is a positive. There is definitely an emerging cohort that then starts to blend in with threats like Southport and violence-fixated individuals. There seems to be a real mishmash of online threat coming together and then separating in a way that we have never seen historically. That is a real change in the UK threat that is driving a lot of policing activity.
Turning to your ransomware question, what is interesting, in terms of the kinds of organisations that are impacted by ransomware, is that a lot of the ransomware actors do not want to come to notice for hitting critical national infrastructure. They do not want to do a Colonial Pipeline. They do not want to be taking out hospitals and the NHS. They know they will not get paid if they hit UK critical national infrastructure, for starters, so there is a disincentive, but they also do not want that level of Government or law enforcement attention.
Think of the disruptive effect that the UK NCA and policing had on LockBit the year before last. LockBit went from being the No. 1 ransomware strain globally to being out of the top 10 and struggling to come back. We saw a real fragmentation of the ransomware market after that. No dominant strain or group has emerged to fill that space. A lot of the groups that are coming into that space may be a bit less skilled, sophisticated and successful.
The overall threat to organisations is pretty much the same. The volume is the volume, but it is probably less CNI and more smaller organisations because they are more vulnerable and it is less likely to play out very publicly than if there is a big impact on the economy or critical national infrastructure. As such, there is probably not the level of impact in the areas that people would expect, notwithstanding some of the really high-profile incidents we had last year.
David Chadwick
Q
DCS Andrew Gould: That is a really good question. The international jurisdiction challenge for us is huge. We know that is where most of the volumes are driven from, and obviously we do not have the powers to just go over and get hold of the people we would necessarily want to. You will not be surprised to hear that it really varies between jurisdictions. Some are a lot more keen to address some of the threats emanating from their countries than others. More countries are starting to treat this as more of a priority, but it can take years to investigate an organised crime group or a network, and it takes them seconds to commit the crime. It is a huge challenge.
There are two things that we could do more of, and better—these are things that are in train already. If you think about the wealth of cyber-crime, online fraud and so on, all the data and a lot of the skills and expertise to tackle it sit within the private sector, whereas in law enforcement we have the powers to take action to address some of it.
With a recent pilot in the City funded by the Home Office, we have started to move beyond our traditional private sector partnerships. We are working with key existing partners—blockchain analytics companies or open-source intelligence companies—and we are effectively in an openly commercial relationship; we are paying them to undertake operational activity on our behalf. We are saying, "Company A, B or C, we want you to identify UK-based cyber-criminals, online fraudsters, money laundering and opportunities for crypto-seizure under the Proceeds of Crime Act 2002". They have the global datasets and the bigger picture; we have only a small piece of the puzzle. By working with them jointly on operations, they might bring a number of targets to us, and we can then develop that into operational activity using some of the other tools and techniques that we have.
It is quite early days with that pilot, but the first investigation we did down in the south-east resulted in a seizure of about £40 million-worth of cryptocurrency. That is off a commercial contract that cost us a couple of hundred grand. There is potential for return on investment and impact as we scale it up. It is a capability that you can point at any area of online threat, not just cyber-crime and fraud, so there are some huge opportunities for it to really start to impact at scale.
One of the other things we do in a much more automated and technical way—again funded by the Home Office—is the replacement of the Action Fraud system with the new Report Fraud system. That will, over the next year or so, start to ingest a lot of private sector datasets from financial institutions, open-source intelligence companies and the like, so we will have a much broader understanding of all those threats and we will also be able to engage in takedowns and disruptions in an automated way at scale, working with a lot of the communication service providers, banks and others.
Instead of the traditional manual way we have always been doing a lot of that protection, we can, through partnerships, start doing it in a much more automated and effective way at scale. Over time, we will be able to design out and remove a lot of the volume you see impacting the UK public now. That is certainly the plan.
The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Kanishka Narayan)
Q
DCS Andrew Gould: I love the fact that you have heard of it. One of the things that we struggle with is promoting a lot of these initiatives. Successive Governments actually deserve a lot of credit for the range of services that are provided. We aspire to be a global cyber-power, and in many ways we are. When you look at the range of services, tools, advice and guidance that organisations or the public can get, there is quite a positive story to tell there. I think we struggle to bring that into one single narrative and promote it, which is a real challenge. People just do not know that those services are there.
For those who are not familiar with Police CyberAlarm, it is a Home Office-funded policing tool focused on small and medium-sized organisations that probably do not have the skills or understanding to protect themselves as effectively. They can download that piece of software, and it will sit on their external networks and monitor for attacks. For the first time, it helps us in policing to build a domestic threat picture for small and medium-sized organisations, because everybody has a different piece of the puzzle. GCHQ has great insight into what is coming into the UK infrastructure, but it obviously cannot monitor domestically. Big organisations that provide cyber-security services and monitoring know what is impacting their clients or their organisation, but not everybody else. In policing, we get what is reported, which is a tiny piece of the puzzle. So everyone has a different bit of the jigsaw, and none of it fits together, and, even if it did, there would still be gaps. For SMEs, that is a particular gap.
For us, we get the threat intelligence to drive our operational activity, which has been quite successful for us. The benefit for member organisations—we are up to about 12,000 organisations at the moment, which are mostly schools, because we know that they are the most vulnerable to attack for a variety of reasons—is that the free tool can do the monthly vulnerability scans and assessments. So they are getting a report from the police that tells them what they need to fix and what they need to patch.
We do not publicly offer a live monitoring service, because we would not want the liability and responsibility, and we do not have the infrastructure to run that scale of security operation centre. But, in effect, that is actually what we have been doing for a long time—maybe not 24/7, but most of the time—because we have been able to identify precursor activity to ransomware attacks on schools or other organisations, and have been able to step in and prevent it from happening. There have been instances where officers have literally got in cars and gone on a blue light to organisations to say, "You need to shut some stuff off now, because you are about to lose control of your whole organisation."
To that extent, it has been really impactful, but the challenge for us is how to scale. How do you scale so that people understand that it is there? How do you make it easier for organisations to install? That is one of the things that we are working on at the moment, so that everybody can benefit from the scans and the threat reporting, and we can benefit from a bigger understanding of what is going on.
The flip side of the SME offer from our point of view is our cyber-resilience centres. By working with some of the top student talent in the country, we can scale to offer our member organisations across the country the latest advice and guidance, help them understand what the NCSC advice and guidance is, and then help them to get the right level of security policies, patch their systems and all that kind of thing. It helps them to take the first steps on their cyber-resilience journey, and hopefully be more mature consumers of cyber-security industry services going forward. We are helping to create a market for growth, but also helping those organisations to understand their specific vulnerabilities and improve from a very base level.
Bradley Thomas
Q
DCS Andrew Gould: That is another really good question. Generally, it is financial, but you will often get what is called the double dip: the extraction of data as well as the encryption of it, so that you no longer have access to it. They might take that data as well, primarily personal data, because of the regulatory pressures and challenges that that brings. There is a sense among a lot of criminal groups that, if they have personal data, you are more likely to pay, because you do not want that reputation, embarrassment and all the rest of it, as opposed to if they take intellectual property, for example. But that is not to say that does not happen as well. Primarily, it is financial gain.
Chris Vince
Q
DCS Andrew Gould: It is a tricky one. It feels like the technology change is getting ever faster and ever more challenging, but I first went into cyber-crime in the Met back in 2014, and we are giving the same advice now as we were giving then. Sometimes your head can explode with the technical complexity of it, but a lot of the solution just comes down to doing the really boring basics in a world-class way. It is things like patching and doing your software updates. Whether you are a member of the public or running an organisation, finding a way to do those updates and patches means that 50% of the threat has gone, there and then. With something like multi-factor authentication, it seems like most organisations do not want to inconvenience their staff or customers by putting it in place, but that would be another 40% of the problem solved. It is not infallible—nothing is—but if you are thinking about how attacks are still successful, it is pretty basic: a lot of our protections are not in place. Solving that means that 90% of the threat is gone, there and then. That then leaves the 10% of more sophisticated threats—let’s make the criminals work a bit harder.
The Chair
Order. That brings us to the end of the time allotted for the Committee to ask questions. I thank the witness for his evidence.
Examination of Witness
Richard Starnes gave evidence.
The Chair
We will now hear oral evidence from Richard Starnes, chair of the information security panel for the Worshipful Company of Information Technologists. We have until 4 pm for this session.
Q
Richard Starnes: The question about effectiveness is difficult to answer. There is the apparent effectiveness and the actual effectiveness. The reason I answer in that way is that you have regulators operating in environments where they may choose not to publicly disclose how they are regulating; it may be classified due to the nature of the company that was compromised, or who compromised the company. There may not necessarily be a public view of how much of that regulation is actually going on. That is understandable, but it has the natural downside of creating instances where somebody is being taken to task for not doing it correctly, yet that is not exposed to the rest of the world. You do not know that it is happening, so the deterrent effect is not there.
Information sharing and analysis centres started in the United States 20 or 25 years ago, when different companies were in the same boat. The first one that I was aware of was the Financial Services ISAC, which comprises large entities—banks, clearing houses and so on—that share intelligence about the types of attacks that they are receiving internationally. They may be competing with one another in their chosen businesses, but they are all in the same boat with regard to being attacked by whatever entities are attacking them. Those have been relatively good at helping develop defences for those industries.
Q
Richard Starnes: Yes. We have FS-ISAC operating in the United Kingdom and in Europe, with all the major banks, but if you took this and replicated it on an industry-by-industry basis, particularly ones in CNI, that would be helpful. It would also help with information sharing with entities like NCSC and GCHQ.
David Chadwick
Q
Richard Starnes: On what you say about the 18-month tenure, one of the problems is stress. A lot of CISOs are burning out and moving to companies that they consider to have boards that are more receptive to what they do for a living. Some companies get it. Some companies support their CISOs, and maybe have them reporting in parallel to the CIO, or chief information officer. A big discussion among CISOs is that having a CISO report to a CIO is a conflict of interest. A CISO is essentially a governance position, so you wind up having to govern your boss, which I would submit is a bit of a challenge.
How do we help CISOs? First, with stringent application of regulatory instruments. We should also look at or discuss the idea of having C-level or board-level executives specifically liable for not doing proper risk governance of cyber-security—that is something that I think needs to be discussed. Section 172 of the Companies Act 2006 states that you must act in the best interests of your company. In this day and age, I would submit that not addressing cyber-risk is a direct attack on your bottom line.
Dr Gardner
Q
Richard Starnes: I think this should flow from the board to the C-level executives. Most boards have a risk committee of some sort, and I think the chair of the risk committee would be a natural place for that responsibility to sit, but there has to be somebody who is ultimately responsible. If the board does not take it seriously, the C-levels will not, and if the C-levels will not, the rest of the company will not.
Dr Gardner
Q
Richard Starnes: That is a very broad question.
Dr Gardner
I know, sorry. I collapsed it down from quite a few.
Richard Starnes: There is any number of different reasons. You have 12 competent authorities, at last count, with varying funding models and access to talent. Their effectiveness could vary quite a bit, depending on those factors. I am not really sure how to answer that question.
Dr Gardner
Q
Richard Starnes: True, but I would submit that under the Companies Act that liability is already there for all the directors; it just has not been used that way.
Emily Darlington
Q
Richard Starnes: You just stepped on one of my soapbox issues. I would like to see the code of practice become part of the annual Companies House registrations for every registered company. To me, this is an attestation that, “We understand cyber-security, we’ve had it put in front of us, and we have to address it in some way.”
One of the biggest problems, which Andy talked about earlier, is that we have all these wonderful things that the Government are doing with regard to cyber-security, down to the micro-level companies, but there are 5.5 million companies in the United Kingdom that are not enterprise-level companies, and the vast majority of them have 25 employees or fewer. How do we get to these people and say, “This is important. You need to look at this”? This is a societal issue. The code of practice and having it registered through Companies House are the way to do that. We need to start small and move big. Only 3% of businesses are involved in Cyber Essentials, which is just that: the essentials. It is the baseline, so we need to start there.
David Chadwick
Q
Richard Starnes: Throughout my career, I have been involved in cyber incidents from just about day one. One of the biggest problems that you run into in the first 72 hours, for example, is actually determining whether you have been breached. Just because it looks bad does not mean it is bad. More often than not, you have had indicators of compromise, and you have gone through the entire chain, which has taken you a day, or maybe two or three days, of very diligent work with very clever people, to determine that, no, you have not been breached; it was a false positive that was difficult to track down. Do you want to open the door to a regulator coming in and then finding out it is a false positive?
You are also going to have a very significant problem with the number of alerts that you get with a 24-hour notification requirement, because there is going to be an air of caution, particularly with new legislation. Everybody and his brother is going to be saying, "We think we've got a problem." Alternatively, if they do not, then you have a different issue.
The Chair
If there are no further questions, I thank our witness for his evidence. I will suspend the Committee for a few minutes because our next witnesses, who will give evidence online, are not ready yet.
The Chair
We will now hear oral evidence from Brian Miller, head of IT security and compliance, and Stewart Whyte, data protection officer, both from NHS Greater Glasgow and Clyde and joining us online. For this session we have until 4.20 pm. Will the witnesses please introduce themselves for the record?
Brian Miller: Good afternoon, Chair. It is nice to see you all. I am Brian Miller and I head up IT security and compliance at NHS Greater Glasgow and Clyde. It is a privilege to be here, albeit remotely. I have worked at NHS Greater Glasgow and Clyde for four years. Prior to that, I was infrastructure manager at a local authority for 16 years and I spent 10 years at the Ministry of Defence in infrastructure management. I look at the Bill not only through the lens of working with a large health board, but from a personal perspective with a philosophy of “defenders win” across the entire public sector.
Stewart Whyte: Good afternoon, Chair, and everyone. My name is Stewart Whyte and I am the data protection officer at NHS Greater Glasgow and Clyde. I am by no means a cyber-security expert, but hopefully I can provide some insight into the data protection side and how things fit together.
Q
Brian Miller: That is a good question. Some of our colleagues mentioned the follow-up secondary legislation that will help us to identify those kinds of things. I suppose it is no different from where we are now. We would look at any provision of services from a risk management perspective and ask what security controls apply. For example, would they be critical suppliers in terms of infrastructure and cyber-security? Does a cleaning service hold identifiable data? What are the links? Is it intrinsically linked from a technological perspective?
I mentioned looking at this through a “defenders win” lens. Yes, some of these technologies are covered. I saw some of the conversations earlier about local authorities not being in scope, but services are so intrinsically linked that they can well come into scope. It might well be that some of the suppliers you mentioned fall under the category of critical suppliers, but that might be the case just now. There might be provision of a new service for medical devices, which are a good example because they are unique and different compliance standards apply to them. For anything like that, where we stand just now—outside the Bill—we risk assess it. There is such an intrinsic link. A colleague on another panel mentioned data across the services; that is why Stewart is here alongside me. I look after the IT security element and Stewart looks after the data protection element.
Q
Brian Miller: Sometimes, but sometimes not. I do not think we had any physical links with Synnovis, but it did work on our behalf. Emails might have been going back and forward, so although there were no physical connections, it was still important in terms of business email compromise and stuff like that—there was a kind of ancillary risk. Again, when things like that come up, we would look at it: do we have connections with a third party, a trusted partner or a local authority? If we do, what information do we send them and what information do we receive?
Chris Vince
Q
Stewart Whyte: Anything that increases or improves our processes in the NHS for a lot of the procured services that we take in, and anything that is going to strengthen the framework between the health board or health service and the suppliers, is welcome for me. One of our problems in the NHS is that the systems we put in are becoming more and more complex. Being able to risk assess them against a particular framework would certainly help from our perspective. A lot of our suppliers, and a lot of our systems and processes, are procured from elsewhere, so we are looking for anything at all within the health service that will improve the process and the links with third party service providers.
Dr Gardner
Q
Brian Miller: That is a great question. I will touch on some different parts, because I might have slightly different information from some of the information you have heard previously. On reporting—Stewart will deal with the data protection element for reporting into the Information Commissioner’s Office—we report to the Scottish Health Competent Authority. It is important that we have an excellent relationship with the people there. To put that in context, I was speaking to them yesterday regarding our transition to the CAF, as part of our new compliance for NHS Greater Glasgow and Clyde. If there was a reportable incident, we would report into the SHCA. The thresholds are really well defined against the confidentiality, integrity and availability triad—it will be patient impact and stuff like that.
Organisationally, we report up the chain to our director of digital services, and we have an information governance steering group. Our senior information risk officer is the director of digital, and the chief information security officer role sits with our director of digital. We report nationally, and we work really closely with National Services Scotland's Cyber Security Centre of Excellence, which does a lot of our threat protection and security operations, 24/7, 365 days a year. We work with the Scottish Government through the Scottish Cyber Co-ordination Centre and what are called CREW—cyber resilience early warning—notices for a lot of threat intelligence. If something met the threshold, we would report to the SHCA. Stewart, do you want to come in on the data protection side?
Stewart Whyte: We would report to the Information Commissioner, and within 72 hours we also report to the Scottish Government information governance and data protection team. We would risk assess the breaches and determine whether they meet the threshold for reporting. Not every data breach is required to be reported.
From the reporting perspective, it would be helpful to report into one individual organisation. I noticed that in the reporting requirements we are looking at doing it within 24 hours, which could be quite difficult, because sometimes we do not know everything about the breach within that time. We might need more information to be able to risk assess it appropriately. Making regulators aware of the breach as soon as possible is always going to be a good thing.
Lincoln Jopp
Q
Brian Miller: We would work with the Scottish Health Competent Authority as our regulator; I cannot speak for other regulators and what that might look like. We are doing work on what assurance for critical suppliers outside the Bill looks like just now, and we are working across the boards in Scotland on identifying critical suppliers. Outside of that, for any suppliers or any new services, we will assess the risk individually, based on the services they are providing.
The Bill is really valuable for me, particularly when it comes to managed service provision. One of the questions I was looking at is: what has changed since 2018? The biggest change for me is that identity has gone to the cloud, because of video conferencing and stuff like that. When identity went to the cloud, it then involved managed service providers and data centres. We have put additional controls around that, because the network perimeter extended out into the cloud. We might want to take advantage of those controls for new things that come online, integrating with national identity, but we need to be assured that the companies integrating with national identity are safe. For me, the Bill will be a terrific bit of legislation that will help me with that—if that makes sense.
Lincoln Jopp
Q
Brian Miller: I think we would work with the regulator, but we are looking for more detail in any secondary legislation that comes along. We have read what the designation of critical suppliers would be. I would look to work with the Scottish Health Competent Authority and colleagues in National Services Scotland on what that would look like.
Stewart Whyte: On how we would make that decision, from our perspective we are looking at what the supplier is providing and what sort of data they are processing on our behalf. From the NHS perspective, 90% of the data that we process will be special category, very sensitive information. It could be that, from our side, a lot of the people in the supply chain would fall into that designation, but for some other sectors it might not be so critical. We have a unique challenge in the NHS because of the service we provide, the effect that cyber-crime would have on our organisations, and the sensitivity of the data we process.
Q
Stewart Whyte: For me, it would be a slightly different assessment from Brian’s. We would be looking at anything where there is no processing of personal data. For me, that would not be a critical supplier from a data protection perspective. But there might be some other integration with NHS board systems that Brian might have concerns about. There is a crossover in terms of what we do, but my role is to look at how we manage data within the NHS. If there are suppliers where there is no involvement with identifiable data of either staff or patients, I would not see them as a critical supplier under this piece of legislation.
Lincoln Jopp
Q
Brian Miller: I do not want to step out of my lane. There will be clinical stuff that absolutely would be essential. I would not be able to speak in any depth on that part of it; I purely look at the cyber element of it. As an organisation, we would be identifying those kinds of aspects.
In terms of suppliers, you are absolutely right. We have suppliers that supply some sort of IT services to us. If we are procuring anything, we will do a risk assessment—that might be a basic risk assessment because it is relatively low risk, it might be a rapid risk assessment, or it may be a really in-depth assessment for someone we would treat as a critical supplier or could deem essential—but there are absolutely suppliers that would not fall under any of those criteria for the board. The board is large in scale, with 40,000 users. It is the largest health board in the country.
Q
Stewart Whyte: Yes. There is a lot of information sharing between acute services and primary care via integrated systems. We send discharge letters and information directly to GP practices that then goes straight into the patient record with the GP. There is a lot of integration there, yes.
Q
Stewart Whyte: Yes, there is integration between ourselves and the local authorities.
The Chair
If there are no further questions from Members, I thank witnesses for their evidence. We will move on to the next panel.
Examination of Witnesses
Chris Parker MBE and Carla Baker gave evidence.
The Chair
We will now hear oral evidence from Chris Parker, director of government strategy at Fortinet and co-chair of the UK cyber resilience committee at techUK, and Carla Baker, senior director of government affairs in the UK and Ireland at Palo Alto Networks. For this session, we have until 4.50 pm.
Q
Carla, from the Palo Alto Networks perspective, what are your views on the changes to the incident reporting regime under the Bill? Will the approach help or hinder regulators in identifying and responding to the most serious threats quickly?
Chris Parker: I should point out that Carla is also co-chair of the cyber resilience committee, so you have both co-chairs here today.
As large cyber companies, we are very proud of one thing that is pertinent to the sector and may not be clear to everybody outside. I have worked in many sectors, and this is the most collaborative—most of it unseen—and sharing sector in the world. It has to be, because cyber does not respect borders. The most vulnerable organisations—those one would expect cannot afford things, where price is therefore a factor, such as SMEs; I was an SME owner in a previous life—are very dear to us. With the technology that is available, what is really good news is that when people buy cyber-security for their small business—in the UK or anywhere in the world—they are actually buying the same technology; it is effectively just a different engine size in most cases. There are different phases of technology. There is the latest stuff that is coming in, which they may not be getting into yet. However, the first thing to say is that it is a very fair system, and pricing-wise, it is a very fair system indeed for SMEs.
The second point is about making sure we are aware of the amount of free training going on across the world, and most of the vendors—the manufacturers—do that. Fortinet has a huge system of free training available for all people. What does that give? It is not just technical training for cyber-security staff; it is for ordinary people, including administrative workers and the people who are sometimes the ones who let the bad actor in. There are a lot of efforts. There is a human factor, as well as technological and commercial factors.
The other thing I would like to mention is that the cyber resilience committee, which Carla and I are lucky to co-chair, is elected. We have elected quite a large proportion of SME members. There is also a separate committee run by techUK. You heard from Stuart McKean earlier today, and he is one of the co-chairs, or the vice chair, of that committee.
Carla Baker: On incident reporting, as I am sure you are aware, the Bill states that organisations must report an incident if it is
“likely to have an impact”.
Our view, and I think that of techUK, is that the definition is far too broad. Anything that is likely to cause an impact could be a phishing email that an organisation has received. Organisations receive lots and lots of spoof emails.
I will give an example. Palo Alto Networks is one of the largest pure-play cyber-security companies. Our security operations centre—the hub of our organisation—processes something like 90 billion alerts a day. That is just our organisation. Through analysis and automation, the number is whittled down to just over 20,000. Then, through technology and capabilities, it is further whittled down, so that we are analysing about 75 alerts.
You can equate it to a car, for example. If you are driving and see a flashing yellow light, something is wrong. That is like the 20,000 alerts. It is then whittled down to about 75, so we would potentially have to report up to 75 incidents per day, and that is just one organisation. There are a lot more. The burden on the regulator would be massive, because there would be a lot of noise. It would struggle to ascertain what the real problem is—the high-risk incidents that impact the UK as a whole—and the noise would get in the way of that.
We have come up with a suggestion, an amendment to the legislation, that would involve a more tiered approach. There would be a more measurable and proportionate reporting threshold, with three tiers. The first is an incident that causes material service disruption, affecting a core service, a critical customer or a significant portion of users. The second is unauthorised, persistent access to a system. The third is an incident that has compromised core security controls—that is, security systems. Having a threshold that is measurable and proportionate is easier for organisations to understand than referring to an incident that is
“likely to have an impact”,
because, as I said, a phishing email is likely to cause an impact if an organisation does not have the right security measures in place.
David Chadwick
Q
Chris Parker: That is an excellent question. The good news is that a lot is happening already. An enormous amount of collaborative effort is going on at the moment. We must also give grace to the fact that it is a very new sector and a new problem, so everybody is going at it. That leads me on to the fact that the UK has a critical role in this, but it is a global problem, and therefore the amount of international collaboration is significant—not only from law enforcement and cyber-security agencies, but from businesses. Of course, our footprints, as big businesses, mean that we are always collaborating and talking to our teams around the world.
In terms of what the UK can do more of, a lot of the things that have to change are a function of two words: culture and harmonisation—harmonisation of standards. It is about trying not to be too concerned about getting everything absolutely right scientifically, which is quite tempting, but to make sure we can harmonise examples of international cyber-standards. It is about going after some commonality and those sorts of things.
I think the UK could have a unique role in driving that, as we have done with other organisations based out of London, such as the International Maritime Organisation for shipping standards. That is an aspiration, but we should all drive towards it. I think it is something the UK could definitely do because of our unique position in looking at multiple jurisdictions. We also have our own responsibilities, not only with the Commonwealth but with other bodies that we are part of, such as the United Nations.
It is not all good news. The challenge is that, as much as we know harmonisation is what is needed, unfortunately everyone is moving. Things have started, and everyone is running hot. An important point to make is that it is one of the busiest sectors in the world right now, and everybody is very busy. This comes back to the UK keeping a particular eye on regulatory load, versus the other important thing that elements of our society want, which is growth and economic prosperity. We talked earlier about SMEs. They do not have the capability to cover compliance and regulatory load easily, and we would probably all accept that. We have to be careful when talking about things such as designating critical suppliers.
All of this wraps up into increasing collaboration through public-private partnerships and building trust, so that when the Government and hard-working civil servants want to see which boundaries are right to push and which are not, bodies such as the UK cyber resilience committee, which Carla and I are on, can use those collaborative examples as much as possible.
There is quite a lot there, but something the UK certainly should be pushing to do is culture change, which we know has to be part of it—things have been talked about today by various speakers—as well as the harmonisation of standards.
Carla Baker: I think we are in a really interesting and exciting part of policy development: we have the Bill, and we have recently had the Government cyber action plan, which you may have heard about; and the national cyber action plan is coming in a few months’ time. The Government cyber action plan is internally facing, looking at what the Government need to do to address their resilience. The national cyber action plan is wider and looks at what the UK must do. We are at a really exciting point, with lots of focus and attention on cyber-security.
To address your point, I think there are three overarching things that we should be looking at. First is incentivising organisations, which is part of the Bill and will hopefully be a big part of the national cyber action plan. We must incentivise organisations to do more around cyber-security to improve their security posture. We heard from previous panellists about the threats that are arising, so organisations have to take a step forward.
Secondly, I think the Government should use their purchasing power and their position to start supporting organisations that are doing the right thing and are championing good cyber-security. There is more that the Government can do there. They could use procurement processes to mandate certain security requirements. We know that Cyber Essentials is nearly always on procurement tenders and all those types of things, but more can be done here to embed the need for enhanced security requirements.
Thirdly, I think a previous witness talked about information sharing. There is a bit of a void at the moment around information sharing. The cyber security information sharing partnership was set up, I think, 10 years ago—
Chris Parker: Yes, 10 years ago.
Carla Baker: It was disbanded a couple of months ago, and that has left a massive void. How does industry share intelligence and information about the threats they are seeing? Likewise, how can they receive information about the threat landscape? We have sector-specific things, but there isn’t a global pool, and there is a slight void at the moment.
David Chadwick
Q
Chris Parker: It is a national problem. We have had a lot of discussion on that at the techUK cyber resilience committee. We think it is not just about skills and bunging lots of training at people, because you have to look at cyber as a whole. A very small component of cyber is people at the wonderfully high-tech end, where they are coding and writing software. There are an awful lot of jobs out there that a lot of people are just not aware of, and perhaps would therefore not be volunteering for or aiming towards—even at school. There are lots of jobs in cyber sales, marketing and analysis that do not require a very high level of mathematics, for example. Some of them do not need a very high level of mathematics at all. I think that some awareness needs to be built there.
Personally, I would like to see more championing of the people who are in the sector at the moment. We have some fantastic young men and women in the sector, but we also need to make sure they are able to have chartered status. It is out there, now that we are starting, but it needs to gather pace, because we need to make sure these people are represented and feel professional, so that it can be reflected.
Another thing to mention is that there is a lot of effort in the cyber growth partnership, which is run through DSIT and techUK. It is initiating an idea where people will be lent from industry into academia, to offer inspiration but also to improve lecture quality and standards, because things move fast and we are running so hot. It is very hard for academia to keep up. There is quite a lot that can be done to increase the workforce and skills, but going back to our original points, with greater public-private collaboration and discussion, we will get it absolutely right on focusing on the right places to spend resources.
Dr Gardner
Q
Carla Baker: My comment on information sharing was about what else the Government could do. It was not necessarily specifically to do with the Bill. If you want me to elaborate on the wider issue of information sharing, I am happy to.
Dr Gardner
Particularly between regulators, and how that would work.
Carla Baker: I cannot necessarily talk in much detail about information sharing across regulators. It is more about information sharing across the technology industry that I can talk about.
Dr Gardner
Q
I will ask my actual question, and I am trying to get my head around this. You recommend mandating that company boards be accountable for mitigating cyber-risks, and as we know from the annual cyber-security breaches survey, there are declining levels of board responsibility for cyber in recent years, which links to whether there should be a statutory duty. I am a little worried about small and microbusinesses having to deal with that regulatory burden, especially if they are designated as critical suppliers. I am trying to marry those two things together, and the concern of where liability sits, because we are very dependent on service providers. I do not know if that makes any sense to you, but could you clarify my thinking?
Chris Parker: It is a concern. I will start off with a small point about where there is a statutory requirement, certainly for large companies. I personally believe—and I am pretty sure that most industry people I speak to would say this—that it would be very surprising if we did not have cyber-focused people on boards and in much bigger governance, as we would in a financial services company, where people who are expert in financial risk are able to govern appropriately. As we get smaller and smaller in scale, that is much harder to do.
The good news is that there are some brilliant—and I really mean that—resources available from probably the most underused website in the world, but the best one, which is the National Cyber Security Centre website. It has some outstanding advice for boards and governance on there. You can effectively make a pack and write a checklist, even if you are a very small company with a board of two people, and go through your own things and make sure your checklists are there.
The data and the capability are there to give support. Whether they are signposted enough, and whether we are helping at a local level to make sure that people are aware of those things, is perhaps something we could do better at in this country. But I am sure that industry will do its part—and we do—to share and promote things like that website, to guide good governance for SMEs especially.
Carla Baker: That board-level accountability is really important, and it is crucial for cyber-security. I think it is getting better—from the senior execs that I speak to in industry, there is more understanding—but generally speaking, there is a view that cyber-security is an IT issue, not a business issue. I am sure you have heard throughout the day about understanding the risks we have seen around vulnerabilities, and the incidents that have affected the retail or manufacturing sectors. Those are substantial incidents that have impacted the UK and have systemic knock-on effects. Organisations have to understand the serious nature of cyber-security, and therefore put more emphasis on cyber at the board level.
Should we be mandating board-level governance? That is something it is useful for this debate to seek information and input on, but the burden on SMEs has to be risk-based and proportionate, however it is framed.
Dr Gardner
Q
Chris Parker: That is a harder question. There is precedent here—of course, we can think back to the precedents that this great building has set, such as, after the Clapham train disaster, the Corporate Manslaughter and Corporate Homicide Act 2007 putting responsibility very firmly on boards, evolving from the Health and Safety at Work etc. Act 1974. We are not there yet, but do not forget that we are only starting to legislate, as is everyone else in Europe and America on this journey.
I believe that we will see a requirement at some point in the future. We all hope that the requirement is not driven by something terrible, but is driven by sensible, wise methodology through which we can find out how we can ensure that people are liable and accept their liability. We have seen statements stood up on health and safety from CEOs at every office in this country, for good reason, and that sort of evolution may well be the next phase.
Carla and I talk about this a lot, but we have to be careful about how much we put into this Bill. We have to get the important bit about critical national infrastructure under way, and then we can address it all collaboratively at the next stage to deal with very important issues such as that.
Lincoln Jopp
Q
Chris Parker: I was referring to strategic and critical suppliers, which is a list of Government suppliers. Our point is that the level of governance and regulatory requirement inside an organisation is demanding—it really is. It requires quite a lot of work and resource, and if we are putting that on to too small a supplier, on the basis that we think it is on the critical path, I would advocate a different system for risk management of that organisation, rather than it being in the regulatory scope of a cyber-resilience Bill. The critical suppliers should be the larger companies. If we start that way in legislation and then work down—the Bill is designed to be flexible, which is excellent—we can try to get there.
As a last point on flexibility—this is perhaps very obvious to us but less so to people who are less aware of the Bill—there is a huge dynamic going on here where you have a continuum, a line, at one end of which you have the need for clarity, which comes from business. At the other you have a need for flexibility, which quite rightly comes from the Government, who want to adjust and adapt quite quickly to secure the population, society and the economy against a changing threat. That continuum has an opposing dynamic, so the CRB has a big challenge. We must therefore not be too hard on ourselves in finding exactly where to be on that line. Some things will go well, and some will just need to be looked at after a few years of practice—I really believe that. We are not going to get it all right, because of the complexities and different dynamics along that line.
Carla Baker: This debate about whether SMEs should be involved or regulated in this space has been around since we were discussing GDPR back in 2018. It comes down to the systemic nature of the supplier. You can look at the designation of critical dependencies. I am sure you have talked about this, but for example, an SME software company selling to an energy company could be deemed a critical supplier by a regulator, and it is then brought into scope. However, I think it should be the SMEs that are relevant to the whole sector, not just to one organisation. If they are systemic and integral to a number of different sectors, or a number of different organisations within a sector, it is fair enough that they are potentially brought into scope.
It is that risk-based approach again. But if it is just one supplier, one SME, that is selling to one energy company up in the north of England, is it risk-based and proportionate that they are brought into scope? I think that is debatable.
Andrew Cooper (Mid Cheshire) (Lab)
Q
I can imagine that the legislation has been worded as it is to try to capture that situation where activity might occur, but not have an impact. Would you accept that that is important, and how would that fit in with the tiered approach that you described?
Carla Baker: I completely get your point. We have looked at that; my legal colleagues have looked at things such as spyware, where you have malware in the system that is not doing anything but is living there, or pre-positioned threats, where they are waiting to launch an attack, and we think this amendment would still cover those scenarios. It is not necessarily about cause and impact: the lights have not gone out, but if there is, for example, a nation state actor in your network, we think the amendment would still cover that.
Q
Chris Parker: Yes, absolutely.
Carla Baker: Yes, completely. That is similar to my point, which was probably not explained well enough: how you are deemed critical should be more about your criticality to the entire ecosystem, not just to one organisation.
Q
Carla Baker: I think that is part of the issue about not having clear criteria for how regulators will designate. That also means that different regulators will take different approaches, so we would welcome more clarity and early consultation around the criteria that regulators will use to designate a critical dependency. That would prevent different regulatory approaches across the 12 different regulators, which we obviously do not want, and would give greater harmonisation and greater clarity for organisations to know, "Okay, I might be brought in, because those are the clear criteria the Government will be using."
David Chadwick
Q
Chris Parker: The consultation has been a best effort and I think it is a best effort as a function of three things. First, we have a new sector, a new Bill—something very new, it is not repeating something. Secondly, we are doing something at pace, it is a moving target, we have to get on with this, and so there is some compulsion involved. Thirdly, there are already some collaborative areas set up, such as techUK, that have been used. Would I personally have liked to have seen more? Yes—but I am realistic about how much time is needed; when you only have a certain resource, some people have got to do some writing and crafting as well as discussing.
One thing that we could look at, if we were to do the process again, would be more modelling, exercising and testing of the Bill until it shakes a bit more. With the Telecommunications (Security) Act 2021, that was done at length and collaboratively with industry, on a nearly fortnightly basis, for some time. Beyond that, I think that we are realistic in industry, because we understand the pressures on the people trying to bring legislation in. A second point to remember is that we are all volunteers. Carla and I, and all those on the cyber resilience committee, volunteer away from our day jobs—which are busy—to do all this. There is a realistic expectation, if you like—but I would say there has been a best effort.
Carla Baker: I would like to look to the future. We have all the secondary legislation that is coming—and there will be a lot—so we recommend early sight, and time to review and consult, in order to provide the industry insight that we are happy to provide. Let us look to the secondary legislation and hope that there is good consultation there.
The Chair
If there are no further questions from Members, I will thank the witnesses for their evidence. We will now move on to our final panel.
Examination of Witness
Kanishka Narayan MP gave evidence.
The Chair
We will now hear oral evidence from the Minister for AI and Online Safety, Kanishka Narayan. For this session, we have until 5.10 pm.
Q
Kanishka Narayan: Thank you for the question on definitions. I have two things to say on that. First, observing the evidence today, it is interesting that there are views in both directions on pretty much every definitional question. For example, on the definition of “incident thresholds”, I heard an expert witness at the outset of the day say that it is in exactly the right place, precisely because it adds incidents that have the capability to have an impact, even if not a directness of impact, to cover pre-positioning threats. A subsequent witness said that they felt that that precise definitional point made it not a fitting definition. The starting point is that there is a particular intent behind the definitions used in the Bill, and I am looking forward to going through it clause by clause, but I am glad that some of those tensions have been surfaced.
Secondly, in answer to your question on consultation, a number of the particular priority measures in the Bill were also consulted on under the previous Government. We have been engaging with industry and, in the course of implementation, the team has started setting up engagement with regulators and a whole programme of engagement with industry as well.
Q
Kanishka Narayan: I have met a number of companies, but the relevant Minister has also had extensive engagement with both companies and regulators, including on the question of definitions. I do not have a record of her meetings, but if that is of interest, I would be very happy to follow up on it.
Q
Kanishka Narayan: I am referring to the Minister for Digital Economy, who is in the other place.
Q
Kanishka Narayan: I have had some meetings but, as the Minister in charge of this Bill, she has been very engaged with businesses, so I think that is fitting. We have obviously worked very closely together, as we normally do, in the course of co-ordinating across the two Chambers.
Q
Kanishka Narayan: I have spoken to the Secretary of State about the Bill, including the reserve powers, and we have agreed that the policy objective is very clear. I do not think I am in a position to divulge particular details of policy discussions that we have had; I do not think that would be either appropriate or a fitting test of my memory.
Q
Kanishka Narayan: I think the guardrails in the Bill are very important, absolutely. The Bill provides that, where there is an impact on organisations or regulators, there is an appropriate requirement for both deep consultation and an affirmative motion of the House. I think that is exactly where it ought to be, and I do not think anything short of that would be acceptable.
Chris Vince
Q
Kanishka Narayan: The primary thing to say is that the range of organisations—commercial ones as well as those from the cyber-security world more generally—coming out to welcome the Bill is testament to the fact that it is deeply needed. I pay tribute to the fact that some of the provisions were engaged on and consulted on by the prior Government, and there is widespread consensus across industry and in the regulatory and enforcement contexts about the necessity and the quality of the Bill. On that front, I feel we are in a good place.
On specific questions, of course, there is debate—we have heard some of that today—but I am very much looking forward to going through clause by clause to explain why the intent of the Bill is reflected in the particular definitions.
Bradley Thomas
Q
Kanishka Narayan: I am shy of making comments on specific incidents, but as a broad brush, clearly the food supply or automotive manufacturing sectors are not directly in scope of the Bill, for reasons I am very much happy to discuss.
Bradley Thomas
Q
Kanishka Narayan: Let me place the focus of this Bill in the global context. As we have heard, there is a range of legislative as well as non-legislative measures on cyber-security. It is deeply important that every organisation, whether in scope of the Bill or not, acts robustly, and we will look at that, not least through the cyber action plan, which I know industry welcomed earlier today and which we are looking forward to publishing very soon.
The particular focus of this Bill is on essential services, the disruption of which would pose an imminent threat—for example, to life and to our economy—in the immediate context. For reasons that we can dive into, if you look at a market such as food supply, the diversity, competitive nature and alternative provision in that market are so obvious that to designate it as fitting the definitional scope I have just highlighted would not be an evidence-led way of engaging.
Bradley Thomas
Q
Kanishka Narayan: As I have said, this legislative vehicle is focused on really high standards of rigour for essential services. I am very keen to ensure that, in the first instance, we are engaging with those companies through the cyber action plan and the National Cyber Security Centre’s framework and to ensure that, as a consequence of those, they are in a robust place.
Bradley Thomas
Q
Kanishka Narayan: This is a great question. There are two things on my mind. One is that the Government have published a cyber action plan, the crux of which is to make sure that, from the point of view of understanding, principles, accountability and, ultimately, skills, there is significant capability in the public sector. The second thing to say is that we have a very broad-based plan on skills more generally across the cyber sector, public and private. For example, I am really proud of the fact that, through the CyberFirst programme, some—I think—415,000 students right across the country have been upskilled in cyber-security. It is deeply important that the public sector ensures that we are standing up to the test of hiring them and making the attraction of the sector clear to them as well. There is a broad-based plan and a specific one for the public sector in the Government context.
Tim Roca
Q
Kanishka Narayan: That is a great question. Broadly, the Bill takes a risk-based and outcomes-focused approach, rather than a technology-specific one. I think that is the right way to go about it. As we have heard today and beyond, there are some areas where frontier technology—new technology such as AI and quantum, which we talked about earlier today—will pose specific risks. There are other areas where the prevalence of legacy systems and legacy database architectures will present particular risks as well.
The Bill effectively says that the sum total of those systems, in their ultimate impact on the risk exposure of an organisation, is where regulators should place their emphasis. I would expect individual regulators to pay heed to the prevalence of legacy systems and technical debt as a source of risk in their particular sectors, and to the mitigations that ought to be put in place as a result. I think that being technology agnostic is the right approach in this context.
Lincoln Jopp
Q
Kanishka Narayan: Do you mean operators of essential services, or critical suppliers, as in the third party element?
Lincoln Jopp
I meant operators of essential services.
Kanishka Narayan: The Bill effectively specifies operators of essential services as large participants in the essential services sectors. I think that that definition is very straightforward. The hospital in this question would be an operator of an essential service. If the question extends to critical third party suppliers—
Lincoln Jopp
Q
Kanishka Narayan: There are two things to say on this. There is at least a four-step test on the face of the Bill for what would qualify as a critical supplier. First, a critical supplier has to supply to an operator of an essential service, in this case the hospital. Secondly, the supplier itself must engage with important network and information systems. Thirdly, the disruption to that third party supplier would have to cause a material disruption to the operator in question—in this case, if the third party supplier falls over from a cyber-security point of view, there would be material and business continuity disruption to the hospital. Fourthly, not only that, but that disruption would have to be sufficiently severe in its impact to be in scope. That is one set of things. Underlying that is a further test in the Bill, whereby alternative provision of that third party supply could not be secured in a practicable way. The combination of those tests means that the scope set out for the critical third party suppliers is extremely tight and robust.
Then there is still the question, having gone through that five-step test, of the particular burden placed on relevant suppliers in scope. My expectation and hope would be that regulators take a much more proportionate approach there than to set the precise same conditions on those suppliers as they do on the operator in question; in particular, that the burden on them is placed specifically in sight of the directional risk that they pose to the operator, rather than the risk in sum for that third party supplier.
The first thing is therefore that the Bill clearly specifies a very tight scope. The second is that it does not seem to me, as a relative novice to both the medical world and cyber-security, unusual to have a specification of this nature in a Bill. Given my professional context, I am particularly conscious of the very clear, comparable critical third party requirement in the Financial Services and Markets Act 2000, which focuses on both cyber-security and supply chain risks. That has worked relatively well in that context, so I hope that there are some good lessons to learn from it.
Lincoln Jopp
Q
Kanishka Narayan: The way in which I would envisage it is that each individual regulator assesses the critical nature of the risk posed to its regulated operators. If a hospital has a third party supplier, and the presence and nature of its supply means that there is a critical risk exposure for the hospital, that would be in scope for some degree of regulation in the Bill. To your question, if there is a comparable but separate hospital in a part of England that is separately regulated, but has the same third party supplier, there is obviously a question of whether that third party supplier would end up being regulated twice if the criticality threshold is met. In that instance, and in other similar instances of multiple regulators covering the same third party supplier, I would expect a high degree of co-ordination. In fact, the provisions in the Bill, as well as my hopes for subsequent guidance, are focused on our efficiency and proportionality when there are multiple regulators. However, I think the assessment has to be undertaken by each regulator on a separate basis, because the question being assessed is not the nature, the sum risk, of the third party supplier in itself, but the risk posed by its relationship to the operator it is providing to—if that makes sense.
Lincoln Jopp
Q
Kanishka Narayan: Yes, I guess, added together in the sense that they would be separately regulated, but they would all come within the scope of the regulations. Where there is an overlap in the party being regulated, my hope is that the Bill provides for individual regulation, but is very much open to the prospect of a lead regulator engaging in a softer way with the other regulators, as long as each regulator feels that that has assured them of the risk.
Andrew Cooper
Q
We have heard quite a bit about how important it will be, if we are taking a sectoral approach, to make sure that sharing information between regulators works smoothly and that there are no information silos. The witness from Ofcom talked about an annual report to the National Cyber Security Centre. That sent chills down my spine, though I am sure she did not mean it quite in that way. How will you ensure that there is an adequate flow of information between regulators in a timely manner? A regulator might not realise that a piece of information has cross-sectoral relevance, but when it is shared with another regulator, it might turn out that it does. How do you address the importance of a single point of reporting, which we heard about time and again from witnesses today?
Kanishka Narayan: Those are really important points. In terms of supporting the quality, frequency and depth of information sharing, first, the Bill provides the legal possibility of doing that in a deeper way. It gives the permission and the ability to do that across regulators.
Secondly, in the light of the implicit expectation of that information sharing, the National Cyber Security Centre already brings together all the relevant regulators for deeper conversation and engagement on areas of overlap, best practice sharing, and particularly the sharing of information related to incidents and wider risk as a result. I hope that will continue to be systematic.
On the question of a single reporting avenue, the National Cyber Security Centre, from an incident and operational point of view, is clearly the primary and appropriate location during the implementation of the Bill. From my conversations with the centre and its conversations with the regulators, I know there has been engagement to ensure that it remains a prompt venue for regulators to feed in their information.
Andrew Cooper
Q
Kanishka Narayan: The Bill currently says to regulators, “We are now giving you the power to share information.” The Bill, as well as other specific pieces of wider legislation, places clear expectations on regulators to carry out their regulatory duty. If there appears to be a challenge in the frequency and quality of information sharing, we will of course look at whether we need to go further, but at the moment, the combination of that substantive permission and the clear regulatory responsibilities they each hold is a very powerful one.
David Chadwick
Q
Kanishka Narayan: As I mentioned at the outset, the scope of the sectors is focused on a specific test: are they essential services, the disruption to which could cause an immediate threat to life or have an extremely significant impact on the day-to-day functioning of the country? I do not mean to diminish the significance of electoral services, but, notwithstanding their significant impact on me as a candidate on election day, the test does not appear to be met.
David Chadwick
Q
Kanishka Narayan: It is absolutely critical that boards take their responsibilities to the organisation and the consequences of being in a regulated sector very seriously. The scope of the Bill has been mentioned. The Secretary of State wrote to FTSE 350 businesses, as well as a range of small businesses, to make that point very clear. The cyber assessment framework has particular requirements for boards to take their cyber-security responsibilities seriously. In the course of implementing the Bill and in the secondary legislation process, we will look to ensure that specified security and resilience activities, including the possibility of specific responsibilities, are set out very clearly.
Dr Gardner
Q
Kanishka Narayan: It is an important point. We know that the quality of current regulation for cyber-security varies across regulators. As an earlier panellist said, there is virtue in the fact that we have not set an effective cap on where regulators can go by having a single standard. At the same time, we need to make sure that we are raising a consistent floor of quality and proportionality judgments.
First, there is obviously constant oversight of each regulator through the lead Departments. In my case, for example, we consistently engage with Ofcom on a range of areas, including this one, to ensure the quality of regulation and that the proportionality judgment is appropriately applied. Secondly, there is a clear commitment in the Bill for the Secretary of State to report back, on a five-year basis, on the overall implementation of the regime proposed in the Bill. That will be the point at which we can get a global view of how the whole system is working.
The Chair
That brings us to the end of the time allotted for the Committee to ask questions, and to the end of the sitting. On behalf of the Committee, I thank the Minister for his evidence.
Ordered, That further consideration be now adjourned. —(Taiwo Owatemi.)