Cyber Security and Resilience (Network and Information Systems) Bill (Second sitting) Debate
Public Bill Committees
Chris Vince (Harlow) (Lab/Co-op)
Q
Stuart Okin: In the energy sector, we tend to use operational technology rather than IT systems. That might mean technology without a screen, so an embedded system. It is therefore important to be able to customise our guidance. We do that today. We use the cyber assessment framework as a baseline, and we have a 335-page overlay on our website to explain how that applies to operational technology in our particular space. It is important to be able to customise accordingly; indeed, we have added physical elements to the cyber assessment framework, which is incredibly important. We welcome that flexibility being maintained in the Bill.
Ian Hulme: Just to contrast with colleagues from Ofcom and Ofgem, the ICO’s sector is the whole economy, so it is important that we are able to produce guidance that speaks to all the operators in that sector. Because our sector is much bigger, we currently have something like 550 trust service providers registered, and that will grow significantly with the inclusion of managed service providers. So guidance will be really important to set expectations from a regulatory perspective.
Natalie Black: To round this off, at the end of the day we always have to come back to the problem we are trying to solve, which is ensuring cyber-security and resilience. As you will have heard from many others today, cyber is a threat that is always evolving. The idea that we can have a stagnant approach is for the birds. We need to be flexible as regulators. We need to evolve and adapt to the threat, and to the different operators we will engage with over the next couple of years. Collectively, we all appreciate that flexibility.
Dr Allison Gardner (Stoke-on-Trent South) (Lab)
Q
The ICO is a horizontal regulator working across all sectors. In your experience, would a single cyber regulator be a good idea? What would be the benefits and the challenges? I will allow Ofcom and Ofgem to jump in and defend themselves.
Ian Hulme: I suppose the challenge with having a single regulator is that—like ourselves, as a whole-economy regulator—it will have to prioritise and direct its resources at the issues of highest harm and risk. One benefit of a sectoral approach is that we understand our sectors at a deeper level; we certainly work together quite closely on a whole range of issues, and my teams have been working with Natalie and Stuart’s teams on the Bill over the last 18 months, and thinking about how we can collaborate better and co-ordinate our activities. It is really pleasing to see that that has been recognised in the Bill with the provisions for information sharing. That is going to be key, because the lack of information-sharing provisions in the current regs has been a bit of a hindrance. There are pros and cons, but a single regulator will need to prioritise its resources, so you may not get the coverage you might with a sectoral approach.
Natalie Black: Having worked in this area for quite some time, I would add that the challenge with a single regulator is that you end up with a race to the bottom, and minimum standards you can apply everywhere. However, with a tailored approach, you can recognise the complexity of the cyber risk and the opportunity to target specific issues—for example, prepositioning and ransomware. That said, we absolutely recognise the challenge for operators and companies in having to bounce between regulators. We hear it all the time, and you will see a real commitment from us to do something about it.
Some of that needs to sit with the Department for Science, Innovation and Technology, which is getting a lot of feedback from all of us about how we need it to co-ordinate and make things as easy as possible for companies—many of which are important investors in our economy, and we absolutely recognise that. We are also doing our bit through the UK Regulators Network and the Digital Regulation Cooperation Forum to find the low-hanging fruit where we can make a difference. To give a tangible example, we think there should be a way to do single reporting of incidents. We do not have the answer for that yet, but that is something we are exploring to try and make companies’ lives easier. To be honest, it will make our lives easier as well, because it wastes our time having to co-ordinate across multiple operators.
Bradley Thomas (Bromsgrove) (Con)
Q
Ian Hulme: Again, to contrast the ICO’s position with that of other colleagues, we have a much larger sector, as it currently exists, and we will have a massively larger sector again in the future. We are also funded slightly differently. The ICO is grant in aid funded from Government, so we are dependent on Government support.
Moving from a reactive footing, which is our position at the moment—that is the Government’s guidance to competent authorities and to the ICO specifically—to a proactive footing with a much expanded sector will need a significant uplift in our skills and capability, as well as system development in order to register and ingest intelligence from MSPs and relevant digital service providers in the future.
From our perspective at the ICO, we need significant support from DSIT so that we can transition into the new regulatory regime. It will ultimately be self-funding—it is a sustainable model—but we need continued support during the transition period.
Sarah Russell
Q
Professor John Child: Yes. It is not the easiest criminal law tale, if you like. If there were a problem of overcriminalisation in the sense of prosecutions, penalisation, high sentences and so on, the solution would be to look at a whole range of options, including prosecutorial discretion, sentencing or whatever it might be, to try to solve that problem. That is not the problem under the status quo. The current problem is purely the original point of criminalisation. Think of an industry carrying out potentially criminalised activity. Even if no one is going to be prosecuted, the chilling effect is that either the work is not done or it is done under the veil of potential criminalisation, which leads to pretty obvious problems in terms of insurance for that kind of industry, the professionalisation of the industry and making sure that reporting mechanisms are accurate.
We have sat through many meetings with the CPS and those within the cyber-security industry who say that the channels of communication—that back and forth of reporting—are vital. However, a necessary step before that communication can happen is the decriminalisation of basic practices. No industry can effectively be told on the one hand, “What you are doing is vital,” but on the other, “It is a criminal offence, and we would like you to document it and report it to us in an itemised fashion over a period of time.” It is just not a realistic relationship to engender.
The cyber-security industry has evolved in a fragmented way both nationally and internationally, and the only way to get those professionalisation and cyber-resilience pay-offs is by recognising that the criminal law is a barrier—not because it is prosecuting or sentencing, but because of its very existence. It does not allow individuals to say, “If, heaven forbid, I were prosecuted, I can explain that what I was doing was nationally important. That is the basis on which I should not be convicted, not because of the good will of a prosecutor.”
Dr Gardner
Q
Professor John Child: I think the Bill does a lot of things quite effectively. It modernises in a sensible way and it allows for the recognition of change in type of threat. This goes back to my criminalisation point. Crucially, it also allows modernisation and flexibility to move through into secondary legislation, rather than us relying purely on the maturations of primary legislation.
In terms of board-level responsibility, I cannot speak too authoritatively on the civil law aspects, but drawing on my criminal law background, there is something in that as well. At the moment, the potential for criminalisation applies very much to those making unauthorised access to another person’s system. That is the way the criminal law works. We also have potential for corporate liability that can lead all the way up to board rooms, but only if you have a directing mind—so only if a board member is directing that specific activity, which is unlikely, apart from in very small companies.
You can have a legal regime that says, whether through accreditation or simple public interest offences, that there are certain activities that involve unauthorised access to another person’s system, which may be legitimate or indeed necessary. However, we want a professional culture within that; we do not want that outsourced to individuals around the world. You can then build in sensible corporate liability based on consent or connivance, which goes to individuals in the boardroom, or a failure-to-prevent model of criminalisation, which is more popular when it comes to financial crimes. That is where you say, “If this exists in your sector, as an industry and as a company, you can be potentially liable as an entity if you do not make sure these powers are used responsibly, and if you essentially outsource to individuals in order to avoid personal liabilities”.
Dr Gardner
Q
Professor John Child: Again, I have to draw back to the criminal law aspects. I think the Bill does the things it needs to do well; certainly, from the conversations I have had with those in cyber-security and so on, these are welcome steps in the right direction.
However, when you look at critical national infrastructure, although you can create layers of civil responsibility and regulation—which is entirely sensible—most of that will filter down to individuals doing cyber-security and resilience work. It is about empowering those individuals; within a state apparatus, that is one thing, but even with regulators and in-house cyber-security experts, individuals are working only within the confines of what they are allowed to do under the criminal law, as well as the civil regulatory system.
The reason I have been asked here, and what a lot of my work has focused on, is this: if you filter responsibility down to individuals doing security work for national as well as commercial infrastructure, you need to empower them to do that work effectively. The current law does not do that; it creates the problem of either doing that work under the veil of criminalisation, or not doing it, with work being outsourced to places where you do not have the back-and-forth communication and reporting regime you would need.
Dr Gardner
I think you are touching on the old problem of where liability lies when you have this long supply chain of diffused responsibility, but thank you.
Dave Robertson
Q
Professor John Child: That is a good question. It is certainly fair to say that all jurisdictions are somewhat in flux about how to deal with cyber threats, which are mushrooming in ways people would not have expected—certainly not in 1990, but even many years after.
The various international conventions—the OECD, the Budapest convention and so on—require regulation and criminalisation, but those are not nearly as wide as the blanket approach that was taken in this country. Some comparative civil law jurisdictions in the rest of Europe start from a slightly different place, in that they did not necessarily take the maximalist approach to criminalisation we did.
In a number of jurisdictions, you do not have direct criminalisation of all activities, regardless of the intention of the actor, in the same way that we do. So we are starting from a slightly different position. Having said that, we do see a number of jurisdictions making positive strides in this direction, because they need to; indeed, we see that at European Union level as well, where directives are being created to target this area of concern.
There are a few examples. We wrote a comparative report, incidentally, which is openly available. In terms of some highlights from that, there is a provision in French law, for example, where, despite mandatory prosecution being the general model within French criminal law, there is a carve-out relating to cyber-security and legitimate actors, where there is not the same requirement to prosecute. In the Netherlands, there was a scandal around hacking of keycards for public transport. That was done for responsible reasons, and there was a backlash in relation to prosecution there. There were measures taken in terms of prosecutorial discretion. Most recently, in Portugal, we saw a specific cyber-security defence created within the criminal law just last year.
In the US, it varies between states. In a lot of states, you have quite an unhelpful debate between minimalist and maximalist positions, where they either want to have complete hack-back on the one hand or no action at all on the other, but you have a slightly more tolerant regime in terms of prosecution.
So there are varying degrees, but certainly that is the direction of travel. For sensible, criminal law reasons that I would speak to, as well as the commercial benefits that come with a sector that is allowed to do its work properly, and the security benefits, that is certainly the direction of travel.
David Chadwick
Q
Richard Starnes: On what you say about the 18-month tenure, one of the problems is stress. A lot of CISOs are burning out and moving to companies that they consider to have boards that are more receptive to what they do for a living. Some companies get it. Some companies support the CISOs, and maybe have them reporting in parallel to the CIO, or chief information officer. A big discussion among CISOs is that having a CISO reporting to a CIO is a conflict of interest. A CISO is essentially a governance position, so you wind up having to govern your boss, which I would submit is a bit of a challenge.
How do we help CISOs? First, with stringent application of regulatory instruments. We should also look at or discuss the idea of having C-level or board-level executives specifically liable for not doing proper risk governance of cyber-security—that is something that I think needs to be discussed. Section 172 of the Companies Act 2006 states that you must act in the best interests of your company. In this day and age, I would submit that not addressing cyber-risk is a direct attack on your bottom line.
Dr Gardner
Q
Richard Starnes: I think this should flow from the board to the C-level executives. Most boards have a risk committee of some sort, and I think the chair of the risk committee would be a natural place for that responsibility to sit, but there has to be somebody who is ultimately responsible. If the board does not take it seriously, the C-levels will not, and if the C-levels will not, the rest of the company will not.
Dr Gardner
Q
Richard Starnes: That is a very broad question.
Dr Gardner
I know, sorry. I collapsed it down from quite a few.
Richard Starnes: There is any number of different reasons. You have 12 competent authorities, at last count, with varying funding models and access to talent. Those could vary quite a bit, depending on those factors. I am not really sure how to answer that question.
Dr Gardner
Q
Richard Starnes: True, but I would submit that under the Companies Act that liability is already there for all the directors; it just has not been used that way.
Emily Darlington
Q
Richard Starnes: You just stepped on one of my soapbox issues. I would like to see the code of practice become part of the annual Companies House registrations for every registered company. To me, this is an attestation that, “We understand cyber-security, we’ve had it put in front of us, and we have to address it in some way.”
One of the biggest problems, which Andy talked about earlier, is that we have all these wonderful things that the Government are doing with regard to cyber-security, down to the micro-level companies, but there are 5.5 million companies in the United Kingdom that are not enterprise-level companies, and the vast majority of them have 25 employees or fewer. How do we get to these people and say, “This is important. You need to look at this”? This is a societal issue. The code of practice and having it registered through Companies House are the way to do that. We need to start small and move big. Only 3% of businesses are involved in Cyber Essentials, which is just that: the essentials. It is the baseline, so we need to start there.
Chris Vince
Q
Stewart Whyte: Anything that increases or improves our processes in the NHS for a lot of the procured services that we take in, and anything that is going to strengthen the framework between the health board or health service and the suppliers, is welcome for me. One of our problems in the NHS is that the systems we put in are becoming more and more complex. Being able to risk assess them against a particular framework would certainly help from our perspective. A lot of our suppliers, and a lot of our systems and processes, are procured from elsewhere, so we are looking for anything at all within the health service that will improve the process and the links with third party service providers.
Dr Gardner
Q
Brian Miller: That is a great question. I will touch on some different parts, because I might have slightly different information from some of the information you have heard previously. On reporting—Stewart will deal with the data protection element for reporting into the Information Commissioner’s Office—we report to the Scottish Health Competent Authority. It is important that we have an excellent relationship with the people there. To put that in context, I was speaking to them yesterday regarding our transition to the CAF, as part of our new compliance for NHS Greater Glasgow and Clyde. If there was a reportable incident, we would report into the SHCA. The thresholds are really well defined against the confidentiality, integrity and availability triad—it will be patient impact and stuff like that.
Organisationally, we report up the chain to our director of digital services, and we have an information governance steering group. Our senior information risk officer is the director of digital, and the chief information security officer role sits with our director of digital. We report nationally, and we work really closely with National Services Scotland’s Cyber Security Centre of Excellence, which does a lot of our threat protection and secure operations, 24/7, 365 days a year. We work with the Scottish Government through the Scottish Cyber Co-ordination Centre and what are called CREW—cyber resilience early warning—notices for a lot of threat intelligence. If something met the threshold, we would report to the SHCA. Stewart, do you want to come in on the data protection officer?
Stewart Whyte: We would report to the Information Commissioner, and within 72 hours we also report to the Scottish Government information governance and data protection team. We would risk assess the breaches and determine whether they meet the threshold for reporting. Not every data breach is required to be reported.
From the reporting perspective, it would be helpful to report into one individual organisation. I noticed that in the reporting requirements we are looking at doing it within 24 hours, which could be quite difficult, because sometimes we do not know everything about the breach within that time. We might need more information to be able to risk assess it appropriately. Making regulators aware of the breach as soon as possible is always going to be a good thing.
Lincoln Jopp
Q
Brian Miller: We would work with the Scottish Health Competent Authority as our regulator; I cannot speak for other regulators and what that might look like. We are doing work on what assurance for critical suppliers outside the Bill looks like just now, and we are working across the boards in Scotland on identifying critical suppliers. Outside of that, for any suppliers or any new services, we will assess the risk individually, based on the services they are providing.
The Bill is really valuable for me, particularly when it comes to managed service provision. One of the questions I was looking at is: what has changed since 2018? The biggest change for me is that identity has gone to the cloud, because of video conferencing and stuff like that. When identity went to the cloud, it then involved managed service providers and data centres. We have put additional controls around that, because the network perimeter extended out into the cloud. We might want to take advantage of those controls for new things that come online, integrating with national identity, but we need to be assured that the companies integrating with national identity are safe. For me, the Bill will be a terrific bit of legislation that will help me with that—if that makes sense.
Dr Gardner
Q
Carla Baker: My comment on information sharing was about what else the Government could do. It was not necessarily specifically to do with the Bill. If you want me to elaborate on the wider issue of information sharing, I am happy to.
Dr Gardner
Particularly between regulators, and how that would work.
Carla Baker: I cannot necessarily talk in much detail about information sharing across regulators. It is more about information sharing across the technology industry that I can talk about.
Dr Gardner
Q
I will ask my actual question, and I am trying to get my head around this. You recommend mandating that company boards be accountable for mitigating cyber-risks, and as we know from the annual cyber-security breaches survey, there are declining levels of board responsibility for cyber in recent years, which links to whether there should be a statutory duty. I am a little worried about small and microbusinesses having to deal with that regulatory burden, especially if they are designated as critical suppliers. I am trying to marry those two things together, and the concern of where liability sits, because we are very dependent on service providers. I do not know if that makes any sense to you, but could you clarify my thinking?
Chris Parker: It is a concern. I will start off with a small point about where there is a statutory requirement, certainly for large companies. I personally believe—and I am pretty sure that most industry people I speak to would say this—that it would be very surprising if we did not have cyber-focused people on boards and in much bigger governance, as we would in a financial services company, where people who are expert in financial risk are able to govern appropriately. As we get smaller and smaller in scale, that is much harder to do.
The good news is that there are some brilliant—and I really mean that—resources available from probably the most underused website in the world, but the best one, which is the National Cyber Security Centre website. It has some outstanding advice for boards and governance on there. You can effectively make a pack and write a checklist, even if you are a very small company with a board of two people, and go through your own things and make sure your checklists are there.
The data and the capability are there to give support. Whether it is signposted enough, and whether we are helping on a local level to make sure that people are aware of those things, is perhaps something we could do better at in this country. But I am sure that industry will do our part, and we do, to share and reinforce the good sharing of things like that website, to guide good governance for SMEs especially.
Carla Baker: That board-level accountability is really important, and it is crucial for cyber-security. I think it is getting better—from the senior execs that I speak to in industry, there is more understanding—but generally speaking, there is a view that cyber-security is an IT issue, not a business issue. I am sure you have heard throughout the day about understanding the risks we have seen around vulnerabilities, and the incidents that have affected the retail or manufacturing sectors. Those are substantial incidents that have impacted the UK and have systemic knock-on effects. Organisations have to understand the serious nature of cyber-security, and therefore put more emphasis on cyber at the board level.
Should we be mandating board-level governance? That is useful for this debate to seek information and input on, but the burden on SMEs has to be risk-based and proportionate, however it is framed.
Dr Gardner
Q
Chris Parker: That is a harder question. There is precedent here—of course, we can think back to the precedents that this great building has set, such as, after the Clapham train disaster, the Corporate Manslaughter and Corporate Homicide Act 2007, which put responsibility very firmly on boards, evolving from the Health and Safety at Work etc. Act 1974. We are not there yet, but do not forget that we are starting to legislate, as is everyone else in Europe and America who are on this journey.
I believe that we will see a requirement at some point in the future. We all hope that the requirement is not driven by something terrible, but is driven by sensible, wise methodology through which we can find out how we can ensure that people are liable and accept their liability. We have seen statements stood up on health and safety from CEOs at every office in this country, for good reason, and that sort of evolution may well be the next phase.
Carla and I talk about this a lot, but we have to be careful about how much we put into this Bill. We have to get the important bit about critical national infrastructure under way, and then we can address it all collaboratively at the next stage to deal with very important issues such as that.
Lincoln Jopp
Q
Chris Parker: I was referring to strategic and critical suppliers, which is a list of Government suppliers. Our point is that achieving that level of governance and regulatory requirement inside an organisation is difficult, and it really is. It requires quite a lot of work and resource, and if we are putting that on to too small a supplier, on the basis that we think it is on the critical path, I would advocate a different system for risk management of that organisation, rather than it being in the regulatory scope of a cyber-resilience Bill. The critical suppliers should be the larger companies. If we start that way in legislation and then work down—the Bill is designed to be flexible, which is excellent—we can get there.
As a last point on flexibility—this is perhaps very obvious to us but less so to people who are less aware of the Bill—there is a huge dynamic going on here where you have a continuum, a line, at one end of which you have the need for clarity, which comes from business. At the other you have a need for flexibility, which quite rightly comes from the Government, who want to adjust and adapt quite quickly to secure the population, society and the economy against a changing threat. That continuum has an opposing dynamic, so the CRB has a big challenge. We must therefore not be too hard on ourselves in finding exactly where to be on that line. Some things will go well, and some will just need to be looked at after a few years of practice—I really believe that. We are not going to get it all right, because of the complexities and different dynamics along that line.
Carla Baker: This debate about whether SMEs should be involved or regulated in this space has been around since we were discussing GDPR back in 2018. It comes down to the systemic nature of the supplier. You can look at the designation of critical dependencies. I am sure you have talked about this, but for example, an SME software company selling to an energy company could be deemed a critical supplier by a regulator, and it is then brought into scope. However, I think it should be the SMEs that are relevant to the whole sector, not just to one organisation. If they are systemic and integral to a number of different sectors, or a number of different organisations within a sector, it is fair enough that they are potentially brought into scope.
It is that risk-based approach again. But if it is just one supplier, one SME, that is selling to one energy company up in the north of England, is it risk-based and proportionate that they are brought into scope? I think that is debatable.
Dr Gardner
Q
Kanishka Narayan: It is an important point. We know that the quality of current regulation for cyber-security varies across regulators. As an earlier panellist said, there is virtue in the fact that we have not set an effective cap on where regulators can go by having a single standard. At the same time, we need to make sure that we are raising a consistent floor of quality and proportionality judgments.
First, there is obviously constant oversight of each regulator through the lead Departments. In my case, for example, we consistently engage with Ofcom on a range of areas, including this one, to ensure the quality of regulation and that proportionality judgment is appropriately applied. Secondly, there is a clear commitment in the Bill for the Secretary of State to report back, on a five-year basis, on the overall implementation of the regime proposed in the Bill. That will be when we can get a global view of how the whole system is working.
The Chair
That brings us to the end of the time allotted for the Committee to ask questions, and to the end of the sitting. On behalf of the Committee, I thank the Minister for his evidence.
Ordered, That further consideration be now adjourned. —(Taiwo Owatemi.)