Data Protection and Digital Information (No. 2) Bill

Monday 17th April 2023
Commons Chamber
Second Reading
18:06
The Minister for Data and Digital Infrastructure (Julia Lopez)

I beg to move, That the Bill be now read a Second time.

Data is already the fuel driving the digital age: it powers the everyday apps that we use; public services are being improved by its better use; and businesses rely on it to trade, produce goods and deliver services for their customers. But how we choose to use data going forward will become even more important: it will determine whether we can grow an innovative economy with well-paid, high-skill jobs; it will shape our ability to compete globally in developing the technologies of the future; and it will increasingly say something about the nature of our democratic society. The great challenge for democracies, as I see it, will be how to use data to empower rather than control citizens, enhancing their privacy and sense of agency, without letting authoritarian states—which, in contrast, use data as a tool to monitor and harvest information from citizens—dominate technological advancement and gain a competitive advantage over our companies.

The UK cannot step aside from the debate by simply rubber-stamping whatever iteration of the GDPR comes out of Brussels. We have in our hands a critical opportunity to take a new path and, in doing so, to lead the global conversation about how we can best use data as a force for good—a conversation in which using data more effectively and maintaining high data protection standards are seen not as contradictory but as mutually reinforcing objectives, because trust in this more effective system will build the confidence to share information. We start today not by kicking off a revolution, turning over the apple cart and causing a compliance headache for UK firms, but by beginning an evolution away from an inflexible one-size-fits-all regime and towards one that is risk-based and focused on innovation, flexibility and the needs of our citizens, scientists, public services and companies.

Businesses need data to make better decisions and to reach the right consumers. Researchers need data to discover new treatments. Hospitals need it to deliver more personalised patient care. Our police and security services need data to keep our people safe. Right now, our rules are too vague, too complex and often too confusing to understand. The GDPR is a good standard, but it is not the gold standard. People are struggling to use data to innovate, because they are tied up in burdensome activities that do little to enhance privacy.

A recently published report on compliance found that 81% of European publishers were unknowingly in breach of the GDPR, despite doing what they thought the law required of them. A YouGov poll from this year found that one in five marketing professionals in the UK report knowing absolutely nothing about the GDPR, despite being bound by it. It is not just businesses: the people whose privacy our laws are supposed to protect do not understand it either. Instead, they click away the thicket of cookie pop-ups just so they can see their screen.

The Bill will maintain the high standards of data protection that British people rightly expect, but it will also help the people who are most affected by data regulation, because we have co-designed it with those people to ensure that our regulation reflects the way in which real people live their lives and run their businesses.

Christine Jardine (Edinburgh West) (LD)

Does the Minister agree that the retention and enhancement of public trust in data is a major issue, that sharing data is a major issue for the public, and that the Government must do more—perhaps she can tell us whether they intend to do more—to educate the public about how and where our data is used, and what powers individuals have to find out this information?

Julia Lopez

I thank the hon. Lady for her helpful intervention. She is right: as I said earlier, trust in the system is fundamental to whether citizens have the confidence to share their data and whether we can therefore make use of that data. She made a good point about educating people, and I hope that this debate will mark the start of an important public conversation about how people use data. One of the challenges we face is a complex framework which means that people do not even know how to talk about data, and I think that some of the simplifications we wish to introduce will help us to understand one of the fundamental principles to which we want our new regime to adhere.

Sir Julian Lewis (New Forest East) (Con)

My hon. Friend gave a long list of people who found the rules we had inherited from outside the UK challenging. She might add to that list Members of Parliament themselves. I am sure I am not alone in having been exasperated by being complained about to the Information Commissioner, in this case by a constituent who had written to me complaining about a local parish council. When I shared his letter with the parish council so that it could show how bogus his long-running complaint had been, he proceeded to file a complaint with the Information Commissioner’s Office because I had shared his phone number—which he had not marked as private—with the parish council, with which he had been in correspondence for several years. The Information Commissioner’s Office took that seriously. This sort of nonsense shows how over-restrictive regulations can be abused by people who are out to stir up trouble unjustifiably.

Julia Lopez

Let me gently say that if my right hon. Friend’s constituent was going to pick on one Member of Parliament with whom to raise this point, the Member of Parliament who does not, I understand, use emails would be one of the worst candidates. However, I entirely understand Members’ frustration about the current rules. We are looking into what we can do in relation to democratic engagement, because, as my right hon. Friend says, this is one of the areas in which there is not enough clarity about what can and cannot be done.

We want to reduce burdens on businesses, above all on the small businesses that account for more than 99% of UK firms. I am pleased that the Under-Secretary of State for Business and Trade, my hon. Friend the Member for Thirsk and Malton (Kevin Hollinrake), is present to back up those proposals. Businesses that do not have the time, the money or the staff to spend precious hours doing unnecessary form-filling are currently being forced to follow some of the same rules as a billion-dollar technology company. We are therefore cutting the amount of pointless paperwork, ensuring that organisations have to comply with rules on record-keeping and risk assessment only when their processing activities are high-risk. We are getting rid of excessively demanding requirements to appoint data protection officers, giving small businesses much more flexibility in how they manage data protection risks without procuring external resources.

Those changes will not just make the process simpler, clearer and easier for businesses; they will make it cheaper too. We expect micro and small businesses to save nearly £90 million in compliance costs every year: that is £90 million more for higher investment, faster growth and better jobs. According to figures published in 2021, data-driven trade already generates 85% of our services exports. Our new international transfers regime clarifies how we can build data bridges to support the close, free and safe exchange of data with other trusted allies.

John Penrose (Weston-super-Mare) (Con)

I am delighted to hear the Secretary of State talk about reducing regulatory burdens without compromising the standards that we are none the less delivering—that is the central distinction, and greatly to be welcomed for its benefits for the entrepreneurialism and fleetness of foot of British industry. Does she agree, however, that while the part of the Bill that deals with open data, or smart data, goes further than that and creates fresh opportunities for, in particular, the small challenger businesses of the kind she has described to take on the big incumbents that own the data lakes in many sectors, those possibilities will be greatly reduced if we take our time and move too slowly? Could it not potentially take 18 months to two years for us to start opening up those other sectors of our economy?

Julia Lopez

I am delighted, in turn, to hear my hon. Friend call me the Secretary of State—I am grateful for the promotion, even if it is not a reality. I know how passionate he feels about open data, which is a subject we have discussed before. As I said earlier, I am pleased that the Under-Secretary of State for Business and Trade is present, because this morning he announced that a new council will be driving forward this work. As my hon. Friend knows, this is not necessarily about legislation being in place—I think the Bill gives him what he wants—but about that sense of momentum, and about onboarding new sectors into this regime and not being slow in doing so. As he says, a great deal of economic benefit can be gained from this, and we do not want it to be delayed any further.

Kit Malthouse (North West Hampshire) (Con)

Let me first draw attention to my entry in the Register of Members’ Financial Interests. Let me also apologise for missing the Minister’s opening remarks—I was taken by surprise by the shortness of the preceding statement and had to rush to the Chamber.

May I take the Minister back to the subject of compliance costs? I understand that the projected simplification will result in a reduction in those costs, but does she acknowledge that a new regime, or changes to the current regime, will kick off an enormous retraining exercise for businesses, many of which have already been through that process recently and reached a settled state of understanding of how they should be managing data? Even a modest amount of tinkering instils a sense among British businesses, particularly small businesses, that they must put everyone back through the system, at enormous cost. Unless the Minister is very careful and very clear about the changes being made, she will create a whole new industry for the next two or three years, as every data controller in a small business—often doing this part time alongside their main job—has to be retrained.

Julia Lopez

We have been very cognisant of that risk in developing our proposals. As I said in my opening remarks, we do not wish to upset the apple cart and create a compliance headache for businesses, which would be entirely contrary to the aims of the Bill. A small business that is currently compliant with the GDPR will continue to be compliant under the new regime. However, we want to give businesses flexibility in regard to how they deliver that compliance, so that, for instance, they do not have to employ a data protection officer.

Ben Lake (Ceredigion) (PC)

I am grateful to the Minister for being so generous with her time. May I ask whether the Government intend to maintain data adequacy with the EU? I only ask because I have been contacted by some business owners who are concerned about the possible loss of EU data adequacy and the cost that might be levied on them as a result.

Julia Lopez

I thank the hon. Gentleman for pressing me on that important point. I know that many businesses are seeking to maintain adequacy. If we want a business-friendly regime, we do not want to create regulatory disruption for businesses, particularly those that trade with Europe and want to ensure that there is a free flow of data. I can reassure him that we have been in constant contact with the European Commission about our proposals. We want to make sure that there are no surprises. We are currently adequate, and we believe that we will maintain adequacy following the enactment of the Bill.

Rebecca Long Bailey (Salford and Eccles) (Lab)

I was concerned to hear from the British Medical Association that if the EU were to conclude that data protection legislation in the UK was inadequate, that would present a significant problem for organisations conducting medical research in the UK. Given that so many amazing medical researchers across the UK currently work in collaboration with EU counterparts, can the Minister assure the House that the Bill will not represent an inadequacy in comparison with EU legislation as it stands?

Julia Lopez

I hope that my previous reply reassured the hon. Lady that we intend to maintain adequacy, and we do not consider that the Bill will present a risk in that regard. What we are trying to do, particularly in respect of medical research, is make it easier for scientists to innovate and conduct that research without constantly having to return for consent when it is apparent that consent has already been granted for particular medical data processing activities. We think that will help us to maintain our world-leading position as a scientific research powerhouse.

Alongside new data bridges, the Secretary of State will be able to recognise new transfer mechanisms that businesses can use to protect international transfers. Businesses will still be able to transfer data across borders with the compliance mechanisms that they already use, avoiding needless checks and costs. We are also delighted to be co-hosting, in partnership with the United States, the next workshop of the global cross-border privacy rules forum in London this week. The CBPR system is one of the few existing operational mechanisms that, by design, aims to facilitate data flows on a global scale.

World-class research requires world-class data, but right now many scientists are reluctant to get the data they need to get on with their research, for the simple reason that they do not know how research is defined. They can also be stopped in their tracks if they try to broaden their research or follow a new and potentially interesting avenue. When that happens, they can be required to go back and seek permission all over again, even though permission to use the personal data has already been granted. We do not think that makes sense. The pandemic showed that we cannot risk delaying discoveries that could save lives. Nothing should be holding us back from curing cancer, tackling disease or producing new drugs and treatments. This Bill will simplify the legal requirements around research so that scientists can work to their strengths with legal clarity on what they can and cannot do.

The Bill will also ensure that people benefit from the results of research by unlocking the potential of transformative technologies. Taking artificial intelligence as an example, we have recently published our White Paper: “AI regulation: a pro-innovation approach”. In the meantime, the Bill will ensure that organisations know when they can use responsible automated decision making and that people know when they can request human intervention where those decisions impact their lives, whether that means getting a fair price for the insurance they receive after an accident or a fair chance of getting the job they have always wanted.

I spoke earlier about the currency of trust and how, by maintaining it through high data protection standards, we are likely to see more data sharing, not less. Fundamental to that trust will be confidence in the robustness of the regulator. We already have a world-leading independent regulator in the Information Commissioner’s Office, but the ICO needs to adapt to reflect the greater role that data now plays in our lives, alongside its strategic importance to our economic competitiveness. The ICO was set up in the 1980s for a completely different world, and the pace, volume and power of the data we use today have changed dramatically since then.

It is only right that we give the regulator the tools it needs to keep pace and to keep our personal data safe while ensuring that, as an organisation, it remains accountable, flexible and fit for the modern world. The Bill will modernise the structure and objectives of the ICO. Under this legislation, protecting our personal data will remain the ICO’s primary focus, but it will also be asked to focus on how it can empower businesses and organisations to drive growth and innovation across the UK, and support public trust and confidence in the use of personal data.

The Bill is also important for consumers, helping them to share less data while getting more product. It will support smart data schemes that empower consumers and small businesses to make better use of their own data, building on the extraordinary success of open banking tools offered by innovative businesses, which help consumers and businesses to manage their finances and spending, track their carbon footprint and access credit.

Jim Shannon (Strangford) (DUP)

The Minister always delivers a very solid message and we all appreciate that. In relation to the high data protection standards that she is outlining, there is also a balance to be achieved when it comes to ensuring that there are no unnecessary barriers for individuals and businesses. Can she assure the House that that will be exactly what happens?

Julia Lopez

I am always happy to take an intervention from the hon. Member. I want to assure him that we are building high data protection standards on the fundamental principles of the GDPR, and we are trying to strike the right balance between maintaining high data protection standards that protect the consumer and giving businesses the flexibility they need. I will continue this conversation with him as the Bill passes through the House.

Mike Amesbury (Weaver Vale) (Lab)

I thank the Minister for being so generous with her time. With regard to the independent commissioner, the regulator, who will set the terms of reference? Will it be genuinely independent? It seems to me that a lot of power will fall on the shoulders of the Secretary of State, whoever that might be in the not-too-distant future.

Julia Lopez

The Secretary of State will have greater powers when it comes to some of the statutory codes that the ICO adheres to, but those powers will be brought to this House for its consent. The whole idea is to make the ICO much more democratically accountable. I know that concern about the independence of the regulator has been raised as we have been working up these proposals, but I wish to assure the House that we do not believe those concerns to be justified or legitimate. The Bill actually has the strong support of the current Information Commissioner, John Edwards.

The Bill will also put in place the foundations for data intermediaries, which are organisations that can help us to benefit from our data. In effect, we will be able to share less sensitive data about ourselves with businesses while securing greater benefits. As I say, open banking is one example of this. Another way in which the Bill will help people to take back control of their data is by making it easier and more secure for people to prove things about themselves once, electronically, without having to dig out stacks of physical documents such as passports, bills, statements and birth certificates and then having to provide lots of copies of those documents to different organisations. Digital verification services already exist, but by creating a set of standards around them we want consumers to be able to identify trustworthy providers.

The Bill is designed not just to boost businesses, support scientists and deliver consumer benefits; it also contains measures to keep people healthy and safe. It will improve the way in which the NHS and adult social care organise data to deliver crucial health services. It will let the police get on with their jobs by allowing them to spend more time on the beat rather than on pointless paperwork. We believe that this will save up to 1.5 million hours of police time each year—

Julia Lopez

I know that my hon. Friend has been passionate on this point, and we are looking actively into her proposals.

We are also updating the outdated system of registering births and deaths, which is based on paper processes from the 19th century.

Data has become absolutely critical for keeping us healthy, for keeping us safe and for growing an economy with innovative businesses, providing jobs for generations to come. Britain is at its best when its businesses and scientists are at theirs. Right now, our rules risk holding them back, but this Bill will change that because it was co-designed with those businesses and scientists and with the help of consumer groups. Simpler, easier, clearer regulation gives the people using data to improve our lives the certainty they need to get on with their jobs. It maintains high standards for protecting people’s privacy while seeking to maintain our adequacy with the EU. Overall, this legislation will make data more useful for more people and more usable by businesses, and it will enable greater innovation by scientists. I commend the Bill to the House.

18:26
Lucy Powell (Manchester Central) (Lab/Co-op)

It is good finally to get the data Bill that was promised so long ago. We nearly got there in the halcyon days of September 2022, under the last Prime Minister, after it had been promised by the Prime Minister before. However, the Minister has a strong record of bringing forward and delivering things that the Government have long promised. I also know that she has another special delivery coming soon, which I very much welcome and wish her all the best with. She took a lot of interventions and I commend her for all that bobbing up and down while so heavily pregnant. I would also like to send my best wishes to the Secretary of State, who let me know that she could not be here today. I would also like to wish her well with her imminent arrival. There is lots of delivery going on today.

We are in the midst of a digital and data revolution, with data increasingly being the most prized asset and fundamental to the digital age, but this Bill, for all its hype, fails to meet that moment. Even since the Bill first appeared on the Order Paper last September, AI chatbots have become mainstream, TikTok has been fined for data breaches and banned from Government devices, and AI image generators have fooled the world into thinking that the Pope had a special papal puffer coat. The world, the economy, public services and the way we live and communicate are changing fast. Despite these revolutions, this data Bill does not rise to the challenges. Instead, it tweaks around the edges of GDPR, making an already dense set of privacy rules even more complex.

The UK can be a global leader in the technologies of the future. We are a scientific superpower, we have some of the world’s best creative industries and now, outside the two big trading blocs, we could have the opportunities of nimbleness and being in the vanguard of world-leading regulation. In order to harness that potential, however, we need a Government who are on the pitch, setting the rules of the game and ensuring that the benefits of new advances are felt by all of us and not just by a handful of companies. The Prime Minister can tell us again how much he loves maths, but without taking the necessary steps to support the data and digital economy, his sums just do not add up.

The contents of this Bill might seem technical—as drafted, they are incredibly technical—but they matter greatly to every business, consumer, citizen and organisation. Data is a significant source of power and value. It shapes the relationship between business and consumers, between the state and citizens, and much, much more. Data is critical to innovation and economic growth, to modern public services, to democratic accountability and to transforming societies, if harnessed and shaped in the interest of the many, not simply the few—pretty major, I would say.

Now we have left the EU, the UK has an opportunity to lead the world in this area. The next generation of world-leading regulation could allow small businesses and start-ups to compete with the monopolies in big tech, as we have already heard. It could foster a climate of open data, enable public services to use and share data for improved outcomes, and empower consumers and workers to have control over how their data is used. In the face of this huge challenge, the Bill is at best a missed opportunity, and at worst adds another complicated and uncertain layer of bureaucracy. Although we do not disagree with its aims, there are serious questions about whether the Bill will, in practice, achieve them.

Data reform and new regulation are welcome and long overdue. Now that we have left the EU, we need new legislation to ensure that we both keep pace with new developments and make the most of the opportunities. The Government listened to some of the concerns raised in response to the consultation and removed most of the controversial and damaging proposals. GDPR has been hard to follow for some businesses, especially small businesses and start-ups, so streamlining and simplifying data protection rules is a welcome aim. However, we will still need to keep some of those rules in order to meet EU data adequacy requirements.

The aim of shifting away from tick-box exercises towards a more proactive and systematic approach to regulation is also good. Better and easier data sharing between public services is essential, and some of the changes in that area are welcome, although we will need assurances that private companies will not benefit commercially from personal health data without people’s say so. Finally, nobody likes nuisance calls or constant cookie banners, and the moves to reduce or remove them are welcome, although there are questions about whether the Bill lives up to the rhetoric.

In many areas, however, the Bill threatens to take us backwards. First, it may threaten our ability to share data with the EU, which would be seriously bad for business. Given the astronomical cost to British businesses should data adequacy with the EU be lost, businesses and others are rightly looking for more reassurances that the Bill will not threaten these arrangements. The EU has already said that the vast expansion of the Secretary of State’s powers, among other things, may put the agreement in doubt. If this were to come to pass, the additional burdens on any business operating within the EU, even vaguely, would be enormous.

British businesses, especially small businesses, have faced crisis after crisis. Many only just survived through covid and are now facing rising energy bills that threaten to push them over the edge. According to the Information Commissioner,

“most organisations we spoke to had a plea for continuity.”

The Government must go further on this.

Secondly, the complex new requirements in this 300-page Bill threaten to add more hurdles, rather than streamlining the process. Businesses have serious concerns that, having finally got their heads around GDPR, they will now have to comply with both GDPR and all the new regulations in this Bill. That is not cutting red tape, in my view.

Thirdly, the Bill undermines individual rights. Many of the areas in which the Bill moves away from GDPR threaten to reduce protection for citizens, making it harder to hold to account the big companies that process and sell our data. Subject access requests are being diluted, as the Government are handing more power to companies to refuse such requests on the grounds that they are excessive or vexatious. They are tilting the rules in favour of the companies that are processing our data. Data protection impact assessments will no longer be needed, and protections against automated decision making are being weakened.

Rebecca Long Bailey

AlgorithmWatch explains that automated decision making is “never neutral.” Outputs are determined by the quality of the data that is put into the system, whether that data is fair or biased. Machine learning will propagate and enhance those differences, and unfortunately it already has. Is my hon. Friend concerned that the Bill removes important GDPR safeguards that protect the public from algorithmic bias and discrimination and, worse, provides Henry VIII powers that will allow the Secretary of State to make sweeping regulations on whether meaningful human intervention is required at all in these systems?

Lucy Powell

My hon. Friend makes two very good points, and I agree with her on both. I will address both points in my speech.

Taken together, these changes, alongside the Secretary of State’s sweeping new powers, will tip the balance away from individuals and workers towards companies, which will be able to collect far more data for many more purposes. For example, the Bill could have a huge impact on workers’ rights. There are ever more ways of tracking workers, from algorithmic management to recruitment by AI. People are even being line managed by AI, with holiday allocation, the assignment of roles and the determination of performance being decided by algorithm. This is most serious when a low rating triggers discipline or dismissal. Transparency and accountability are particularly important given the power imbalance between some employers and workers, but the Bill threatens to undermine them.

If a person does not even know that surveillance or algorithms are being used to determine their performance, they cannot challenge it. If their privacy is being infringed to monitor their work, that is a harm in itself. If a worker’s data is being monetised by their company, they might not even know about it, let alone see a cut. The Bill, in its current form, undermines workers’ ability to find out what data is held about them and how it is being used. The Government should look at this again.

The main problem, however, is not what is in the Bill but, rather, what is not. Although privacy is, of course, a key issue in data regulation, it is not the only issue. Seeing regulation only through the lens of privacy can obscure all the ways that data can be used and can impact on communities. In modern data processing, our data is not only used to make decisions about us individually but pooled together to analyse trends and predict behaviours across a whole population. Using huge amounts of data, companies can predict and influence our behaviour. From Netflix recommendations to recent examples of surge pricing in music and sports ticketing, to the monitoring of covid outbreaks, the true power of data is in how it can be analysed and deployed. This means the impact as well as the potential harms of data are felt well beyond the individual level.

Moreover, as we heard from my hon. Friend the Member for Salford and Eccles (Rebecca Long Bailey), the algorithms that analyse data often replicate and further entrench society’s biases. Facial recognition that is trained on mostly white faces will more likely misidentify a black face—something that I know the parliamentary channel sometimes struggles with. AI language bots produce results that reflect the biases and limitations of their creators and the data on which they are trained. This Bill does not take on any of these community and societal harms. Who is responsible when the different ways of collecting and using data harm certain groups or society as a whole?

As well as the harms, data analytics offers huge opportunities for public good, as we have heard. Opening up data can ensure that scientists, public services, small businesses and citizens can use data to improve all our lives. For example, Greater Manchester has, over the years, linked data across a multitude of public services to hugely improve our early years services, but this was done entirely locally and in the face of huge barriers. Making systems and platforms interoperable could ensure that consumers can switch services to find the best deal, and it could support smaller businesses to compete with existing giants.

Establishing infrastructure such as a national research cloud and data trusts could help small businesses and not-for-profit organisations access data and compete with the giants. Citymapper is a great example, as it used Transport for London’s open data to build a competitor to Google Maps in London. Open approaches to data will also provide better oversight of how companies use algorithms, and of the impact on the rest of us.

Finally, where are the measures to boost public trust? After the debacle of the exam algorithms and the mishandling of GP data, which led millions of people to withdraw their consent, and with workers feeling the brunt but none of the benefits of surveillance and performance management, we are facing a crisis in public trust. Rather than increasing control over and participation in how our data is used, the Bill is removing even the narrow privacy-based protections we already have. In all those regards, it is a huge missed opportunity.

To conclude, with algorithms increasingly making important decisions about how we live and work, data protection has become ever more important to ensure that people have knowledge, control, confidence and trust in how and why data is being used. A data Bill is needed, but we need one that looks towards the future and harnesses the potential of data to grow our economy and improve our lives. Instead, this piecemeal Bill tinkers around the edges, weakens our existing data protection regime and could put our EU adequacy agreement at risk. We look forward to addressing some of those serious shortcomings in Committee.

18:40
Sir John Whittingdale (Maldon) (Con)

I welcome the Bill. I am delighted that it finally takes advantage of one of the freedoms that has resulted from our leaving the European Union, which I supported at the time and continue to support. As has been indicated, the Bill has had a long gestation. I was the Minister at the time of the issue of the consultation paper in September 2021 and the Bill first appeared a year later. As the Opposition spokesman pointed out, a small hiccup delayed it a bit further.

Our current data protection laws originate almost entirely from the EU and are based on GDPR. Before the adoption of GDPR in 2016, the UK Government opposed parts of it. I recall that the assessment at the time was that, although there were benefits to larger companies, there would be substantial costs for smaller firms, and indeed that has been borne out. There was a debate in government about whether we should oppose the GDPR regulation when it was being formulated in the Commission. As so often was the case in the EU, we were advised that, if we opposed it, we would lose vital leverage and our ability to influence its development. Whether we were able then to influence its development is arguable, but it was decided that we should not outright oppose it. However, it has always been clear that the one-size-fits-all GDPR that is currently in place imposes significant costs on smaller firms. When we had the consultation in 2021, smaller firms in particular complained about the complexity of GDPR, and the uncertainty and cost that it imposed. Clearly, there was seen to be an opportunity to streamline it—not to remove it, but to make it simpler and more understandable, and to reduce some of the burdens it imposes. We now have that opportunity to diverge.

The other thing that came back from the consultation—I agree with the Opposition Members who have raised this point—was that there is an advantage in the UK’s retaining data adequacy with the EU. It was not taken for granted that we would get data adequacy. A lengthy negotiation with the EU took place before a data adequacy agreement was reached. As part of that process, officials rightly looked at what alternative there would be, should we not be granted data adequacy. It became clear that there are ways around it. Standard contractual clauses and alternative transfer mechanisms would allow companies to continue to exchange data. It would be a little more complicated. They would need to write the clauses into contracts. For that reason, there was clearly a value in having a general data adequacy agreement, but one should not think that the loss of data adequacy would be a complete disaster because, as I say, there are ways around it.

The Government are right to look at additional adequacy agreements with countries outside the EU, because therein lies a great opportunity. The EU has managed to conclude some, but not that many, and the Government have rightly identified a number of target countries where we see benefits from achieving data adequacy agreements. It is perfectly possible for us to diverge to a limited extent from GDPR and still retain adequacy. Notably, the EU recognises New Zealand’s regime as being adequate, even though New Zealand’s data protection laws are different from those of the EU. The fact that we decided to appoint the former New Zealand Information Commissioner as our own Information Commissioner means that he brings a particular degree of knowledge about that, which will be very useful.

In considering data protection law, it is sometimes said that there is a conflict between privacy—the right of consumers to have protection of their data—and the innovation and growth opportunities of technology companies. I do not believe that that is true; the two things have to be integral parts of our data protection laws. If people believe that their privacy is at risk, they will not trust the exchange of data. One problem is that, in general, people read only about the problems that arise, particularly from things such as identity theft, hacks and the loss of data as a result of people leaving memory sticks on trains or of cyber-criminals hacking into large databases and taking all their financial information. All those things are a genuine risk, but they present only one side of the picture and, in general, people reach their view about the importance of data protection according to the risks alone, without necessarily seeing the real benefits that come from the free exchange of data. That was perhaps the lesson that covid showed us more than any other: the exchange of data allowed us to research and develop vaccines. We were able to research what worked in terms of prevention and the various measures that could be taken to protect consumers from getting covid. Covid was therefore the clearest demonstration that data exchange can bring real benefits to all consumers. We are just on the threshold—

John Penrose

Further to my right hon. Friend’s point about facilitating a trusted mechanism for sharing data, does he agree that the huge global success of open banking in this country has demonstrated that a trust framework not only makes people much more willing to exchange their data but frees up the economy and creates a world-leading sector at the same time?

Sir John Whittingdale

I agree with my hon. Friend on that. The use of smart data in open banking demonstrates the benefits that can flow from its use, and that example could be replicated in a large number of other sectors to similar benefit. I hope that that will be one benefit that will eventually flow from the changes we are making.

As I say, we are on the threshold of an incredibly exciting time. The use of artificial intelligence and automated decision making will bring real consumer benefits, although, of course, safeguards must be built in. The Centre for Data Ethics and Innovation looked at the question of algorithmic bias and found evidence of it. Obviously, we need to take account of that and build in protections against it, but, in general, the opportunities that can flow from making data more easily available are enormous.

I wish to flag up a couple of things. People have long found pop-up cookie banners deeply irritating. They have become self-defeating, because they are so ubiquitous that everybody just presses “yes”. The whole point of them was to acquire informed consent, but that is undermined if everybody is confronted by these things every time they log on to the internet and automatically presses “yes” without properly reading what they are consenting to. Restricting them to cookies that represent intrusive acquisition of data, explaining that to people and requiring consent is clearly an improvement. That will not only make data exchange easier but increase consumer protection, as people will know that they are being asked to give consent because they may choose not to allow their data to be used.

I understand the concerns that have been expressed about the Bill in some areas, particularly about the powers that will be given to the Secretary of State, but this is a complicated area. It is also one where technology is moving very fast. We need flexible legislation to keep up to date with the development of technology, so, to some extent, secondary legislation is probably the right way forward. We will debate these matters in Committee, but, generally, the Bill will help to deliver the Government’s declared intention, which is to make the UK the most successful data-driven technology economy in the world.

18:50
Carol Monaghan (Glasgow North West) (SNP)

We can all agree that the free flow of personal data across borders is essential to the economy, not just within the UK but with other countries, including our biggest trading partner, the EU. Reforms to our data protection framework must have appropriate safeguards in place to ensure that we do not put EU-UK data flows at risk.

Despite the Government’s promises of reforms to empower people in the use of their data, the Bill instead threatens to undermine privacy and data protection. It potentially moves the UK away from the “adequacy” concept in the EU GDPR, and gives weight to the idea that different countries can maintain data protection standards in different but equally effective ways. The only way that we can properly maintain standards is by having a common standard across the different trading partners, but the Bill risks creating a scenario in which the data of EU citizens could be passed through the UK to countries with which the EU does not have an agreement. The changes are raising red flags in Europe. Many businesses have spoken out about the negative impacts of the Bill’s proposals. Many of them will continue to set their controls to EU standards and operate on EU terms to ensure that they can continue to trade there.

According to conservative estimates, the loss of the adequacy agreement could cost £1.6 billion in legal fees alone. That figure does not include the cost resulting from disruption of digital trade and investments. The Open Rights Group says:

“Navigating multiple data protection regimes will significantly increase costs and create bureaucratic headaches for businesses.”

Although I understand that the Bill is an attempt to reduce the bureaucratic burden for businesses, we are now potentially asking those businesses to operate with two different standards, which will cause them a bigger headache. It would be useful if the Government confirmed that they have sought legal advice on the adequacy impact of the Bill, and that they have confirmed with EU partners that the EU is content that the Bill and its provisions will not harm EU citizens or undermine the trade and co-operation agreement with the EU.

Several clauses of the Bill cause concern. We need more clarity on those that expand the powers of the Home Secretary and the police, and we will require much further discussion on them in Committee. Given what has been revealed over the past few months about the behaviour of some members of the Metropolitan police, there are clauses in the Bill that should cause us concern. A national security certificate that would give the police immunity when they commit crimes by using personal data illegally would cause quite a headache for many of us. The Government have not tried to explain why they think that police should be allowed to operate in the darkness, which they must now rectify if they are to improve public trust.

The Bill will also expand what counts as an “intelligence service” for the purposes of data protection law, again at the Home Secretary's discretion. The Government argue that this would create a “simplified” legal framework but, in reality, it will hand massive amounts of people’s personal information to the police. This could include private communications, as well as information about an individual’s health, political beliefs, religious beliefs or sex life.

The new “designation notice” regime would not be reviewable by the courts, so Parliament might never find out how and when the powers have been used, given that there is no duty to report to Parliament. The Home Secretary is responsible for both approving and reviewing designation notices, and only a person who is “directly affected” by such a notice will be able to challenge it, yet the Home Secretary would have the power to keep the notice secret, meaning that even those affected would not know of it and therefore could not possibly challenge it.

These are expansive broadenings of the powers not only of the Secretary of State, but of the police and security services. If the UK Government cannot adequately justify these powers, which they have not done to date, they must be withdrawn or, at the very least, subject to meaningful parliamentary oversight.

Far from giving people greater power over their data, the Bill will stop the courts, Parliament and individuals from challenging illegal uses of data. Under the Bill, organisations can deny individuals the right to access information, or charge them a fee for it. The right hon. Member for New Forest East (Sir Julian Lewis) mentioned the difficulty he had with a constituent. I think we can all have some sympathy with that, because many of us have probably experienced similar requests from members of the public. However, it is the public’s right to have access to the data that we hold. If an organisation decides that these requests are “vexatious or excessive”, it can refuse them, but what is “vexatious or excessive”? These words are vague and open to interpretation. Moreover, charging a fee will create a barrier for some people, particularly those on lower incomes, and effectively restricts control of data to more affluent citizens.

The Bill changes current rules that prevent companies and the Government from making solely automated decisions about individuals that could have legal or other significant effects on their lives. We have heard a lot about the potential benefits of AI and how it could be used to enhance our lives, but for public trust in and buy-in to AI, we need to know that there is some oversight. Without that, there will always be a question hanging over it. The SyRI case in the Netherlands involved innocuous datasets, such as household water usage, being used by an automated system to accuse individuals of benefit fraud.

The Government consultation response acknowledges that, for respondents,

“the right to human review of an automated decision was a key safeguard”.

But despite the Government acknowledging the importance of human review of an automated decision, clause 11, if implemented, would mean that solely automated decision making is permitted in a wider range of contexts. Many of us get excited about AI, but it is important to acknowledge that AI still makes mistakes.

The Bill will allow the Secretary of State to approve international transfers to countries with weak data protection, so even if the Bill does not make data security in the UK weaker, it will weaken the protection of UK citizens’ data by allowing it to be transferred abroad under lower safeguards.

It is useful to hear a couple of stakeholder responses. The Public Law Project has said:

“The Data Protection and Digital Information (No.2) Bill would weaken important data protection rights and safeguards, making it more difficult for people to know how their data is being used”.

The Open Rights Group has said:

“The government has an opportunity to strengthen the UK’s data protection regime post Brexit. However, it is instead setting the country on a dangerous path that undermines trust, furthers economic instability, and erodes fundamental rights.”

Since we are talking about a Bill under the Department for Science, Innovation and Technology, it is important to hear from the Royal Society, which says that losing adequacy with the EU would be damaging for scientific research in the UK, creating new costs and barriers for UK-EU research collaborations. While the right hon. Member for Maldon (Sir John Whittingdale) is right about the importance of being able to share data, particularly scientific data—and we understand the importance of that for things such as covid vaccines—we need to make sure this Bill does not set up further hurdles that could prevent that.

There is probably an awful lot for us to thrash out in Committee. The SNP will not vote against Second Reading tonight, but I appeal to those on the Government Front Bench to give an opportunity for hon. Members to amend and discuss this Bill properly in Committee.

19:01
Damian Collins (Folkestone and Hythe) (Con)

I am delighted to speak in support of this long-awaited Bill. It is a necessary piece of legislation to learn the lessons from GDPR and look at how we can improve the system, both to make it easier for businesses to work with and to give users and citizens the certainty they need about how their data will be processed and used.

In bringing forward new measures, the Bill in no way suggests that we are looking to move away from our data adequacy agreements with the European Union. Around the world, in North America, Europe, Australia and the Far East, we see Governments looking at developing trusted systems for sharing and using data and for allowing businesses to process data across international borders, knowing that those systems may not be exactly the same but work to the same standards and with similar levels of integrity. That is clearly the direction in which the whole world wants to move, and we should play a leading role in that.

I want to talk briefly about an important area of the Bill: getting the balance between data rights and data safety and what the Bill refers to as the “legitimate interest” of a particular business. I should also note that this Bill, while important in its own right, sits alongside other legislation—some of it to be introduced in this Session and some of it already well on its way through the parliamentary process—dealing with other aspects of the digital world. The regulation of data is an aspect of digital regulation; it is in some ways the fuel that powers the digital experience and is relevant to other areas of digital life as well.

To take one example, we have already established and implemented the age-appropriate design code for children, which principally addresses the way data is gathered from children online and used to design services and products that they use. As this Bill goes through its parliamentary stages, it is important that we understand how the age-appropriate design code is applied as part of the new data regime, and that the safeguards set out in that code are guaranteed through the Bill as well.

There has been a lot of debate, as has already been mentioned, about companies such as TikTok. There is a concern that engineers who work for TikTok in China, some of whom may be members of the Chinese Communist party, have access to UK user data that may not be stored in China, but is accessed from China, and are using that data to develop products. There is legitimate concern about oversight of that process and what that data might be used for, particularly in a country such as China.

However, there is also a question about data, because one reason the TikTok app is being withdrawn from Government devices around the world is that it is incredibly data-acquisitive. It does not just analyse how people use TikTok and from that create data profiles of users to determine what content to recommend to them, although that is a fundamental part of the experience of using it; it also gathers, as other big apps do, data from what people do on other apps on the same device. People may not realise that they have given consent, and it is certainly not informed consent, for companies such as TikTok to access data from what they do on other apps, not just when they are using TikTok.

It is a question of having trusted systems for how data can be gathered, and giving users the right to opt out of such data systems more easily. Some users might say, “I’m quite happy for TikTok or Meta to have that data gathered about what I do across a range of services.” Others may say, “No, I only want them to see data about what I do when I am using their particular service, not other people’s.”

The Online Safety Bill is one of the principal ways in which we are seeking to regulate AI now. There is debate among people in the tech sector; a letter was published recently, co-signed by a number of tech executives, including Elon Musk, saying that we should have a six-month pause in the development of AI systems, particularly large language models. That suggests a problem in the near future: very sophisticated data systems that can make decisions faster than a human can analyse them.

People such as Eric Schmidt have raised concerns about AI in defence systems, where an aggressive system could make decisions faster than a human could respond to them, to which we would need an AI system to respond and where there is potentially no human oversight. That is a frightening scenario in which we might want to consider moratoriums and agreements, as we have in other areas of warfare such as the use of chemical weapons, that we will not allow such systems to be developed because they are so difficult to control.

If we look at the application of that sort of technology closer to home and some of the cases most referenced in the Online Safety Bill, for example the tragic death of the teenager Molly Russell, we see that what was driving the behaviour of concern was data gathered about a user to make recommendations to that person that were endangering their life. The Online Safety Bill seeks to regulate that practice by creating codes and responsibilities for businesses, but that behaviour is only possible because of the collection of data and decisions made by the company on how the data is processed.

This is where the Bill also links to the Government’s White Paper on AI, and this is particularly important: there must be an onus on companies to demonstrate that their systems are safe. The onus must not just be on the user to demonstrate that they have somehow suffered as a consequence of that system’s design. Companies should have to demonstrate that they are designing systems with people’s safety and rights in mind—be that their rights as workers and citizens, or their rights to certain safeguards and protections over how their data is used.

Companies creating datasets should be able to demonstrate to the regulator what data they have gathered, how models are being trained on that data and what it is being used for. It should be easy for the regulator to see and, if the regulator has concerns up front, it should be able to raise them with the company. We must try to create that shift, particularly for AI systems, in how systems are tested before they are deployed, with both safety and the principles set out in the legislation in mind.
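
By way of illustration only, that kind of up-front transparency might amount to a machine-readable record filed with the regulator for each dataset. Every field name in the sketch below is an assumption made for the purpose of the example; the Bill prescribes no such format.

from dataclasses import dataclass

@dataclass
class DatasetRecord:
    # Hypothetical examples of what a regulator might ask a company to declare.
    name: str
    sources: list          # where the data was gathered from
    lawful_basis: str      # e.g. consent, contract, legitimate interest
    trained_models: list   # models trained on this dataset
    purposes: list         # what those models are used for

record = DatasetRecord(
    name="recommendation-training-set",
    sources=["in-app activity logs"],
    lawful_basis="consent",
    trained_models=["content-ranking-v3"],
    purposes=["personalised recommendations"],
)
print(record)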

Kit Malthouse Portrait Kit Malthouse
- Hansard - - - Excerpts

My hon. Friend makes a strong point about safety being designed in, but a secondary area of concern for many people is discrimination: the more data companies acquire, the greater their ability to discriminate. For example, in an insurance context, we allow companies to discriminate on the basis of experience or behaviour; if someone has had a lot of crashes or speeding fines, we allow discrimination. However, for companies that process large amounts of data, and that may be making automated decisions, no line of acceptability is openly drawn. In the future, datasets may come together in ways that allow extreme levels of discrimination: if data science, psychometrics and genetic data were linked, for example, there is the possibility of significant levels of discrimination in society. Does he think that, as well as safety, we should be emphasising that line in the sand?

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

My right hon. Friend makes an extremely important point. In some ways, we have already seen evidence of that at work: there was a much-talked-about case where Amazon was using an AI system to aid its recruitment for particular roles. The system noticed that men tended to be hired for that role and therefore largely discarded applications from women, because that was what the data had trained it to do. That was clear discrimination.

There are very big companies that have access to a very large amount of data across a series of different platforms. What sort of decisions or presumptions can they make about people based on that data? On insurance, for example, we would want safeguards in place, and I think that users would want to know that safeguards are in place. What does data analysis of the way in which someone plays a game such as Fortnite—where the company is taking data all the time to create new stimuli and prompts to encourage lengthy play and the spending of money on the game—tell us about someone’s attitude towards risk? Someone who is a risk taker might be a bad risk in the eyes of an insurance company. Someone who plays a video game such as Fortnite a lot and sees their insurance premiums affected as a consequence would think, I am sure, that that is a breach of their data rights and something to which they have not given any informed consent. But who has the right to check? It is very difficult for the user to see. That is why I think the system has to be based on the idea that the onus rests on the companies to demonstrate that what they are doing is ethical, lawful and within the established guidelines. It should not be for individual users always to demonstrate that they have somehow suffered, to go through the onerous process of proving how that has been done, and then to seek redress at the end. There has to be more up-front responsibility as well.

Finally, competition is also relevant. We need to guard against a walled garden for data, in which companies that already have massive amounts of data, such as Google, Amazon and Meta, can hang on to what they have, while other companies find it difficult to build up meaningful, workable datasets. When I was Chairman of the then Digital, Culture, Media and Sport Committee, we considered the way in which Facebook, as it then was, kicked Vine—a short-form video sharing app—off its platform principally because it thought that that app was collecting too much Facebook user data and was a threat to the company. Facebook decided to deny that particular business access to the Facebook platform. [Interruption.] I see that the Under-Secretary of State for Science, Innovation and Technology, my hon. Friend the Member for Sutton and Cheam (Paul Scully), is nodding in an approving way. I hope that he is saying silently that that is exactly what the Bill will address, to ensure that we do not allow companies with strategic market status to abuse their market power to the detriment of competitive businesses.

19:11
Darren Jones Portrait Darren Jones (Bristol North West) (Lab)
- View Speech - Hansard - - - Excerpts

I refer the House to my entry in the Register of Members’ Financial Interests.

The Bill has had a curious journey. It started life as the Data Protection and Digital Information Bill, in search of the exciting Brexit opportunities that we were promised, only to die and then rise again as the Data Protection and Digital Information (No. 2) Bill. In the Bill’s rejuvenated—and, dare I say, less exciting—form, Ministers have rightly clawed back some of the highest-risk proposals of its previous format, recognising, of course, that our freedom from the European Union, at least in respect of data protection, is anything but. We may have left the European Union, but data continues to flow between the EU and the United Kingdom, and that means of course that we must keep the European Commission happy to maintain our adequacy decision. For the most part, the Bill does not therefore represent significant change from the existing GDPR framework. There are some changes to paperwork and the appointment of officers, but nothing radical.

With that settled—at least in my view—the question is this: what is the purpose of this Bill? The Government aim to reduce regulatory burdens on business. To give Ministers credit, according to the independent assessment of the Regulatory Policy Committee, they have adequately set out how that will happen—unlike for other Government Bills in recent weeks. I congratulate the Government on their so-called “co-design” with stakeholders, which other Departments could learn from in drafting legislation. But the challenge in reducing business regulation and co-designing legislation with stakeholders is knowing how much influence the largest, wealthiest voices have over the smallest, least influential ones.

In this Bill—and, I suspect, in the competition Bill as it relates to the digital markets unit, and, if rumours are correct, the media Bill—that means the difference between the voice of big tech and the voice of the people. If reports are correct, I share concerns about the current influence of big tech specifically on Downing Street and about the amount of interference by No. 10 in the drafting of legislation in the Department. [Interruption.] Ministers are shaking their heads; I am grateful for the clarification. I am sure that the reporters at Politico are watching.

Research is a good example of a concern in the Bill relating to the balance between big tech and the people. When I was on the pre-legislative committee of the Online Safety Bill—on which I enjoyed working with the hon. Member for Folkestone and Hythe (Damian Collins), who spoke before me—everybody recognised the need for independent academics to have access to data from the social media companies, for example, to help us understand the harms that can come from using social media. The EU has progressed that in its Digital Services Act, and even the Americans are starting to look at legislation in that area. But in the Bill, Ministers have not only failed to provide this access, but have opted instead to give companies the right to use our data to develop their own products. That means in practice that companies can now use the data they have on us to understand how to improve their products, primarily and presumably so that we use them more or—for companies that rely on advertising income—to increase our exposure to advertising, in order to create more profit for the company.

All that is, we are told, in the name of scientific research. That does not feel quite right to me. Why might Ministers have decided that that was necessary—a public policy priority—or that it is in any way in the interests of our constituents for companies to be able to do corporate research on product design without our explicit consent, instead of giving independent academics the right to do independent research about online harms, for example? The only conclusion I can come to is that it is because Ministers were, in the co-design process, asked by big tech to allow big tech to do that. I am not sure that consumers would have agreed, and that seems to be an example of big tech winning out in the Bill.

The second example relates to consumer rights and the ability of consumers to bring complaints and have them dealt with in a timely manner. Clause 7 allows for unreasonable delays by companies or data controllers, especially those that have the largest quantities of data on consumers. In practice, that once again benefits big tech, which holds the most data. The time that it can take to conclude a complaint under the Bill is remarkably long and will merely act as a disincentive to bringing a complaint in the first place.

It can take up to two months for a consumer or data subject to get access to the data that a company holds on them, then another two months for the company to confirm whether a complaint will be accepted. If a complaint is not accepted, there will then be up to another six months for the Information Commissioner to decide whether the complaint should be accepted, and if the Information Commissioner does decide that, the company then has one more month to provide the data, which was originally asked for nine months earlier. The consumer can then look at the data and put in a complaint to the company. If the company does not deal with the complaint, the earliest that the consumer can complain to the Information Commissioner is month 14, and the Information Commissioner will then have up to six months to resolve the complaint. All in all, that is up to 20 months of emails, forms, processes and decisions from multiple parties for an individual consumer to have a complaint considered and resolved.
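
Tallied stage by stage, the worst-case timeline just described can be laid out as follows. This is only a sketch of the speech's own figures; the three-month allowance for the consumer's review and complaint to the company is an assumption, chosen so that the stated milestones of month 14 and 20 months line up.

# Worst-case duration, in months, of each stage described above. The
# three-month consumer stage is an assumption; the rest are as stated.
stages = [
    ("company responds to the subject access request", 2),
    ("company confirms whether the complaint is accepted", 2),
    ("Information Commissioner decides whether to accept the complaint", 6),
    ("company finally provides the data", 1),
    ("consumer reviews the data and complains to the company", 3),
    ("Information Commissioner resolves the complaint", 6),
]

month = 0
for stage, duration in stages:
    month += duration
    print(f"by month {month:2d}: {stage}")
print(f"total: up to {month} months")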

That lengthy and complex complaints process also highlights the risks associated with the provisions in the Bill relating to automated decision making. Under current law, fully automated decision making is prohibited where it relates to a significant decision, but the Bill relaxes those requirements and ultimately puts the burden on a consumer to successfully bring a complaint against a company taking a decision about them in a wholly automated way. Will an individual consumer really do that when it could take up to 20 months? In the world we live in today, the likes of ChatGPT and other large language models will revolutionise customer service processes. The approach in the Bill seems to regulate for the past and fails, unfortunately, to regulate for the future. I ask again: which stakeholder group asked the Government to draft the law in this complex and convoluted way? It certainly was not consumers.

In other regulated sectors and areas of law, such as consumer law, we allow representative bodies to bring what the Americans call “class actions” on behalf of groups of consumers whose rights have been infringed. That process is perfectly normal and exists in UK law today. Experience shows that representative bodies such as Citizens Advice and Which? do not bring class actions easily because it is too financially risky. They therefore bring an action only when there is a clear and significant breach. So why have Ministers not allowed for those powers to exist for breaches of data protection law in the same way that the European Union has, when we are very used to them existing in UK law? Again, that feels like another win for big tech and a loss for consumers. Reducing unnecessary compliance burdens on business is of course welcome, but the Government seem to have forgotten that data protection law is based on a foundation of protecting the consumer, not being helpful to business.

On a different subject, I highlight once again the ongoing creep of powers being taken from Parliament and given to the Executive. We have already heard about the powers for the Secretary of State to make amendments to the legislation without following a full parliamentary process. That keeps happening—not just in this Bill but in other Bills this Session, including the Online Safety Bill. My Committee, which has whole-of-Government scrutiny powers in relation to good regulation, has reprimanded the Department—albeit in its previous form—for the use of those Henry VIII powers. It is disappointing to see them in use again.

The Minister, in response to my hon. Friend the Member for Weaver Vale (Mike Amesbury), said that the Government had enhanced oversight of the Information Commissioner by giving themselves power to direct some of its legitimate interests or decisions, or the content of codes. I politely point out that the Information Commissioner regulates the Government’s use of our data. It seems odd to me that the Government alone are being given enhanced powers to scrutinise the Information Commissioner, and that Parliament has not been given additional oversight; that ought to be included.

The Government have yet to introduce any substantive legislation on biometrics. Biometric data is the most personal type of data, be it about our faces, our fingerprints, our voices or other characteristics that are personal to our bodies. The Bill does not even attempt to bring forward biometric-specific regulation. My private Member’s Bill in the 2019-21 Session—now the Forensic Science Regulator Act 2021—originally contained provisions for a biometrics strategy and associated regulations. At the then Minister’s insistence, I removed those provisions, having been told that the Government were drafting a more wide-ranging biometrics Bill, which we have not seen. That is especially important in the light of the Government’s artificial intelligence White Paper, as lots of AI is driven by biometric data. We have had some debate on the AI White Paper, but it warrants a whole debate, and I hope to secure a Westminster Hall debate on it soon. We need to fully understand the context of the AI White Paper as the Bill progresses through Committee and goes to the other place.

I am conscious that I have had an unusual amount of time, so I will finish by flagging two points, which I hope that the Parliamentary Under-Secretary of State for Science, Innovation and Technology will respond to in his summing-up. The first is the age-appropriate design code. I think that we all agree in this House that children should have more protection online than other users. The age-appropriate design code, which we all welcomed, is based on the foundation of GDPR. There are concerns that the changes in the Bill, including to the rights of the Secretary of State, could undermine the age-appropriate design code. I invite the Minister to reassure us, when he gets to the Dispatch Box, that the Government are absolutely committed to the current form of the age-appropriate design code, despite the changes in the Bill.

The last thing I invite the Minister to comment on is data portability. It will drive competition if companies are forced to allow us to download our data in a way that allows us to upload it to another provider. Say I wanted to move from Twitter to Mastodon; what if I could download my data from Twitter, and upload it to Mastodon? At the moment, none of the companies really allow that, although that was supposed to happen under GDPR. The result is that monopolies maintain their status and competitors struggle to get new customers. Why did the Government not bring forward provision for improved data portability in the Bill? To draw on a thread of my speech, I fear that it may be because that is not in the interests of big tech, though it is in the interests of consumers.
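
As a purely hypothetical sketch of the principle, portability amounts to a structured export that another service can ingest. None of these function or field names belong to any real Twitter or Mastodon API; they are invented for illustration.

import json

def export_user_data(profile, posts):
    # Serialise a user's data in a structured, service-neutral form.
    return json.dumps({"profile": profile, "posts": posts}, indent=2)

def import_user_data(blob):
    # A receiving service parses the same structure back in.
    data = json.loads(blob)
    return data["profile"], data["posts"]

blob = export_user_data({"handle": "@example"}, ["first post", "second post"])
profile, posts = import_user_data(blob)
print(profile["handle"], "-", len(posts), "posts migrated")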

I doubt that I will be on the Bill Committee. I am sorry that I will not be there with colleagues who seem to have already announced that they will be on it, but I am sure that they will all consider the issues that I have raised.

19:22
Jane Hunt Portrait Jane Hunt (Loughborough) (Con)
- View Speech - Hansard - - - Excerpts

This Bill provides us with yet another opportunity to ensure that our legal and regulatory frameworks are tailored to our needs and specifications, now that we are free from the confines of EU law. It is crucial that we have a data rights regime that maintains the high data protection standards that the public expect, but it must do so in a way that is not overly burdensome to businesses and public services, and does not stifle innovation, growth and productivity. The Bill will go a long way to achieving that, but I would like to focus on one small aspect of it.

Announcing the First Reading of the Bill, the Secretary of State stated that it would improve

“the efficiency of data protection for law enforcement and national security partners encouraging better use of personal data where appropriate to help protect the public. It provides agencies with clarity on their obligations, boosting the confidence of the public on how their data is being used.”—[Official Report, 8 March 2023; Vol. 729, c. 20WS.]

That is a positive step forward for national security, but we are missing a crucial opportunity to introduce further reforms that will reduce administrative burdens on police forces across the UK.

I recently met members of the Leicestershire Police Federation, who informed me of the association’s concerns regarding part 3 of the Data Protection Act 2018. Specifically, the Police Federation is concerned about how the requirements of part 3 interact with the Crown Prosecution Service’s “Director’s Guidance on Charging”, which obliges the police to provide more information to the CPS pre-charge. That information includes unused material, digitally recovered material and third-party material, all of which must be redacted in accordance with the Data Protection Act.

Combined, the guidance’s requirements and the provisions of the Act represent a huge amount of administrative work for police officers, who would have to spend hours making the necessary redactions. Furthermore, much of that work may never be used by the CPS if no charge is brought, or the defendant pleads guilty before trial. Nationally, around 25% of cases submitted to the CPS result in no charge. This desk-based work would remove police officers from the frontline.

Picture the scene of an incident. Say that 10 police officers attend, all turning on their body cameras as they arrive. They deal with different aspects of the incident; they talk to a variety of people and take statements, standing in different positions that result in different backgrounds to the video footage and different side-conversations being captured. The lead officer then spends hours, if not days, redacting all the written data and video footage generated by all the officers, only for the redacted data to be sent to a perfectly trusted source, the CPS, which will not necessarily take the case forward.

The data protection Bill is meant to update and simplify the data protection framework used by bodies in the UK. The Bill refers to the work of the police in national security situations, but it should also cover their day-to-day work as a professional body. They should be able to share their data with the CPS, another professional body. Both have a legitimate interest in accessing and sharing the data collected. My hon. Friend the Minister for Data and Digital Infrastructure will know that this is an issue, as I have already raised it with her. I am very grateful for her considered response, and for the Government’s commitment to looking into this matter further, including in the context of this Bill, and at whether the Police Federation’s idea of a data bubble between the police service and the CPS is a workable solution.

I look forward to working with the Government on the issue. It is vital that we do what we can to ease the administrative burden on police officers, so that we can free up thousands of policing hours every year and get police back to the frontline, where they can support communities and tackle crime. Speaking of easing burdens, may I also take this opportunity to wish my hon. Friend the Minister the very best with the arrival that is expected in, I suspect, the none-too-distant future?

19:26
Daniel Zeichner Portrait Daniel Zeichner (Cambridge) (Lab)
- View Speech - Hansard - - - Excerpts

My interest in this debate comes from my representing a science and research city, where data, and the ability to transfer it, are key, and from my long-term background in information technology. Perhaps as a consequence of both, back in 2018 I was on the Bill Committee that had the interesting task of implementing GDPR, even though, as my hon. Friend the Member for Bristol North West (Darren Jones)—my good friend—pointed out at the time, none of us had the text in front of us. I think he perhaps had special access to it. In those long and complicated discussions, there were times when I was not entirely sure that anyone in the room fully gripped the complexity of the issues.

I recall that my right hon. Friend the Member for Birmingham, Hodge Hill (Liam Byrne) persistently called for a longer-term vision that would meet the fast-changing challenges of the digital world, and Labour Members constantly noted the paucity of resources available to the Information Commissioner’s Office to deal with those challenges, notwithstanding yellow-vested people entering offices. Five years on, I am not sure that much has changed, because the Bill before us is still highly technical and detailed, and once again the key issues of the moment are being dodged.

I was struck by the interesting conversations on the Conservative Benches, which were as much about what is not being tackled by the Bill as about what is—about the really hot issues that my hon. Friend the Member for Manchester Central (Lucy Powell) mentioned in her Front-Bench speech, such as ChatGPT and artificial intelligence. Those are the issues of the moment, and I am afraid that they are not addressed in the Bill. I make the exact point I made five years ago: there is a risk of hard-coding past prejudice into future decision making. Those are the issues that we should be tackling.

I chair the all-party parliamentary group on data analytics, which is carrying out a timely review of AI governance. I draw Members’ attention to a report made by that group, with the help of my hon. Friend the Member for Bristol North West, called “Trust, Transparency and Technology”. It called for, among other things, a public services licence to operate, and transparent, standardised ethics and rules for public service providers such as universities, police, and health and care services, so that we can try to build the public confidence that we so need. We also called for a tough parliamentary scrutiny Committee, set up like the Public Accounts Committee or the Environmental Audit Committee, to make sure the public are properly protected. That idea still has strong resonance today.

I absolutely admit that none of this is easy, but there are two particular areas that I would like to touch on briefly. One, which has already been raised, is the obvious one of data adequacy. Again, I do not feel that the argument has really moved on that much over the years. Many of the organisations producing briefings for this debate highlight the risks, and back in 2018—as I think the right hon. Member for Maldon (Sir John Whittingdale) pointed out—there were genuine concerns that we would not necessarily achieve an adequacy agreement with the European Union. Frankly, it was always obvious that this was going to be a key point in future trade negotiations with the EU and others, and I am afraid that that is the way it has played out.

It is no surprise that adequacy is often a top issue, because it is so essential, but that of course means that we are weakened when negotiation comes to other areas. Put crudely, to get the data adequacy agreements we need, we are always going to be trading away something else, and while in my opinion the EU is ultimately unlikely to withhold adequacy, the truth is that it could. That is a pretty powerful weapon. On the research issues, I would just like to ask the Minister whether, in summing up, he could comment on the concerns that were raised back in 2018 about the uncertainty for the research sector, and whether he is confident that what is proposed now—in my view, it should have been done then—can provide the clarity that is needed.

On a more general note, one of the key Cambridge organisations has pointed out to me that, in its view, it is quite hard to see the point of this Bill for organisations that are operating globally because, as the EU GDPR has extraterritorial effect, they are still going to need to meet those standards for much of what they do. It would simply be too complicated to try to apply different legal regimes to different situations and people. That is the basic problem with divergence: when organisations span multiple jurisdictions, taking back control is frankly meaningless. Effectively, it cedes control to others without having any influence—the worst of all worlds. That organisation also tells me that it has been led to believe by the Government, as I think was echoed in some of the introductory points, that any organisation wishing to carry on applying current legal standards will, by default, meet those in the new Bill. It is sceptical about that claim, and it would like some confirmation, because it rightly wonders how that can be the case when new concepts and requirements are introduced and existing ones amended.

There is much, much more that could be said, has been said and will be said by others, including genuine concerns about the weakening of rights around subject access requests and some of the protections around algorithmic unfairness. Those need to be tested and scrutinised in Committee; frankly, too much cannot just be left to ministerial judgment. Huge amounts of data are now held about all of us, and the suspicion is rightly held that decisions are sometimes made without our knowledge, decisions that can have a direct impact on our lives. I think we can all agree that data used well can be transformative and a power for good, but that absolutely relies on confidence and trust, which in turn requires a strong regulatory framework that engenders that trust. It feels to me like this Bill fails to meet some of those challenges. It needs to be strengthened and improved.

19:29
Robin Millar Portrait Robin Millar (Aberconwy) (Con)
- View Speech - Hansard - - - Excerpts

It is a pleasure to follow the speech of the hon. Member for Cambridge (Daniel Zeichner), and in fact, I have enjoyed listening to the various contributions about the many aspects of the many-headed hydra that the data Bill represents. In particular, the point made by the hon. Member for Manchester Central (Lucy Powell) about interoperability and the one made by the hon. Member for Glasgow North West (Carol Monaghan) about hurdles are points I will be returning to briefly.

I welcome the fact that we have a Bill that focuses on data. Data is the new oil, as they say, and it is essential that we grapple with the implications of that. If an example is needed, data was critical in our fight against covid-19. Data enabled the rapid processing of new universal credit applications. Data meant that we could target funds into business accounts quickly to make sure that furlough payments were made. Data gave us regular updates on infection rates, and data underpinned the research into vaccines, their rapid roll-out, and their reporting to the right people, at the right time and in the right place. We have also seen that the data on all those matters was questioned at every step of the way at the time, and has been continuously since.

Data matters. This Bill matters: it gives us an opportunity to redefine our regulatory approach, as the hon. Member for Cambridge alluded to. It also provides a clearer and more stable framework for appropriate international transfers of personal data—I stress the word “appropriate”. In addition, it is welcome that the Bill extends data-sharing powers, enabling the targeting of Government services to support business growth more effectively and deliver joined-up public services, which will be the thrust of my contribution. I also welcome the Bill’s delivery of important changes to our everyday lives. Whether it is an increase in financial penalties for those behind nuisance calls, addressing the number of cookie pop-ups on web browsers that we use every day, or providing a trusted framework for digital verification services, these are important updates in protecting everyday lives that are, in part, lived online now. That is to be welcomed—provided, again, that the necessary safeguards are in place.

I will give the bulk of my time to focusing on another area in which I think the Bill could go much further. The Bill recognises that, for public services to operate efficiently, safely and with effective scrutiny, data should be collected, presented, processed and shared in a consistent way, yet it is frustrating that the current scope of the Bill is for such information standards to apply in England only.

I am going to use health as an example to illustrate my point. In Aberconwy, we are experiencing severe, systematic failings in the delivery of health services across north Wales. The health board has been under special measures for six of the past eight years, and in their latest intervention, the Welsh Government have just sacked the non-executive members of the board. It therefore comes as little surprise that health is the No. 1 domestic concern for constituents across north Wales, or that my constituents put it into our plan for Aberconwy. This is not an exercise in point scoring, but in this Bill, I see an opportunity to help to tackle that problem. Wales is linked to the rest of the UK, historically and today, on an east-west axis for family, business, leisure and public services. Our health and social care services in north Wales rely on working and sharing information with colleagues in England—with hospitals in Chester, Stoke and Liverpool. However, sharing that data, which relies on the interoperability that the hon. Member for Manchester Central referred to, often presents an obstacle to care.

Of course, I recognise and respect that health is a devolved matter that is under the remit of the Welsh Government in Cardiff Bay, but one of the arguments made in favour of Welsh devolution 25 years ago was that it would enable learning from comparisons between different policy approaches across the UK, exposing underperformance as well as celebrating successes. In order to do so, though, we must have comparable and reliable data. If this sounds familiar, I made exactly that point in the debate on the Health and Care Bill back in November 2021. At that time, working with hon. Friends from across north Wales, we showed that we had overwhelming support from patients—they agreed that data must be shared. The healthcare professionals we spoke to also agreed that data needed to be shared. The IT experts we consulted with agreed that data must and could be shared, and the local administrators, community groups and civil servants we spoke to also told us that data needed to be shared. However, the reality is that currently, data in different parts of the UK is often not comparable, nor is the timing of its publication aligned.

Again, I have focused today on health as a pressing and urgent example of the need for sharing data, but these points apply across our public services. Indeed, my hon. Friend the Member for Loughborough (Jane Hunt) gave an excellent and powerful practical example of how data sharing within the police inadvertently introduces all sorts of unnecessary barriers. As much as I have spoken about health, these points apply equally to the education of our children, the wellbeing of our grandparents, skilling our workforce, levelling up our communities, ensuring fair and competitive environments for business across the UK, and more—not least the future of our environment.

I repeat: good data is essential for good services. I recognise the good work that is going on in the Office for National Statistics, with the helpful co-operation of devolved Administrations, but this is the time and the opportunity for the Government to consider amending the Bill in Committee to mandate agreement on, and the collection and publication of, key UK-wide data for public services. That data should be timely, accessible and interoperable.

All Administrations will already hold data for the operation of public services, but comparability and interoperability will allow professionals and planners to assign resources and guide interventions where they are needed most. It will allow patients and users of public services to make informed decisions about where to be treated, where to live and where to seek those services. It will also allow politicians like me to be held to account when services fail. I do not believe that such an amendment would divide the House in compassion or in common sense.

In conclusion, I know our Prime Minister understands the importance of data. He seeks to put it at the heart of a modern, innovative, dynamic and thriving UK, but it must be good data that flows through our veins and to all parts of our nation if it is to animate us and make the UK a success. For that reason, we need to go further. We need to ensure data comparability and interoperability across all parts of the UK. I look forward to hearing the Minister’s closing remarks.

19:40
Layla Moran Portrait Layla Moran (Oxford West and Abingdon) (LD)
- View Speech - Hansard - - - Excerpts

I start by echoing the well wishes to the Secretary of State on her imminent arrival. I am delighted to be here in my first outing as the Lib Dem spokesperson for science, innovation and technology, although in my mind I consider it as the spokesperson for proud geeks. I appreciate that is not a term everyone likes, but as a physics graduate and an MP for Oxford, where we have many fellow-minded geeks, I am proud to call myself that.

Much as this important Bill is geeky and technical—it sounds like it will be an interesting Bill Committee—it reaches into our whole lives. People have spoken about the potential and progress, and I agree to an extent with the comment from the hon. Member for Aberconwy (Robin Millar) about this being the new oil. However, in the context of climate change, there is a lesson for us there. Imagine that we knew then what we know now. We can already see that here. As new as some of these technologies are, and as new as some of these challenges may be, it does feel as though, as legislators, we are constantly playing catch-up with this stuff.

We consult and we look, and we know what the problems are and what the issue fundamentally is, but I agree with the hon. Member for Cambridge (Daniel Zeichner) that we need a bit of vision here. I would argue that what we need is what my former colleague, the former Member for East Dunbartonshire, called for, which is a code of ethics for data and artificial intelligence. I sincerely hope that the Government, with the extra power to the elbow of the new Department, can put some real resource behind that—not in White Papers and thought, but in a proper bit of legislation that answers some of the questions raised earlier about the moral use, for example, of artificial intelligence in war.

Those are important questions. The problem and worry I have is that this Government and others will find themselves constantly on the back foot, unless we talk not just about the geekery and the technical bits—by the sounds of it, there are enough of us in the House who would enjoy doing that—but about the slightly loftier and more important ways that this Bill will connect with society.

In the digital-first age, the Government themselves are encouraging those who want to access benefits and every other part of the state to do so digitally. To be a full citizen of the state, someone is often required to hand over their data. If someone does not want to engage with the digital realm, it is difficult for them to access the services to which they are entitled. Those are some of the big issues that encircle this Bill. It is fair to make that point on Second Reading, and I urge the Government, and especially the new Department, to give serious thought to how they will knit this all together, because it is incredibly important.

The Liberal Democrats have a few issues with the Bill. I associate myself with the remarks of the hon. Member for Bristol North West (Darren Jones), and in particular what he said in asking who is at the centre of the Bill, which is incredibly important. As liberals, we believe it should always be the citizen. Where there is a conflict of interest between the citizen, business and the state, in our view and in our political ideology, the citizen always comes top. I am not convinced that has been at the heart of the Bill at points. Citizens have been thought about, but were they at the centre of it at every stage? I am afraid that our ability as individuals to access, manipulate and decide who has our data has at various stages got lost.

The concerns we share with others fall into four main areas: the Bill will undermine data rights; it will concentrate power in the hands of the Secretary of State—notwithstanding any change of government, that is the sort of thing that Parliament needs to think about in the round, regardless of who is in power; the Bill will further complicate our relationship with Europe, as some have mentioned; and it sets a worrying precedent.

We need to understand where we start from. Only 30% of people in the UK trust that the Government use their data ethically. That means that 70% of people in the UK do not. Polls across the world have shown roughly the same thing. That is a huge level of mistrust, and we need to take it seriously. The Open Rights Group has described the Bill as part of a deregulatory race to the bottom, as the rights and safeguards of data subjects could be downgraded because of the changes proposed.

Clause 5 and schedule 1 to the Bill introduce a whole set of legitimate interests for processing data without consent, and with few controls around their application. The Bill changes the definition of personal data, which would reduce the circumstances in which that information is protected. It reforms subject access requests, as others have said. We all run our own small businesses in our offices as MPs. We understand the burden placed on small businesses in particular, but it is absolutely the right of the individual to find out what is held on them in the way that subject access requests allow. If there is a conflict, it is the right of the individual that needs to be protected. The Government assess that the proposal would save businesses about £82 a year each—a price worth paying, surely, given the number of consumers whom those businesses, on average, are looking after. There is an important hierarchy here, with the rights of the individual above the convenience of business, that is not entirely captured by what the Government have been saying so far.

Big Brother Watch has said:

“The revised Data Protection and Digital Information Bill poses serious threats to Brits’ privacy. The Government are determined to tear up crucial privacy and data protection rights that protect the public from intrusive online surveillance and automated-decision making in high-risk areas. This bonfire of safeguards will allow all sorts of actors to harvest and exploit our data more than ever before. It is completely unacceptable to sacrifice the British public’s privacy and data protection rights on the false promise of convenience.”

I am deeply concerned that far from restoring confidence in data protection, the Bill sets a dangerous precedent for a future in which rights and safeguards are undermined. I have listened to what the Secretary of State has said at the Dispatch Box. I sincerely hope that those safeguards that the Government want to keep in place will remain in place, but we should be listening to those third-party groups that have scrutinised this Bill in some detail. There are legitimate concerns that need to be addressed.

My other concern is the concentration of power in the hands of the Secretary of State. As I have said before, while it would be lovely to think that all Secretaries of State and all Governments would think the same on this and that we all have the same principles, my deep concern is that one day that will not be the case. There is an important part for Parliament to play, especially when legislation is running behind what is happening in society, in raising the issues in real time. My worry is that by acting through secondary legislation, which we end up scrutinising less and less often, the Government leave no mechanism for Parliament to feed in as society changes, which can happen year on year. We need some way, whether through a Select Committee or otherwise, to keep pace with changes in society.

Finally, I want to talk about adequacy and in particular its loss being a real concern. I am pleased to hear that being raised on all sides in the House, which is a good sign, but I hope that this is not a case where little then gets changed in the Bill, as we have seen many times over. We could have it both ways: we can diverge from EU standards if we make the protection of the rights of the citizens stronger. Some who have mentioned divergence, however, have spoken about a weakening, which I worry will lead to a loss of adequacy.

In closing, will the Minister give a cast-iron guarantee to the businesses and the researchers that rely on adequacy that it will not be watered down, but will be one of the key tenets of how we move forward? Certainty for businesses and our researchers is incredibly important, and if there is any suggestion that changes in the Bill will affect that, those changes must be pulled immediately.

19:50
Jim Shannon Portrait Jim Shannon (Strangford) (DUP)
- View Speech - Hansard - - - Excerpts

It is a pleasure to make a contribution, and to have heard all the right hon. and hon. Members’ speeches as I have sat here tonight. There will not be any votes on the Bill, I understand, but if there were, my party would have supported the Government, because I think the intention of the Minister and the Government is to try to find a correct way forward. I hope that some of the tweaking that is perhaps needed can happen in a positive way that addresses such issues. This is the first debate after the recess, and I am very pleased to be a part of it. I have spoken on data protection and its importance in the House before, and I again wish to make a contribution, specifically on medical records and the protection of health data with regard to GP surgeries. I hope to address that with some questions for the Minister at the end.

Realistically, data protection is all around us. I know all too well from my constituency office that there are guidelines. There are procedures that my staff and I must follow, and we do follow them very stringently. It is important that businesses, offices, healthcare facilities and so on are aware of the guidelines they must follow, hence the necessity of this Bill. Exposed data has the full potential to fall into the wrong hands, posing dangers to people and organisations, so it is good to be here to discuss how we can prevent that, with the Government presenting the legislation tonight and taking it through Committee when the time comes.

I have recently had some issues with data protection—this is a classic example of how mistakes can happen and how important data can end up in the wrong place—when in two instances the Independent Parliamentary Standards Authority accidentally published personal information about me and my staff online. It did not do it on purpose—it was an accident, and it did retrieve the data very quickly—but it has happened on two occasions at a time of severe threat in Northern Ireland and a level of threat on the mainland as well. Although the matter was quickly resolved, it is a classic example of the dangers posed to individuals.

I am sure Members are aware that the threat level in Northern Ireland has been increased. Although there is external security for Members away from the parliamentary estate, I have recently installed CCTV cameras in my office for the security of my staff, which, though a smaller matter by comparison, is my responsibility. I have younger staff members in their 20s who live on their own, and staff who are parents of young children, and they deserve to know that they are safe. Anxieties have been raised because of the data disclosure, and I imagine that many others have experienced something similar.

I want to focus on issues about health. Ahead of this debate, I have been in touch with the British Medical Association, which raised completely valid concerns with me about the protection of health data. I have a number of questions to ask the Minister, if I may. The BMA’s understanding of the Bill is that the Secretary of State or the Minister will have significant discretionary powers to transfer large quantities of health information to third countries with minimal consultation or transparent assessment about how the information will benefit the UK. That is particularly worrying for me, and it should be worrying for everyone in this House. I am sure the Minister will give us some clarification and some reassurance, if that is possible, or tell us that this will not happen.

There is also concern about the Secretary of State having the power to transfer the same UK patients’ health data to a third country if it is thought that that would benefit the UK’s economic interests. I would be very disturbed, and quite annoyed and angry, that such a direction should be allowed. Again, the Minister may wish to comment on that at the end of the debate. I would be grateful if the Minister and his Department provided some clarity for the BMA about what the consultation process will be if information is to be shared with third-party countries or organisations.

There have also been concerns about whether large tech and social media companies are storing data correctly and upholding individuals’ rights or privacy correctly. We must always represent our constituents, and the Bill must ensure that the onus of care is placed on tech companies and organisations to legally store data safely and correctly. The safety and protection of data is paramount. We could not possibly vote for a Bill that undermined trust, furthered economic instability and eroded fundamental rights. Safeguards must be in place to protect people’s privacy, and that starts in the House today with this Bill. Can the Minister assure me and the BMA that our data will be protected and not shared willy-nilly with Tom, Dick and Harry? As I have said, protection is paramount, and we need to have it in place.

To conclude, we have heard numerous stories both from our constituents and in this place about the risks of ill-stored and unprotected data. The Bill must aim to retain high data protection standards without creating unnecessary barriers for individuals and businesses. I hope that the Minister and his Department can answer the questions we may have to ensure that the UK can be a frontrunner in safe and efficient data protection. We all want that goal. Let us make sure we go in the right direction to achieve it.

19:57
Stephanie Peacock Portrait Stephanie Peacock (Barnsley East) (Lab)
- View Speech - Hansard - - - Excerpts

I would like to add my best wishes to the Minister and the Secretary of State on their imminent arrivals.

We are in the midst of a tech revolution, and right at the centre of this is data. From social media and online shopping to the digitisation of public services, the rate at which data is being collected, processed and shared is multiplying by the minute. This new wealth of data holds great potential for innovation, boosting economic growth and improving the delivery of public services. The aims of the Bill to unlock the economic and societal benefits of data while ensuring strong, future-proofed privacy rights are therefore ones that we support. We welcome, for example, provisions to modernise the ICO structure, and we support provisions for the new smart data regimes, so long as there are clear requirements for impact assessments.

However, the Bill in its current form does not go far enough in actually achieving its aims. Its narrow approach and lack of clarity render it a missed opportunity to implement a truly innovative and progressive data regime. Indeed, in its current form, many clarifications will be needed to reassure the public that their rights will not be weakened by the Bill while sweeping powers are awarded to the Secretary of State. Currently, the Bill defines solely automated processing as processing with “no meaningful human involvement” that results in a “significant decision”, with the Secretary of State entrusted with powers to amend what counts within this definition. The lack of detail on the boundaries of such definitions, as well as their capacity to change over time, has concerned the likes of the Ada Lovelace Institute and the TUC.

The Chair of the Business, Energy and Industrial Strategy Committee, my hon. Friend the Member for Bristol North West (Darren Jones), outlined in his powerful speech the power imbalance between big tech and the people, which is an important insight and a challenge for us in this House. Indeed, just this month Uber was found to have violated the rights of three UK-based drivers by firing them without appeal on the basis of fraudulent activity picked up by its automated decision-making system. In its judgment, the court found that the limited human intervention in Uber’s automated decision process was not

“much more than a purely symbolic act”.

This case and the justice the drivers received therefore explicitly relied on current legislation in the form of article 22 of the UK GDPR, and a clear understanding of what constitutes meaningful human involvement. Without providing clear boundaries for defining significant decisions and meaningful human involvement, this Bill therefore risks removing the exact rights that won this case and creating an environment where vital safeguards, such as the right to contest automated decisions and request human intervention, could easily become exempt from applying at the whim of the Secretary of State. This must be resolved, and the public must be reassured that they will not be denied a job, mortgage or visa by an algorithm without a method of redress.

There is also a lack of clarity around how rules allowing organisations to charge a fee for, or refuse, subject access requests deemed “vexatious” or “excessive” will work, as the likes of Which? and the Public Law Project have argued and as my hon. Friend the Member for Cambridge (Daniel Zeichner) highlighted. Indeed, if the list of circumstances where those terms might be met is non-exhaustive, what safeguards will be in place to stop controllers from abusing this and deciding that any request they dislike is vexatious? Organisations should absolutely be supported in directing resources to good-faith requests, but we must be careful to ensure that any new limits are protected against abuse.

Reform of the responsibilities of the Information Commissioner’s Office is another area in need of analysis. Indeed, more than evolving its structure, the Bill gives the Secretary of State power to set the strategic priorities of the regulator and approve codes of practice. This has sparked concern across the spectrum of stakeholders, from the Open Rights Group to techUK, over what it means for the regulator’s independence. Given these new powers, particularly in cases where guidance addresses the activity of the Government, how can Ministers assure us that a Secretary of State will not be marking their own homework?

Whether it is the Secretary of State being able to amend the “recognised legitimate interests” list or the removal of the requirement for consultation on impact assessments, the same theme echoes throughout the Bill, as the hon. Member for Oxford West and Abingdon (Layla Moran) noted. Without additional guidance and clear examples of how definitions apply, it is hard to grasp the full extent of the consequences of these new measures, especially given the sweeping powers of the Secretary of State to make further changes. We will look to ensure that this clarity is included in the Bill, so that everyone can be assured of their rights and of a truly independent regulator. We must also ensure that children are protected by the Bill and that the age-appropriate design code is not compromised, as raised by the hon. Member for Folkestone and Hythe (Damian Collins) and others across the House.

Clarity on the new regime is also vital for reassuring businesses who still have fears around losing EU adequacy, something raised throughout this debate and which the former Secretary of State the right hon. Member for Maldon (Sir John Whittingdale) outlined in his contribution. The Government have said that they recognise that losing adequacy would be disastrous, costing up to £460 million as a one-off and £410 million every year afterwards. Ministers have rightly rowed back on many of the more concerning suggestions from their consultation, but they must be absolutely clear on how they are sure that the measures in the Bill, particularly those that toy with the regulator’s independence and give Ministers power to create further change, will not threaten adequacy.

Having already made significant adjustments to comply with UK GDPR, the changes in the Bill must also be careful not to create further uncertainty for businesses. Indeed, although Ministers say that anyone who abides by the current rules will still be compliant after the passing of the Bill, organisations will still have to do their own legal due diligence to understand how, if at all, this set of amendments impacts them. It would therefore be good to hear from Ministers on how they plan to ensure that businesses, particularly small and medium-sized enterprises, are supported in understanding the requirements on them.

We understand the Government’s attempts to future-proof this legislation, and it would be great to see an end to constant cookie banners or nuisance calls, which the hon. Member for Aberconwy (Robin Millar) referenced, but the measures in the Bill rely on technology that does not yet exist in operational form. In the case of browser-enabled cookie models, there is also the concern that this may entrench power in the hands of existing tech giants and muddy the waters on liability. We must be careful, therefore, to ensure that businesses can actually implement what the Bill requires.

Ultimately, with the exception of the section on smart data, this Bill takes a very narrow view of what an innovative data regime could look like. In the context of a rapidly changing world, this Bill was a great opportunity to really consider how we can get data working in the interests of the general public and small businesses. Labour would have used a Bill like this to examine, for example, how data can empower communities and collective groups, such as workers in industries who have long felt that they have been on the wrong end of automated decision-making, as well as of the automation of jobs.

We would also have sought to improve public trust in, and understanding of, how our data is used, particularly since the willingness to share data has been eroded by the likes of the Cambridge Analytica scandal, the NHS data opt-out, and the exam algorithm scandal, which disproportionately affected my constituents in Barnsley. As it stands, however, the Bill seems only to consider data rights when they emerge as a side product of making changes to the rules for processors. Data rights and data protection have wide-ranging consequences across society, as the hon. Member for Strangford (Jim Shannon) discussed. Labour would have used this as an opportunity to look at the larger picture of data ownership. Deregulation measures such as those in the Bill might mean less work for some small businesses, but as long as a disproportionate amount of data is held by a limited number of firms, small businesses will still be at a large competitive disadvantage. From introducing methods of collective redress to nurturing privacy-enhancing technologies, there are many positive opportunities a progressive data Bill could have explored to put our country at the forefront of innovation while genuinely strengthening rights and trust for the modern era, but the Government have missed this opportunity.

Overall, we can all agree on unlocking innovation through data while ensuring that data subjects have the rights and trust they fundamentally deserve. However, many areas need clarity and improvement if this Bill is to match the bold vision required to put us truly at the forefront of data use and data protection. I look forward to working closely with Ministers in the coming months towards legislation that better fulfils these aims.

20:05
The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Paul Scully)

I thank all Members for their contributions, including the hon. Members for Manchester Central (Lucy Powell), for Glasgow North West (Carol Monaghan), for Bristol North West (Darren Jones), for Cambridge (Daniel Zeichner), for Oxford West and Abingdon (Layla Moran), for Strangford (Jim Shannon) and for Barnsley East (Stephanie Peacock) and my right hon. Friend the Member for Maldon (Sir John Whittingdale) and my hon. Friends the Members for Folkestone and Hythe (Damian Collins), for Loughborough (Jane Hunt) and for Aberconwy (Robin Millar). The debate has been held in the right spirit, understanding the importance of data, and I will try to go through a number of the issues raised.

Adequacy has come up on a number of occasions. We have been straight from the beginning that adequacy is very important, and we work with the EU Commission on this and speak to it on a regular basis. It is important to note, however, that the EU does not require exactly the same rules to be in place for a country to be adequate; we can see that from Japan and New Zealand. We are therefore trying to get the balance right, making sure that we remain adequate not just with the EU but with other countries with which we want to have data bridges and collaboration. We are also making sure that we can strip back some of the bureaucracy, not just for small businesses but for public services including GPs, schools and similar institutions, as well as protecting the consumer, which must always be central.

Automated decision-making was also raised by a number of Members. The absence of meaningful human intervention in solely automated decisions, along with opacity in how those decisions can be reached, will be mitigated by providing data subjects with the opportunity to make representations about, and ultimately challenge, decisions of this nature that are unexpected or seem unwarranted. For example, if a person is denied a loan or access to a product or services because a solely automated decision-making process has identified a high risk of fraud or irregularities in their finances, that individual should be able to contest that decision and seek human review. If that decision is found to be unwarranted on review, the controller must re-evaluate the case and issue an appropriate decision.

Our reforms address the uncertainty over the application of safeguards. They will clarify when safeguards apply, to ensure that they are available in appropriate circumstances. We will develop that with businesses and other organisations in guidance.

The hon. Member for Glasgow North West talked about joint-working designation notices. It is important to note that the police and the intelligence services work under different data regimes, which can make joint working more difficult. Many of the changes made in this Bill have come from the lessons of the Fishmongers’ Hall terrorist incident and the Manchester Arena bombing.

Members raised the question of algorithmic bias. We agree that it is important that organisations are aware of potential biases in data sets and algorithms, and bias monitoring and correction can involve the use of personal data. As we set out in our response to the consultation on the Bill, we plan to introduce a statutory instrument that will provide for the monitoring and correction of bias in AI systems by allowing the processing of sensitive personal data for that purpose, with appropriate safeguards. However, as we know from the AI White Paper we published recently, this is a changing area, so it is important that we in Government remain able to flex our approach in the context of AI and that type of decision-making.

The hon. Member for Bristol North West talked about biometrics. Biometric data is classed as sensitive data under the UK GDPR, so it is already provided with additional protection: it can be processed only if a relevant condition is met under article 9 of the UK GDPR or schedule 1 to the Data Protection Act 2018. That requirement provides sufficient safeguards for biometric data. There are significant overlaps in the current oversight framework, which is confusing for the police and the public, and it inhibits innovation. That is why the Bill simplifies the oversight of biometrics and overt surveillance technologies.

The hon. Gentleman talked about age-appropriate guidance. We are committed to protecting children and young people online. The Bill maintains the high standards of data protection that our citizens expect and organisations will still have to abide by our age-appropriate design code. Any breach of our data protection laws will result in enforcement action by the Information Commissioner’s Office.

The hon. Gentleman also talked about data portability. The Bill increases data portability by setting up smart data regulations. He talked about social media, but smart data goes far wider than that. Smart data is the secure sharing of customer data with authorised third parties at the customer’s request. Those third parties can then use that data to provide innovative services for the consumer or business user, utilising AI and data-driven insights to empower customer choice. Services may include clearer account management across providers, easier switching between offers or providers, and advice on how to save money. Open banking is an obvious live example, and the smart data changes in the Bill will turbocharge its use.

My hon. Friend the Member for Loughborough talked about policing. The Bill will save 1.5 million police hours, but it is really important that we do more. We are looking at ways of easing redaction burdens for the police while ensuring that we maintain victim and witness confidence. It is really important to victims and witnesses, and in the interests of public trust, that the police do not share information that is not relevant to a case with other organisations, including the Crown Prosecution Service and the defence. Removing that information, as my hon. Friend says, places a resource burden on officers. We will continue to work with the police and the Home Office on that basis.

On UK-wide data standards, raised by my hon. Friend the Member for Aberconwy, improving access to comparable data and evidence from across the UK is a crucial part of the Government’s work to strengthen the Union. The UK Government and the Office for National Statistics have an ongoing and wide-ranging work programme to increase the coherence of data across the nations, as my hon. Friend is aware. We remain engaged in discussions and will continue to work with him, the Wales Office and the ONS to carry that work forward.

On international data transfer, it is important that we tackle the uncertainties and instabilities in the current regime, but the hon. Member for Strangford is absolutely right that in doing that, we must maintain public trust in the transfer system.

Finally, on the ICO, we believe that the Bill does not undercut its independence. For the trust reasons I have talked about, it is really important that we retain that independence. This is not about Government control over an independent regulator, and it is not about a Government trying to exert influence or pressure for what are deemed to be more favourable outcomes. We are committed to the ICO’s ongoing independence, and that is why we have worked closely with the ICO. The Information Commissioner himself is in favour of the changes we are making and has spoken approvingly about them.

This is a really important Bill, because it will enable greater innovation while maintaining the protections that keep people’s data safe.

Question put and agreed to.

Bill accordingly read a Second time.

Data Protection and Digital Information (No. 2) Bill (Programme)

Motion made, and Question put forthwith (Standing Order No. 83A(7)),

That the following provisions shall apply to the Data Protection and Digital Information (No. 2) Bill:

Committal

(1) The Bill shall be committed to a Public Bill Committee.

Proceedings in Public Bill Committee

(2) Proceedings in the Public Bill Committee shall (so far as not previously concluded) be brought to a conclusion on Tuesday 13 June 2023.

(3) The Public Bill Committee shall have leave to sit twice on the first day on which it meets.

Consideration and Third Reading

(4) Proceedings on Consideration shall (so far as not previously concluded) be brought to a conclusion one hour before the moment of interruption on the day on which those proceedings are commenced.

(5) Proceedings on Third Reading shall (so far as not previously concluded) be brought to a conclusion at the moment of interruption on that day.

(6) Standing Order No. 83B (Programming committees) shall not apply to proceedings on Consideration and Third Reading.—(Joy Morrissey.)

Question agreed to.

Data Protection and Digital Information (No. 2) Bill (Money)

King’s recommendation signified.

Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),

That, for the purposes of any Act resulting from the Data Protection and Digital Information (No. 2) Bill, it is expedient to authorise the payment out of money provided by Parliament of—

(a) any expenditure incurred under or by virtue of the Act by the Secretary of State, the Treasury or a government department, and

(b) any increase attributable to the Act in the sums payable under any other Act out of money so provided.—(Joy Morrissey.)

Question agreed to.

Data Protection and Digital Information (No. 2) Bill (Ways and Means)

Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),

That, for the purposes of any Act resulting from the Data Protection and Digital Information (No. 2) Bill, it is expedient to authorise:

(1) the charging of fees or levies under or by virtue of the Act; and

(2) the payment of sums into the Consolidated Fund.—(Joy Morrissey.)

Question agreed to.

Data Protection and Digital Information (No. 2) Bill (Carry-over)

Motion made, and Question put forthwith (Standing Order No. 80A(1)(a)),

That if, at the conclusion of this Session of Parliament, proceedings on the Data Protection and Digital Information (No. 2) Bill have not been completed, they shall be resumed in the next Session.—(Joy Morrissey.)

Question agreed to.

Data Protection and Digital Information (No. 2) Bill (First sitting)

Committee stage
Wednesday 10th May 2023

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 10 May 2023
The Committee consisted of the following Members:
Chairs: † Mr Philip Hollobone, Ian Paisley
† Amesbury, Mike (Weaver Vale) (Lab)
† Bristow, Paul (Peterborough) (Con)
† Clarke, Theo (Stafford) (Con)
† Collins, Damian (Folkestone and Hythe) (Con)
† Double, Steve (Lord Commissioner of His Majesty's Treasury)
† Eastwood, Mark (Dewsbury) (Con)
Henry, Darren (Broxtowe) (Con)
† Hunt, Jane (Loughborough) (Con)
† Huq, Dr Rupa (Ealing Central and Acton) (Lab)
Long Bailey, Rebecca (Salford and Eccles) (Lab)
† Monaghan, Carol (Glasgow North West) (SNP)
† Onwurah, Chi (Newcastle upon Tyne Central) (Lab)
† Peacock, Stephanie (Barnsley East) (Lab)
† Richards, Nicola (West Bromwich East) (Con)
Simmonds, David (Ruislip, Northwood and Pinner) (Con)
† Wakeford, Christian (Bury South) (Lab)
† Whittingdale, Sir John (Minister for Data and Digital Infrastructure)
Huw Yardley, Bradley Albrow, Committee Clerks
† attended the Committee
Witnesses
John Edwards, Information Commissioner, Information Commissioner's Office
Paul Arnold, ICO Deputy Chief Executive and Chief Operating Officer, Information Commissioner's Office
Eduardo Ustaran, Global Co-head of the Privacy and Cybersecurity Practice, Hogan Lovells
Vivienne Artz OBE
Bojana Bellamy, President, Centre for Information Policy Leadership
Neil Ross, Associate Director for Policy, TechUK
Chris Combemale, CEO, Data and Marketing Association
Dr Jeni Tennison OBE, Founder and Executive Director, Connected by Data
Anna Thomas, Co-Founder and Director, Institute for the Future of Work
Michael Birtwistle, Associate Director (AI Law and Regulation), Ada Lovelace Institute
Public Bill Committee
Wednesday 10 May 2023
(Morning)
[Mr Philip Hollobone in the Chair]
Data Protection and Digital Information (No. 2) Bill
09:25
The Chair

Before we begin, I have a couple of preliminary announcements that Mr Speaker has asked me to draw to your attention. Hansard colleagues would be grateful if Members emailed their speaking notes to hansardnotes@parliament.uk. Please switch electronic devices to silent. Tea and coffee are not allowed during sittings.

Today we will first consider the programme motion on the amendment paper. We will then consider a motion to enable the reporting of written evidence for publication and a motion to allow us to deliberate in private about our questions before the oral evidence session. In view of the time available, I hope we can take these matters formally—without debate. The programme motion was discussed yesterday by the Programming Sub-Committee for this Bill.

Ordered,

That—

1. the Committee shall (in addition to its first meeting at 9.25 am on Wednesday 10 May) meet—

(a) at 2.00 pm on Wednesday 10 May;

(b) at 9.25 am and 2.00 pm on Tuesday 16 May;

(c) at 11.30 am and 2.00 pm on Thursday 18 May;

(d) at 9.25 am and 2.00 pm on Tuesday 23 May;

(e) at 9.25 am and 2.00 pm on Tuesday 6 June;

(f) at 11.30 am and 2.00 pm on Thursday 8 June;

(g) at 9.25 am and 2.00 pm on Tuesday 13 June;

2. the Committee shall hear oral evidence in accordance with the following Table:

Date | Time | Witness
Wednesday 10 May | Until no later than 9.55 am | Information Commissioner’s Office
Wednesday 10 May | Until no later than 10.25 am | Hogan Lovells; London Stock Exchange Group; Centre for Information Policy Leadership
Wednesday 10 May | Until no later than 10.50 am | techUK; Data & Marketing Association
Wednesday 10 May | Until no later than 11.25 am | Connected by Data; Institute for the Future of Work; Ada Lovelace Institute
Wednesday 10 May | Until no later than 2.25 pm | Medtronic; UK Biobank
Wednesday 10 May | Until no later than 2.50 pm | ZILO; UK Finance
Wednesday 10 May | Until no later than 3.05 pm | Better Hiring Institute
Wednesday 10 May | Until no later than 3.30 pm | National Crime Agency; Metropolitan Police
Wednesday 10 May | Until no later than 3.55 pm | Prospect; Trades Union Congress
Wednesday 10 May | Until no later than 4.25 pm | Public Law Project; Law Society of Scotland; Rights and Security International
Wednesday 10 May | Until no later than 4.40 pm | AWO

3. proceedings on consideration of the Bill in Committee shall be taken in the following order: Clauses 1 to 5; Schedule 1; Clause 6; Schedule 2; Clauses 7 to 11; Schedule 3; Clauses 12 to 20; Schedule 4; Clause 21; Schedules 5 to 7; Clauses 22 to 41; Schedule 8; Clauses 42 to 45; Schedule 9; Clauses 46 to 86; Schedule 10; Clauses 87 to 98; Schedule 11; Clause 99; Schedule 12; Clause 100; Schedule 13; Clauses 101 to 114; new Clauses; new Schedules; remaining proceedings on the Bill;

4. the proceedings shall (so far as not previously concluded) be brought to a conclusion at 5.00 pm on Tuesday 13 June.— (Sir John Whittingdale.)

Resolved,

That, subject to the discretion of the Chair, any written evidence received by the Committee shall be reported to the House for publication.—(Sir John Whittingdale.)

Resolved,

That, at this and any subsequent meeting at which oral evidence is to be heard, the Committee shall sit in private until the witnesses are admitted.—(Sir John Whittingdale.)

The Chair

Copies of written evidence that the Committee receives will be made available in the Committee Room and circulated to Committee members by email. We will now go into private session to discuss lines of questioning.

09:26
The Committee deliberated in private.
Examination of Witnesses
10:50
John Edwards and Paul Arnold gave evidence.
The Chair

We are now sitting in public again and the proceedings are being broadcast. Before we hear from the witnesses, do any Members wish to make a declaration of interest in connection with the Bill?

Jane Hunt (Loughborough) (Con)

I am not sure whether this is a declaration of interest, so I will mention it just in case. I have had a meeting with Leicestershire Police Federation and I am interested in an amendment that it would like tabled.

Damian Collins (Folkestone and Hythe) (Con)

I am not sure whether this is directly relevant to the Bill or adjacent to it, but I am an unpaid member of the board of the Centre for Countering Digital Hate, which does a lot of work looking at hate speech in the online world.

Mark Eastwood (Dewsbury) (Con)

Given that one of today’s witnesses is from Prospect, I wish to declare that I am a member of that union.

Stephanie Peacock (Barnsley East) (Lab)

I am a proud member of a trade union. I refer the Committee to my entry in the Register of Members’ Financial Interests.

Chi Onwurah (Newcastle upon Tyne Central) (Lab)

I am a proud member of two trade unions.

Dr Rupa Huq (Ealing Central and Acton) (Lab)

Should we declare our membership of any union?

The Chair

My advice is that it is always better to declare.

Dr Huq

Okay. I am a member of Unison, formerly the National and Local Government Officers Association.

Christian Wakeford (Bury South) (Lab)

I am also a member of a union.

Mike Amesbury (Weaver Vale) (Lab)

I am a member of Unison and the GMB.

The Chair

We will now hear oral evidence from John Edwards, the Information Commissioner, and Paul Arnold, the deputy chief executive and chief operating officer of the Information Commissioner’s Office. I remind all Members that questions should be limited to matters within the scope of the Bill, and that we must stick to the timings in the programme order, which the Committee has agreed. For this panel, we have until 9.55 am. Will the witnesses please introduce themselves for the record?

John Edwards: Kia ora! My name is John Edwards. I am the Information Commissioner. I took up the job at the beginning of January last year. I was previously the Privacy Commissioner of New Zealand for eight years.

Paul Arnold: I am Paul Arnold, the deputy chief executive and chief operating officer of the ICO. I took up that position in 2016.

The Chair

May I gently say to the witnesses that this is a big room, so you will need to project your voices so that we can hear your evidence?

Stephanie Peacock

Q Good morning and welcome. The Bill creates a new body corporate to replace the corporation sole. What impact, both in the short and long term, do you think that will have on its ability to carry out its functions?

John Edwards: The corporation sole model is fit for a number of purposes. That was the structure that I had back home in New Zealand. For an organisation such as the Information Commissioner’s Office, it is starting to buckle under the weight. It will benefit, I think, from the support of a formal board structure, with colleagues with different areas of expertise appointed to ensure that we bring an economy-wide perspective to our role, which, as we have heard from the declarations of interest, spans almost every aspect of human activity.

There will be some short-term transitional challenges as we move from a corporation sole to a board structure. We will need to employ a chief executive, for example, as well as getting used to those structures and setting up our new accountability frameworks. But I think that, in the longer term, the model proposed in the legislation is well proven across other regulators, both domestically and internationally.

Stephanie Peacock

Q I would like to ask about the independence of the ICO as it stands. Do you have any experience of being directed by the Secretary of State in a way that has threatened the regulator’s impartial position?

John Edwards: No, I do not.

Stephanie Peacock

Q If the Bill is passed in its current form, the Secretary of State—whoever that might be—will have the ability to approve and veto statutory codes of practice produced by the commission, as well as to set out a statement of strategic priorities to which the commission will have to adhere. Do you perceive that having any impact on your organisation’s ability to act independently of political direction?

John Edwards: No, I do not believe it will undermine our independence at all. What I think it will do is to further enhance and promote our accountability, which is very important.

To take your first challenge, about codes of conduct, we worked closely with the Department for Digital, Culture, Media and Sport and subsequently the Department for Science, Innovation and Technology to ensure that we got the appropriate balance between the independence of the commission and the right of the Executive and Parliament to oversee what is essentially delegated lawmaking. I think we have got there. It is not a right to veto out of hand; there is a clear process of transparency, which would require the Secretary of State, in the event that he or she decided not to publish a statutory code that we had recommended, to publish their reasons, and those would be available to the House. I do think there is an appropriate level of parliamentary and Executive oversight of what is, as I say, essentially a lawmaking function on the part of the commission.

Stephanie Peacock

Q If the Secretary of State can veto a code of practice that the commission has produced regarding the activities of Government, will that not mean that they are, effectively, marking their own homework?

John Edwards: I do not believe so. The code of practice would be statutory—it is only the most serious statutory guidance that we would issue, not the day-to-day opinions that we have of the way in which the law operates. But, also, it is a reflection of the commissioner’s view of the law, and a statement as to how he or she will interpret and apply the very general principles. A failure of the Secretary of State to table and issue a proposed code would not affect the way in which the commissioner discharges his or her enforcement functions. We would still be able to investigate matters and find them in breach, regardless of whether that finding was consistent with the Secretary of State’s view of the law.

Stephanie Peacock

Q I will come on to a slightly different topic now. The ICO will play a huge role in enforcing the measures in the Bill. Is there enough clarity in the Bill to ensure that the commission is able to do that effectively? For example, are you clear on how the commission will enforce the law surrounding terms like “vexatious” and “excessive” with regards to subject access requests?

John Edwards: Yes. We are in the business of statutory interpretation. We are given a law by Parliament. A term like “vexatious” has a considerable provenance and jurisprudence; it is one that I worked with back home in New Zealand. So, yes, I am quite confident that we will be able to apply those.

Stephanie Peacock

Q Linked to that, what about terms like “meaningful human involvement” and “significant decision” with regards to automated decision making?

John Edwards: Sorry, what is your question?

Stephanie Peacock

Parts of the Bill refer to there being “meaningful human involvement” and “significant decisions” within automated decision making. That might be in an application for a mortgage or in certain parts of employment. Do you feel that you can interpret those words effectively?

John Edwards: Yes, of course. You are quite right to point out that those phrases are capable of numerous different interpretations. It will be incumbent on my office to issue guidance to provide clarity. There are phrases in the legislation that Parliament could perhaps look at providing clearer criteria on to assist us in that process of issuing guidance—here I am particularly thinking of the phrase “high risk” activities. That is a new standard, which will dictate whether some of the measures apply.

Stephanie Peacock

That is useful. Thank you.

Damian Collins

Q Continuing with that theme, the Bill uses a broader definition of “recognised legitimate interests” for data controllers. How do you think the Bill will change the regime for businesses? What sort of things might they argue they should be able to do under the Bill that they cannot do now?

John Edwards: There is an argument that there is nothing under the Bill that they cannot do now, but it does respond to a perception that there is a lack of clarity and certainty about the scope of legitimate interests, and it is a legitimate activity of lawmakers to respond to such perceptions. The provision will allow doubt to be taken out of the economy in respect of aspects such as, “Is maintaining the security of my system a legitimate interest in using this data?” Uncertainty in law is very inefficient—it causes people to seek legal opinions and expend resources away from their primary activity—so the more uncertainty we can take out of the legislation, the greater the efficiency of the regulation. We have a role in that at the Information Commissioner’s Office and you as lawmakers have just as important a role.

Damian Collins

Q How would you define that clarity that the Bill is seeking? If a data controller thinks, “Well, if I have legitimate business interests, I can make an excuse for doing whatever I like,” that surely is not what the Bill intends. How would you define the clarity that you say the Bill seeks?

John Edwards: You are right that it is the controller’s assessment and that they are entitled to make that assessment, but they need to be able to justify and be accountable for it. If we investigate a matter where a legitimate interest is asserted, we would be able to test that.

Damian Collins

Q How would you test it?

John Edwards: Well, through the normal process of investigation, in the same way as we do now. We would ask whether this was in the reasonable contemplation of the individual who has contributed their data as a necessary adjunct to the primary business activity that is being undertaken.

Damian Collins

Q Does this change things very much? It sounds like you are saying that business may assert it has a legitimate interest, but if you think it does not, you can investigate and take action as the law stands currently, effectively.

John Edwards: Yes, that is right. But the clarity will be where specific categories of legitimate interest are specified in the legislation. Again, that will just take out the doubt, if there is doubt as to whether a particular activity falls within scope.

Damian Collins

Q Is more clarity needed about the use of inferred data? Major social media platforms rely on inferred data to drive their recommendation tools and systems. There are then questions about whether inferred data draws on protected data characteristics without user permission. A platform might say that that is part of its recognised legitimate business interests, but users might say that it is an infringement of their data rights. Is that clear enough?

John Edwards: I am afraid that I have to revert to the standard, which is, “It depends.” These are questions that need to be determined on a case-by-case basis after examination ex post. It is a very general question that you ask. It depends on what the inferred data is being used for and what it is. For example, my office has taken regulatory action against a company that inferred health status based on purchasing practices. We found that that was unlawful and a breach of the General Data Protection Regulation, and we issued a fine for the practice. Again, the law is capable of regulating inferred data, and there is no kind of carte blanche for controllers to make assumptions about people based on data points, whether collected from or supplied by the individual or not.

Damian Collins

Q Your predecessor raised the issue of the use of inferred data among users’ protected data characteristics—political opinions, religious beliefs, sexual orientation—and said that, without the user’s informed consent, that could not be legal. Do you agree with that?

John Edwards: I am not aware of the statement she made or the context in which she made it, so it is difficult for me to say whether I agree with it. Certainly, informed consent is not the only lawful basis for a data processing activity, and it may be that data about protected characteristics can be inferred and used in some circumstances. I would be happy to come back to you, having checked that quote, and give you my views on whether I agree with it in the context in which it was made.

Damian Collins

Q These are quite important matters because inferred data is such an important part of data processing for major platforms, be it a company assessing someone’s attitude to risk and how that affects the way they might use a gambling product, versus taking someone’s personal, private information, inferring things from it and making them open to suggestions they may not want to receive without their informed consent. That is a grey area, and I wonder whether you think the Bill provides greater clarity, or you think there needs to be more clarity still.

John Edwards: I think there is sufficient clarity. I am not sure whether the Bill speaks to the point you have just made, but for me the overarching obligation to use data fairly enables us to make assessments about the legitimacy of the kinds of practices you are describing.

The Chair

It is a really tight timetable this morning and we have nine minutes left. The Minister wants to ask some questions and there are three Members from the Opposition. I will call the Minister now. Perhaps you would be kind enough, Minister, to leave time for one question each from our three Members of the Opposition.

The Minister for Data and Digital Infrastructure (Sir John Whittingdale)

Q Thank you, Mr Hollobone. Good morning, Mr Edwards. Both the structure and powers of your office are going to change as a result of the Bill. Do you believe that the existing structure and the absence of the powers you will gain under the Bill have in any way impeded the carrying out of your functions?

John Edwards: The obligation to investigate every complaint does consume quite a lot of our resources. Can I ask my colleague to make a contribution on this point?

Paul Arnold: As the commissioner says, that duty to investigate all complaints can challenge us in terms of where we need to dedicate the majority of our resources.

Picking up the previous question and answer: our role in trying to provide and maximise regulatory certainty means being able to invest as much resource as we can in upstream advice, particularly in novel, complex, finely balanced, context-specific areas. We add far more value if we can provide that support upstream.

The additional statutory objectives being added through the Bill will, overall, be a real asset to our accountability. Any regulator that welcomes independence also needs to welcome accountability: it is the means through which we describe how we think, how we act and the outcomes that we achieve. Those extra statutory objectives will be a real aid to us, and also to Parliament and our stakeholders. They really do crystallise and clarify why we are here and how we will prioritise our efforts and resources.

Sir John Whittingdale

Q In the interests of time, I will ask you one other question. Mr Edwards, you had experience as the New Zealand Privacy Commissioner for some time. New Zealand is one of the countries recognised as having data adequacy by the European Union. Can you give us a view, based on your experience of dealing with the European Union, of whether there is any concern about the Bill that might put at risk the UK’s data adequacy recognition from the EU?

John Edwards: I do not believe there is anything in the Bill that would put at risk the adequacy determination with the European Union. The test the Commission applies is whether the law is essentially equivalent. New Zealand lacks many of the features of the GDPR, as do Israel and Canada, each of which has maintained adequacy status. The importance of an independent regulator is preserved in this legislation. All the essential features of the UK GDPR or the rights that citizens of the European Union enjoy are present in the Bill, so I do not believe that there is a realistic prospect of the Commission reviewing negatively the adequacy determination.

The Chair

It is a brutal cut-off, I am afraid, at 9.55 am. I have no discretion in this matter. It is a quick-fire round now, gentlemen. We need quick questions and quick answers, with one each from Carol Monaghan, Chi Onwurah and Mike Amesbury.

Carol Monaghan (Glasgow North West) (SNP)

Q Clause 40 sets out the criteria by which a data controller can refuse data access requests. Do you think this is appropriate? Are you concerned that it may lead to a situation in which only those who can afford to pay a potential fee will be able to access their data?

John Edwards: Yes and no. Yes, I do believe it is an adequate provision, and no, I do not believe there will be an economic barrier to people accessing their information rights.

Chi Onwurah

Q The Bill’s intent is to reduce burdens on organisations while maintaining high data protection standards. Do you agree that high data protection standards are promoted by well-informed and empowered citizens? What steps do you think the Bill takes to ensure greater information empowerment for citizens?

John Edwards: Yes, I do believe that an empowered citizenry is best placed to enjoy these rights. However, I also believe that the complexity of the modern digital environment creates such an information asymmetry that it is important for strong advocates such as the Information Commissioner’s Office to act as a proxy on behalf of citizenry. I do not believe that we should devolve responsibility to citizens purely to ensure that high standards are set and adhered to in digital industries.

Mike Amesbury

Q Drawing on your expertise, is there anything missing from the Bill that you would have liked to see?

John Edwards: I do not believe so. We have been involved right from the outset. We made a submission on the initial White Paper. We have worked closely with officials. We have said that we want to see the Bill get to a position where I, as Information Commissioner, am able to stand up and say, “I support this legislation.” We have done that, which has meant we have achieved quite significant changes for the benefit of the people of the United Kingdom. It does not mean that we have just accepted what the Government have handed out. We have worked closely together. We have acted as advocates, and I believe that the product before you shows the benefits of that.

The Chair

We have a late entry—the last question will be from Rupa Huq.

Dr Huq

Q When I was on the Criminal Finances Bill Committee, lots was promised, but the National Crime Agency then claimed that it was not financed enough to pursue all the unexplained wealth orders that were promised. Do you think that a beefed-up Information Commission will be sufficiently well resourced to do all the things it is meant to do?

John Edwards: In short, yes. We are having discussions about the funding model with DSIT. We are funded by levies. There are two questions: one is about how those levies are set and where the burden of funding our office lies in the economy, and the second is about the overall quantum. We can always do more with more. If you look at the White Paper on artificial intelligence and the Vallance report, you will see that there is a role for our office to patrol the new boundaries of AI. In order to do that, we will have to be funded appropriately, but I have a good relationship with our sponsor Department and am confident that we will be able to discharge all the responsibilities in the Bill.

The Chair

Gentlemen, thank you very much indeed for your evidence. You can now breathe, relax and enjoy the rest of your day.

Examination of Witnesses

Eduardo Ustaran, Vivienne Artz and Bojana Bellamy gave evidence.

09:53
The Chair

We will now hear oral evidence from Eduardo Ustaran, global co-head of the privacy and cyber-security practice at Hogan Lovells, who is appearing via Zoom; Vivienne Artz OBE, who is in the room; and Bojana Bellamy, president of the Centre for Information Policy Leadership, who is also appearing via Zoom. For this session we have until 10.25 am. Will the witnesses introduce themselves for the record, starting with Vivienne Artz?

Vivienne Artz: Good morning. My name is Vivienne Artz. I am the chair of the International Regulatory Strategy Group data committee. I have more than 25 years’ experience in financial services, including acting as a chief privacy officer, and I now do advisory work across a range of sectors, including in the context of financial crime.

The Chair

Will Eduardo Ustaran please introduce himself? Can you hear us, Mr Ustaran? No. Can you hear us, Bojana Bellamy? No. Okay, we will start with our witness who has been kind enough to join us in the room.

Stephanie Peacock

Q Welcome. Vivienne, would you be in favour of implementing a smart data regime in your industry? If so, why?

Vivienne Artz: Yes, we are interested in implementing a smart data regime because it will allow broader access to data for innovation, particularly in the context of open banking and open finance. It would require access to information, which can often be limited at the moment. There is a lot of concern from businesses around whether or not they can actually access data. Some clarification on what that means, in respect of information that is not necessarily sensitive and can be used for the public good, would be most welcome. Currently, the provisions in the legislation are pretty broad, so it is difficult to see what it will look like, but in theory we are absolutely in favour.

Stephanie Peacock

Q Could you give more detail on who you think would benefit or lose out, and in what ways?

Vivienne Artz: Consumers would absolutely benefit, and that is where our priority needs to be—with individuals. It is an opportunity for them to leverage the opportunities that the data can provide. It will enable innovators to produce more products and services that will help individuals to better understand their financial and personal circumstances, particularly in the context of utility bills and so on. There are a number of positive use cases. There is obviously always the possibility that data can be misused, but I am a great advocate of saying that we need to find the positive use cases and allow business to support society and our consumers to the fullest extent. That is what we need to support.

Stephanie Peacock

Q Brilliant. What are your thoughts on giving the Secretary of State the power to amend data protection legislation further? Do you think it is necessary to future-proof the Bill?

Vivienne Artz: It is necessary to future-proof the Bill. We are seeing such an incredible speed of innovation and change, particularly with regard to generative artificial intelligence. We need to make sure that the legislation remains technology-neutral and can keep up to date with the changes that are currently taking place.

Stephanie Peacock

I have more questions if our other witnesses are with us.

The Chair

We still have not heard definitively whether our other guests can hear us or speak to us, so we are waiting for confirmation from the tech people. In the meantime, I invite the Minister to question Vivienne Artz.

Sir John Whittingdale

Q You have a lot of experience in respect of international data transfers. The European Union has a number of data adequacy agreements around the world, but the process to establish them has been slow. How do you think the Bill will make it easier for us to improve international data agreements? What prospects are there for the UK to establish such agreements, and with which countries?

Vivienne Artz: The Bill provides for the opportunity for the Government to look at a range of issues and to move away from an equivalence approach to one in which we can consider more factors and features. The reality is that if you compare two pieces of legislation, you will always find differences because they come from different cultural backgrounds and different legal regimes. There will always be differences. The approach the UK is taking in the Bill is helpful because it looks at outcomes and broader issues such as the rule of law in different jurisdictions.

What is said on paper is not necessarily what always happens in practice; we need to look at it far more holistically. The legislation gives the Government the opportunity to take that broader and more common-sense view with regard to adequacy and not just do a word-by-word comparison of legislative provisions without actually looking at how the legislation is implemented in that jurisdiction and what other rights can support the outcomes. We can recognise that there is a different legal process and application but ask whether it still achieves the same end. That is what is really important. There is an opportunity not only to move more quickly in this space but to consider jurisdictions that might not be immediately obvious but none the less still offer appropriate safeguards for data.

Sir John Whittingdale

Q Obviously it is already possible for us to undertake international data transfers to countries with which we do not have an adequacy agreement. Can you set out the advantages of having a general adequacy agreement in terms of data transfer and the benefits to the UK economy?

Vivienne Artz: The current process is incredibly cumbersome for businesses and, if I am honest, it provides zero transparency for individuals as well. It tends to be mostly a paperwork exercise—forgive me if that sounds provocative, but putting in place the model clauses is very often an expensive paperwork exercise. At the moment, it is difficult, time-consuming and costly, as the case may be.

The thing with adequacy is that it is achieved at a Government-to-Government level. It is across all sectors and provides certainty for organisations to move forward to share information, sell their goods and services elsewhere and receive those goods and services, and for consumers to access those opportunities as well. Adequacy is certainly the ideal. Whether it is achievable in all jurisdictions I do not know, but I think it is achievable for many jurisdictions to provide confidence for both consumers and businesses on how they can operate.

The Chair

We can see Mr Ustaran and Ms Bellamy and they can hear us, but we cannot hear them, so we will carry on with questioning Vivienne Artz.

Carol Monaghan

Q A number of organisations have expressed concerns about moving to a situation in which we can refuse subject access requests or indeed charge a fee. Do you believe the thresholds in the Bill are appropriate and proportionate?

Vivienne Artz: I do think the thresholds are appropriate and proportionate. In practice, most organisations choose not to charge, because it costs more to process the cheque than the fee brings in. Certainly, some sectors have been subject to very vexatious approaches through claims-management companies and others, where it is a bombarding exercise and it is unclear whether a genuine subject access request is being made in the best interests of consumers, or with their understanding and at their behest.

I am a great supporter of subject access requests—they are a way for individuals to exercise their rights to understand what data is being processed—but as a result of quirks in how we often operate in the UK, they are being used as a pre-litigation investigative tool on the cheap. That is unfortunate, and it has meant that we have had to put in place additional safeguards to ensure that they are used for the purpose for which they were provided: so that individuals can have transparency and clarity about what data is being processed and by whom.

Carol Monaghan

Q Do you think the threshold for something to be considered vexatious or excessive is well understood?

Vivienne Artz: We have heard from the Information Commissioner that they are fairly clear on what that terminology means and it will reflect the existing body of law in practice. I will be perfectly honest: it is not immediately clear to me, but there is certainly a boundary within which that could be determined, and that is something we would rely on the Information Commissioner to provide further guidance on. It is probably also likely to be contextual.

Carol Monaghan

Q How frequently do we expect such requests to be refused off the back of this legislation?

Vivienne Artz: I think it depends on the sector. I come from the financial services sector, so the types of subject access requests we get tend to be specific to us. I think organisations are going to be reluctant to refuse a subject access request because, at the end of the day, an individual can always escalate to the Information Commissioner if they feel they have been unfairly treated. I think organisations understand their responsibility to act in the best interests of the individual at all times.

The Chair

Q Ms Bellamy and Mr Ustaran, we can now hear both of you. Would you be kind enough to introduce yourselves?

Bojana Bellamy: Thank you for inviting me to this hearing. My name is Bojana Bellamy. I lead the Centre for Information Policy Leadership. We are a global data privacy and data policy think-and-do-tank operating out of London, Brussels and Washington, and I have been in the world of data privacy for almost 30 years.

Eduardo Ustaran: Good morning. My name is Eduardo Ustaran. I am a partner at Hogan Lovells, based in London, and I co-lead our global privacy and cyber-security practice, a team of over 100 lawyers who specialise in data protection law all over the world.

The Chair

Thank you. Chi Onwurah and Damian Collins are lined up to ask questions, but I want first to ask the shadow Minister whether she has any further questions, followed by the Minister. Because we have one witness in the room and two online, please will whoever is asking the question indicate whom you are asking it of?

Stephanie Peacock

Q Good morning to our guests joining us via Zoom. Ms Bellamy, in your opinion has it been difficult for businesses to adapt to the EU GDPR? If so, do you think the changes in the Bill will make it easier or harder for businesses to comply with data protection legislation?

Bojana Bellamy: Yes, it has certainly been hard for businesses to comply with the GDPR, in particular small and medium-sized businesses. I think the changes proposed in the Bill will make it easier, because it is more about outcomes-based regulation. It is more about being effective on the ground, as opposed to being prescriptive. The GDPR is quite prescriptive and detailed: it tells you how to do things. In this new world of digital, that is not very helpful, because technology always moves ahead of, and faster than, the rules.

In effect, what we see proposed in the Bill is more flexibility and more onus on organisations in both the public and private sector to deliver accountability and effective protection for people. It does not tell them and prescribe how exactly to do that, yet they are still accountable for the outcomes. From that perspective, it is a step forward. It is a better regime, in my opinion.

Stephanie Peacock

Q Mr Ustaran, what do you perceive the value of EU adequacy to be? What would be the consequences for your businesses and other businesses and the UK market of losing such an agreement?

Eduardo Ustaran: From the point of view of adequacy, it is fundamental to acknowledge that data flows between the UK and the EU, in both directions, are essential for global commerce and for our digital existence. Adequacy is an extremely valuable element of the way in which the current data protection regime works across both the EU and the UK.

It is really important to note at the outset that the changes being proposed to the UK framework are extremely unlikely to affect that adequacy determination by the EU, in the same way that if the EU were to make the same changes to the EU GDPR, the UK would be very unlikely to change the adequacy determination of the EU. It is important to appreciate that these changes do not affect the essence of UK data protection law, and therefore the adequacy that is based on that essence would not be affected.

Stephanie Peacock

Q You have answered my next question—thank you—but I will pose it to the other witnesses, who may have something to add. In the previous session, the Information Commissioner said that he did not think the Bill was a threat to adequacy. That is comforting, but it is not confirmation, because the only people who have the power to decide whether adequacy stands are the European Commission. Do you think any of the measures in the Bill pose a risk to the adequacy agreement?

Bojana Bellamy: I certainly agree that adequacy is a political decision. In many ways—you have seen this with the Northern Ireland protocol—some of these decisions are made for different purposes. I do not believe there are elements of the Bill that would reduce adequacy; if anything, the Bill is very well balanced. Let me give you some examples of where I think the Bill goes beyond the GDPR: certainly, the expectations of accountability on the senior responsible individual, which deliver better oversight of and leadership on privacy; the right to complain to an organisation, and the obligation on organisations to respond to those complaints; and a strong and effective Information Commissioner, who actually has more powers. The regulator is smarter; that, again, is better than the GDPR. There are also the safeguards that exist for scientific research and similar purposes, as well as some other detailed ones.

Yes, you will see, and you have seen in the public debate as well, that there are people who are worried about the erosion of rights, but I do not believe that the exceptions to subject access requests and the other rights we talked about amount to a real erosion. I think the Bill just clarifies what has been the law. Some of the requirements to simplify privacy impact assessments and records of processing will, in fact, deliver better accountability in practice. They are still there; they are just not as prescriptive. The Information Commissioner has strong powers; it is a robust regulator, and I do not believe its independence will be dented by this Bill. To those who think that we are reducing the level of protection, I say that the balance of all the rules will remain essentially equivalent to the EU’s. That is really what is important.

May I say one more thing quickly? We have seen the EU make adequacy decisions for countries such as Japan and Korea, and even for Privacy Shield. Even in those cases, the requirements were not identical: those laws are still different from the GDPR—they do not have the right to portability or the concept of automated decision making—but they were still found to be adequate. That is why I really do not believe that this is a threat. One thing we have to keep absolutely clear and on a par with the EU is Government access to data for national security and intelligence purposes. That is something the EU will be very interested in, to ensure that that is not where the bar drops, but there is no reason to believe it will and there is nothing in the Bill to suggest so.

Vivienne Artz: I concur; I do not think the Bill poses any threat to adequacy with the EU. With regard to the national security issue that Bojana raises, I would also point out that the UN rapporteur noted that the UK has better protections for Government access to data than many EU member states, where the approach is often very political rather than practical and focused on outcomes. There is nothing in this Bill that would jeopardise adequacy with the EU.

The Chair

We have 12 minutes left and two Members are indicating that they wish to ask questions after you, Minister.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Q I will be very quick, Mr Hollobone. Ms Bellamy, you have suggested that in some ways the regime that the Bill puts in place is superior to the existing GDPR, and that it certainly does not risk our adequacy recognition in any way. Given the development of technology and the increasing use of things like AI, to what extent do you think the EU might follow the same sort of path that the Bill sets out, to try to create a more flexible, state-of-the-art regime?

Eduardo Ustaran: That is a very important question to address, because perhaps one of the ways in which we should look at this legislative reform is as a way of seeing how the existing GDPR framework in both the EU and the UK could be made more effective, relevant and modern in dealing with the issues we are facing right now. You refer to artificial intelligence as one of those issues.

The GDPR, in the EU and the UK, is about five years old. It is not a very old piece of legislation, but a number of technological developments have happened in the past five years. More importantly, we have learned how the GDPR operates in practice. This exercise in the UK is in fact very useful, not just for the UK but for the EU and the world at large, because it looks at how to reform elements of an existing law, already in operation, to make it more effective. That does not mean that the law needs to be more onerous or more strict; it can be more effective at the same time as being more pragmatic. This is an important optic in terms of how we look at legislative reform, and not only from the UK’s point of view. The UK can make an effort to make these changes visible outside the United Kingdom, and possibly influence the way in which the EU GDPR evolves in the years to come.

Bojana Bellamy: I agree that we need a more flexible legal regime to enable the responsible use of AI and machine learning technologies. To be very frank with you, I was hoping the Bill would go a little further. I was hoping that there would be, for example, a recognition of the use of data in order to train algorithms to ensure that they are not discriminatory, not biased and function properly. I would have hoped that would be considered as an example of legitimate interests. That is certainly a way in which the Government can go further, because there are possibilities for the Secretary of State to augment those provisions.

We have seen that in the European AI Act, where they are now allowing greater use of data for AI training, precisely in order to ensure that algorithms work properly. We have Dubai's data protection law, and some others are starting to do that. I hope that we have good foundations to ensure further progression of the rules on AI. The rules on automated decision making are certainly better in this Bill than they are in GDPR. They are more realistic; they recognise the fact that we are going to be faced with AI and machine learning taking more and more decisions, of course with the possibility of human intervention.

Again, to those who criticise the rules, I would say it is more important to have these express rights for individuals. We should emphasise, in the way we have done in the Bill, the right to information that there is AI involved, the right to make a representation, the right to contest a decision, and the right to demand human review or human intervention. To me, that is really what empowers individuals and gives them trust that the decisions will be made in a better way. There is no point in prohibiting AI in the way GDPR sort of does. Under GDPR, we are going to have something of a clash between the fact that the world is moving toward greater use of AI and the prohibition in article 22 on automated decision making, which makes it subject to consent or contract. That is really unrealistic. Again, we have chosen a better way.

As a third small detail, I find the rules on research purposes to be smarter. They are rather complicated to read, to be frank, but I look forward to the consolidated, clean version. The fact that technological development counts as research, even when commercial, will enable the organisations that are developing AI to do so in a responsible way that creates the right outcomes for people and does not create harms or risks. To me, that is what matters. That is more important, and that is what is going to be delivered here. We have the exemptions from notices for research and so on, so I feel we will have better conditions for the development of AI in a responsible and trusted way. However, we must not take our eyes off it. We really need to link GDPR with our AI strategy, and ensure that we incentivise organisations to be accountable and responsible when they are developing and deploying AI. That will be a part of the ICO's role as well.

The Chair

Five minutes left. This will be the quick-fire round. I have two Members indicating that they wish to ask questions—Chi Onwurah.

Chi Onwurah

Q Thank you, Mr Hollobone. We have heard that the intent in the Bill is in part to reduce the burden on organisations from data protection. We heard you set out what some of those burdens might be. The organisations affected by this Bill, and the organisations with which you work in different ways, operate in different jurisdictions. I think you, Ms Artz, set out quite well the challenges of having—or trying to have—the same regime in different jurisdictions. If forced to make a choice between following the European Union regime and following a divergent UK regime, what choice would the organisations with which you work make?

The Chair

Please choose one witness.

Chi Onwurah

Mr Ustaran, please.

Eduardo Ustaran: This is a question that many organisations that operate globally face right now. You must understand that data protection law operates all over the world and data flows all over the world, so consistency is really important in order to achieve compliance in an effective way. Therefore, a question—a very valid question—is, “Do I comply with the EU GDPR across the board, including in the UK, or should I make a difference?”

The reality is that when you look at the way in which the UK data protection framework is being amended, it provides a baseline for compliance with both the UK and EU regimes, in the sense that much of what is being introduced could potentially be interpreted as already being the case in the EU, if you apply a more progressive interpretation of EU law. Therefore, I think we should look a little further than simply asking, “If I comply with EU law, will I be all right in the UK?”

Maybe the way to look at it—something I see some organisations exploring—is, “If I were to take the UK interpretation of the GDPR on a wholesale basis, would that allow me to operate across the world, and certainly in the EU, in a more effective and efficient but still compliant way?” This is something that companies will be exploring, and it is not as easy as simply saying, “Well, I will just do EU law across the board.”

Chi Onwurah

Could I—

The Chair

Sorry. It must be one quick question and one quick answer. We must finish at 10.25 am. Damian Collins.

Damian Collins

Q Ms Artz, one of the complaints about the current GDPR regime has been, for example, that oligarchs use it aggressively to target investigative journalists conducting legitimate investigations into their business activities, to bombard them with data access requests. Do you think that the provisions in the Bill around vexatious requests will help in that situation? Do you think that it will make any difference?

Vivienne Artz: I think it will help a little bit in terms of the threshold of “vexatious”. The other piece that will help is the broadening of the provisions around legitimate interests, because now there is an explicit legitimate interest for fraud detection and prevention. At the moment, it is articulated mostly as preventing a crime. I would suggest that it could be broadened in the context of financial crime, which includes anti-money laundering, sanctions screening and related activities, so that firms can actually process data in that way.

Those are two different things: one is processing data around sanctioned individuals and the like in the context of suspicious activities, and the other is a data subject's request to access or remove their data. Even if they make that subject access request, the ability now to balance it against broader obligations where there is a legitimate interest is incredibly helpful.

The Chair

I thank all three witnesses for their time this morning and their extremely informative answers to the questions. Our apologies from Parliament for the tech issues that our two witnesses joining by Zoom had to endure. Thank you very much indeed. We will now move on to our third panel.

Examination of Witnesses

Neil Ross and Chris Combemale gave evidence.

10:24
The Chair

Q Welcome. We will now hear oral evidence from Neil Ross, Associate Director for Policy at techUK, and Chris Combemale—I hope I pronounced that correctly—the Chief Executive Officer of the Data and Marketing Association. Gentlemen, this session, as you have seen from the previous two, has to end no later than 10.50 am. I will be grateful if you could be kind enough, please, to introduce yourselves to the Committee for the record.

Neil Ross: Thank you for having us before the Committee. My name is Neil Ross. I am the Associate Director for Policy at techUK, the trade association that represents the technology sector in the UK. We have 950 companies in our membership.

Chris Combemale: I am Chris Combemale, the CEO of the Data and Marketing Association. I have 40 years' experience as a practitioner in marketing and advertising. I started on the agency side, working with well-known brands, and went on to lead marketing technology businesses, including a first-generation cloud marketing technology company.

The Chair

I apologise for getting your surname pronunciation wrong, Mr Combemale.

Chris Combemale: That’s okay, it happens all the time. It is actually of French heritage, rather than Italian.

Stephanie Peacock

Q Welcome to the witnesses. TechUK’s response to the withdrawn Bill last autumn stated that it

“could go further in seeking the full benefits of data driven innovation”.

Does this amended Bill go further?

Neil Ross: Yes, it does. If we go back to the statement of the Information Commissioner earlier, the most important part of the legislation is to provide increased clarity on how we can use data. I think there were about 3,000 responses to the consultation, and the vast majority—particularly around the scientific research and the legitimate interest provisions—focused on providing that extra level of clarity. What the Government have done is quite clever, in that they have lifted examples from the recitals—recital 157, as well as those related to legitimate interests—to give additional clarity on the face of the Bill, so that we can take a much more innovative approach to data management and use in the UK, while still staying within the broad umbrella of what qualifies us for EU adequacy.

Stephanie Peacock

Q How have your members found adapting to GDPR? Will the Bill make it easier or harder for those that you represent to comply?

Neil Ross: Most tech companies have adapted to GDPR. It is now a common global standard. The Bill makes the compliance burden a little lighter, allows a little more flexibility in interpreting it and will give companies much more certainty when taking decisions about data use.

One really good example is fraud. Online fraud is a massive problem in the UK and the Government have a strategy to deal with it, so having that legitimate interest that focuses on crime prevention—as well as the further processing rights around compliance with the law—means that we can be much more innovative and adaptive in how we share and process data to protect against and prevent fraud. That will be absolutely vital in addressing the shared objective that we all have to reduce online fraud.

Stephanie Peacock

Q On the changes to requirements to report suspicious activity related to unsolicited direct marketing, do the telecoms companies among your members have the technical capability to identify instances of mass unsolicited direct marketing in order to report as required?

Neil Ross: No. That is one area where we think further work is needed in the Bill. I think you are referring to clause 85. When we responded to the consultation, we said that the Government should try to create equivalence between the privacy and electronic communications requirements and the GDPR to give that extra level of flex. By not doing that, and by not setting out specific cases of where telecoms companies have to identify unsolicited calls, the Government are being really unfair in what they are asking them to do. We have had concerns raised by a range of companies, both large and small, that they might not have the technical capability and that they will have to set up new systems to do it. Overall, we think that the Bill makes a bit of a misstep here and that we need to clarify exactly how it will work. TechUK and some of my colleagues will be suggesting to the Committee some legal amendments for how to do that.

Stephanie Peacock

Q On that point, do the telecoms companies feel that they have been consulted properly in the making of the legislation?

Neil Ross: No, not on that clause, but yes in relation to the rest of the legislation.

Stephanie Peacock

Q I was asking about that. Chris, will the changes to cookie rules set out in the Bill benefit, first, the consumer experience and, secondly, your members or businesses?

Chris Combemale: Yes. First, on the consumer experience, I think that we all recognise that the pop-up consent banners for cookies are generally ticked as a matter of course by consumers who really want to go about their business and get to the website that they want to do business on. In a way, it is not genuine consent, because people are not really thinking deeply about it.

In terms of business, a number of the cookies—which are really identifiers that help you understand what people are doing on your website—are used just on a first-party basis by websites, such as e-commerce websites and business-to-business websites, to understand basic operational aspects and statistical measurements, such as how many people visit which pages. Those are websites that do not take any advertising and do not share any data with third parties, so the exemptions in the Bill would generally mean that those types of companies no longer need cookie banners, while posing no risk to customers, because the company uses the cookies purely to understand the behaviour of its own website traffic and its own customers. In that sense, we strongly support the provisions and the exemptions in the Bill.
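To illustrate the mechanism Mr Combemale describes, here is a minimal sketch of a purely first-party analytics cookie; the cookie name, lifetime and endpoint are illustrative assumptions, not anything drawn from the Bill or the evidence.

```ts
// Sketch: a first-party "statistics" cookie of the kind described above.
// The name, lifetime and endpoint are illustrative assumptions.
function setVisitCookie(): void {
  // SameSite=Strict keeps the cookie strictly first-party: the browser
  // never attaches it to cross-site requests, so no third party sees it.
  document.cookie = [
    `visit_id=${crypto.randomUUID()}`,
    "Max-Age=1800", // a 30-minute session window for page-view counting
    "Path=/",
    "SameSite=Strict",
    "Secure",
  ].join("; ");
}

// Record a page view against the site's own origin only.
async function recordPageView(path: string): Promise<void> {
  await fetch("/internal/analytics/page-view", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    credentials: "same-origin", // the visit cookie travels only to this site
    body: JSON.stringify({ path }),
  });
}
```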

Stephanie Peacock

Q Is the technology available to centralise cookies by browser?

Chris Combemale: I think it can be eventually, but we oppose those provisions in the Bill, because they create a market imbalance and hand gatekeeping control to the large companies that manage browser technology, at the expense of media owners and publishers that are paying journalists and investing in content. Above all else, media owners must be able to develop first-party relationships with their audiences and customers to better understand what they need. If anything, we need more control in the hands of the people who invest in creating the content and in paying the journalists who provide those important democratic functions.

Stephanie Peacock

Q Is there a concern that centralising cookies by browser will entrench power in the hands of the larger tech companies that own the browsers?

Chris Combemale: It certainly would give even greater market control to those companies.

Stephanie Peacock

Q Is the risk in centralising cookies by browser that we could confuse liability, for example who is responsible for a breach of cookie regulation?

Chris Combemale: I think it could be. For us, the essential principle is that a business, whether a media owner, e-commerce business or publishing business, should have control of the relationships between its products and services and its customers and prospects. By nature, when you give control to a third party, whether a large tech company or another company, you are getting in between people and the organisations that they want to do business with, and giving control to an intermediary who may not understand that relationship. At the very least, if you register with a website after, for instance, changing your browser setting, that should take precedence over the browser setting: your choice to engage with a particular company should always take precedence over a centralised cookie management system.

Neil Ross: I think that what the Government have done in relation to this is quite clever: they have said that their objective is to have a centralised system in the future, but they have recognised that there are a number of different ongoing legislative and regulatory activities that have a significant bearing on that. I think it was only last week that the Government introduced the Digital Markets, Competition and Consumers Bill, clause 20 of which—on conduct requirements—would play a large role in whether you could set up a centralised system, so there is an element of co-ordinating two different but ongoing regulatory regimes. I think we agree with Chris that the steps on analytical cookies now are good but that we need to have a lot more deep thought about what a centralised system may or may not look like and whether we want to go ahead with it.

Chris Combemale: May I come in on that final point? What makes sense to us is a centralised system for managing opt-outs as opposed to managing consent. As the Data and Marketing Association, we operate the telephone preference service and the mailing preference service, which give consumers the opportunity to opt out from receiving unwanted cold calls or unwanted direct mail. There is already a system in place with digital advertising—an icon that people can use to opt out from the use of personal data for personalising digital ads. I think it makes sense that, if people do not want to receive certain things, they can opt out centrally, but a centralised consent opt-in gives too much control to the intermediaries.
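One way to picture the opt-out model Mr Combemale prefers is the Global Privacy Control signal that some browsers already send, combined with his precedence rule that a choice made directly with a site should win. The sketch below is written under those assumptions and is not anything the Bill prescribes.

```ts
// Sketch: honour a centralised browser opt-out (Global Privacy Control)
// while letting a direct, site-level choice take precedence, as argued
// above. The types and the fallback behaviour are assumptions.
function browserOptOut(): boolean {
  // navigator.globalPrivacyControl is true when the user has set a
  // browser-wide opt-out; it is not yet in the default TypeScript DOM types.
  const nav = navigator as Navigator & { globalPrivacyControl?: boolean };
  return nav.globalPrivacyControl === true;
}

// null means the person has made no choice with this site directly.
function personalisationAllowed(siteLevelChoice: boolean | null): boolean {
  if (siteLevelChoice !== null) {
    return siteLevelChoice; // an explicit choice with the site wins
  }
  return !browserOptOut(); // otherwise fall back to the browser signal
}
```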

Sir John Whittingdale

Q Mr Ross, I know that techUK has been supportive of a number of elements of the Bill, particularly around the opportunities created by the use of smart data. Will you set out your view of the opportunities, and how the Bill will help to attain them?

Neil Ross: Smart data is potentially a very powerful tool for increasing consumer choice, lowering prices and giving people access to a much broader range of services. The smart data provisions that the Government have introduced, as well as the Smart Data Council that they are leading, are really welcome. However, we need to go one step further and start to give people and industries clarity about where the Government will look first—what kind of smart data provisions they might consider and what kind of sectors they might go into. Ultimately, we need to make sure that businesses are well consulted and that there is a strong cost-benefit analysis. We then need to move ahead with the key sectors that we want to push forward on. As with nuisance calls, we will send some suggested text to the Committee to add those bits in, but it is a really welcome step forward.

Sir John Whittingdale

Q Which particular sectors offer the most opportunity?

Neil Ross: I do not want to name specific sectors at this point. We are having a lot of engagement with our members about where we would like to see it first. The transport sector is one area where it has been used in the past and could have a large use in the future, but it is something that we are exploring. We are working directly with the Government through the Smart Data Council to try to identify the initial sectors that we could look at.

Sir John Whittingdale

Q Thank you. Mr Combemale, will you set out some of the obstacles for your organisation, and how you would like the Bill to reduce them?

Chris Combemale: I think the single biggest one that has troubled our members since the implementation of GDPR is the issue around legitimate interest, which was raised by the hon. Member for Folkestone and Hythe. The main issue is that GDPR contains six bases of data processing, which in law are equal. For the data and marketing industry, the primary bases are legitimate interest and consent. For some reason it has become widely accepted through the implementation of GDPR that GDPR requires consent for marketing and for community activities. I am sure that you hear in your constituencies of many community groups that feel that they cannot go about organising local events because they must have consent to communicate. That has never been the intention behind the legislation; in fact, the European Court of Justice has always ruled that any legal interest could be a legitimate interest, including advertising and marketing.

If you look at what we do, which is effectively finding and retaining customers, GDPR says in recital 4 that privacy is a fundamental right, not an absolute right, and must be balanced against other rights, such as the right to conduct a business. You cannot conduct a business without the right to find and retain customers, just as you cannot run a charity without the right to find the donors and volunteers who provide the money and the labour for your good cause. The clarification is really important across a wide range of use cases in the economy, but particularly ours. It was recognised in GDPR in recital 47. What the legislation does is give illustrative examples that are drawn from recitals 47, 48 and 49. They are not new examples; they are simply elevated from the recitals to the main text. It is an illustrative list. Really, any legal interest could be a legitimate interest for the purpose of data processing, subject to necessity and proportionality, which we discussed earlier with the Information Commissioner.

Carol Monaghan

Q We have heard already this morning that a number of words and phrases could have some ambiguity associated with them, such as the word “excessive”, and the Bill allowing certain cookies that are “low risk”. Do you think that the phrase “low risk” is well enough understood?

Chris Combemale: In the sector that I represent, we have a fairly clear understanding of the gradients of risk. As I was saying earlier, many companies do not share data with other companies. They are interested solely in the relationships that they have with their existing customers or prospects. In that sense, all the customer attitudes to privacy research that we do indicates that people are generally comfortable sharing data with companies they trust and do business with regularly.

Carol Monaghan

Q Would that then be the definition of low risk?

Chris Combemale: I would not want to suggest what the legal definition is. To us in direct marketing and in the Data and Marketing Association, existing customer relationships—loyal customers who trust and are sometimes passionate about the brands they interact with—are low risk. Higher risk is when you come to share data with other companies, but again much of that activity and data sharing is essential to creating relevance. With the right protections, it is not a hugely high-risk activity. Then you can move on up, so the higher the degree of automation and the higher the degree of third-party data, the greater the risk, and you have to put in place mitigations accordingly. I am not a lawyer—I am just a poor practitioner—so I cannot define it from a legal point of view, but it is clear in the context of our industry how risk elevates depending on what you are doing.

Carol Monaghan

Q I might come back to that in a second, but I think Neil wanted to add something.

Neil Ross: I was going to say that you can see how Chris has interpreted it through the lens of his industry, but the feedback we have had from our members, who operate across a range of industries, suggests that there is quite a lot of confusion about what that terminology might mean. The rest of the Bill aims to clarify elements of the GDPR and put them on the face of the Bill, but this provision seems to be going in the other direction. It raises concern and confusion.

That is why our approach has always been that you are going to get more clarity by aligning the Privacy and Electronic Communications Regulations 2003 more closely with the GDPR, which has clear legal bases, processes and an understanding of what is high and low risk—a balancing test, and so on—than through this fairly broad and poorly understood term “low risk”. We have concerns about how it will operate across a range of sectors.

Carol Monaghan

Q Chris, you said that you are not a lawyer and cannot define what low risk is, but there will of course have to be some sort of definition. Have we captured that well enough?

Chris Combemale: Coming back to our discussion about legitimate interest and the proportionality balancing test, or legitimate interest impact assessments: when you are planning what to do with your customers, it is a requirement of good marketing—quite apart from the legislation, but also within it—to think about how what you are planning will impact your customers' privacy, and then to mitigate. The important thing is not to say, “There's no risk,” “It is low risk,” or “It is high risk”; it is to understand that the higher the risk, the greater the mitigations that you have to put in place. You may conclude that you should not do something because the risk level is too high. That is what balancing tests do, and decisions and outcomes result from them.
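That tiered reasoning can be caricatured in code; the factors, weights and thresholds below are invented purely for illustration and carry no legal meaning.

```ts
// Sketch of a legitimate-interest balancing exercise as described above:
// the higher the risk, the stronger the required mitigations. Factors,
// weights and thresholds are illustrative assumptions only.
interface ProcessingPlan {
  sharesWithThirdParties: boolean;
  fullyAutomatedDecisions: boolean;
  vulnerableAudience: boolean;
}

type Outcome = "proceed" | "proceed-with-mitigations" | "do-not-proceed";

function balancingTest(plan: ProcessingPlan): Outcome {
  let risk = 0;
  if (plan.sharesWithThirdParties) risk += 2;
  if (plan.fullyAutomatedDecisions) risk += 2;
  if (plan.vulnerableAudience) risk += 1;

  if (risk === 0) return "proceed"; // e.g. mail to existing, loyal customers
  if (risk <= 3) return "proceed-with-mitigations"; // document safeguards
  return "do-not-proceed"; // risk too high to balance away
}
```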

Carol Monaghan

Q The potential difficulty here is that the responsibility is being put on the company. You have described a responsible company that categorises levels of risk and takes action accordingly. Without a clear definition, if it were a less scrupulous company, would there be a grey area?

Chris Combemale: We do a lot of work combating rogue traders, and we provide evidence to cases from our work with the telephone preference service and other activities. Rogue traders—especially those with criminal intent—will generally ignore the legislation regardless of what you do and however clear it is, but I think you are right. An important part of GDPR is that it puts a lot of responsibility on companies to consider their particular activity, their particular customer base and the nature of their audience. Age UK, a charity that has a lot of vulnerable elderly customers, has to have greater protections and put more thought into how it does things than a nightclub marketing to under-30s, who are very technologically literate and digitally conversant.

When we do customer attitudes to privacy studies, we see three broad segmentations—data unconcerned, data pragmatist and data fundamentalist—and they require different treatment. It is incumbent on any company, in a marketing context, to understand who their audience and their customer base is, and design programmes appropriately to build trust and long-term relationships over time. That is an important element of GDPR, from a marketer’s perspective. I should add that it should not take legislation to force marketers to do that.

The Chair

There are five minutes left and there are two Members seeking to ask questions.

Damian Collins

Q With regards to children’s data rights, do you think the Bill will have any implications for the way in which the age-appropriate design code has been implemented by companies working within it at the moment? It is not expressly written into the Bill, but do you expect there to be change?

Neil Ross: No, I do not expect so. Given some of the exemptions for further processing, it might help improve compliance with the law, because compliance with the law in the public interest is then a basis on which you could process data further. It might make it easier for companies to implement the age-appropriate design code.

Damian Collins

Q Can you give any examples of that?

Neil Ross: It just gives additional clarity on when and where you can use data on various grounds. There are a wide range of circumstances that you can run into in implementing the age-appropriate design code, so having more flexibility in the law to know that you can process data to meet a legal objective, or for a public interest, would be helpful. The best example I can give is from the pandemic: the Government were requesting data from telecoms companies and others, and those companies were unsure of the legal basis for sharing that data and processing it further in compliance with a Government or regulator request. The Bill takes significant steps to try and improve that process.

Damian Collins

Q Could you give an example more directly related to children?

Neil Ross: I do not have one to hand, but we could certainly follow up.

Mike Amesbury

Q The Bill enables the commissioner to impose a fine of £1,000. Is that a reasonable deterrent?

Neil Ross: That is in relation to clause 85?

Mike Amesbury

Q For non-compliance.

Neil Ross: We do not think it is particularly appropriate for this scenario, given that the telecoms operators are just informing the ICO about activity that is happening on their service. It is not that they are the bad actors in the first instance; they are having to manage it. Ultimately, the first step is to clarify the aims of clause 85, and then whether the fine is appropriate is a subsequent question.

Mike Amesbury

Q For some companies, £1,000 will be small fry.

Neil Ross: It will vary from company to company. Most companies will always seek to comply with the law. If you feel you need some kind of deterrent, that is something for Parliament to consider. The first step is to make sure that the law is really clear about what companies are being asked to do. At the moment, that is not the situation we are in.

The Chair

There are two minutes left. Chi Onwurah has the last question.

Chi Onwurah

Q Mr Combemale, you set out some of the challenges of having centralised cookie management, and how that would give more power to the browsers. What you did not set out was how we could give more control and power to customers—citizens—over how their data is used. What are you doing to ensure that consumers have more control over how their data is used? You talked about the little thing that you can click to stop our personal data being used—that has been in place for some time now and it is great. If we have the time, Mr Ross, what is your sector doing as well, because the technology should be there to help and empower people?

Chris Combemale: A lot of what our sector does voluntarily—setting aside the legislation—is the creation of what are called permission centres. You will be familiar with them from when you go to a website and it asks about the categories of information or products that you are interested in. That allows consumers to express their interests. Within the legislation there are very clear data notification requirements at the point that data is collected, which oblige companies to ask you what you want done. Whether the basis is consent or legitimate interest, consumers always have the right to opt out.

With marketing, there is an absolute right to ask not to receive marketing of any kind, whether that is email, direct mail or telephone, at any time. Companies have an obligation to follow that. When it comes to marketing, which is my subject matter expertise, consumers are very well protected and do exercise their rights to opt out. They are further protected by central services, for example the telephone preference service. That is a register that companies are required to screen against; 70% or so of households have registered their telephone number there. I think there are a large number of protections in place, both through the legislation and voluntarily.

The Chair

Q Mr Ross, you have 30 seconds.

Neil Ross: There has been a big drive among many tech companies to explain better how they use and handle data. There is a drive within the sector to do that anyway. Some of it has come from legislative and regulatory activity—for example, the Online Safety Bill and other places.

One thing I would say about this legislation is that it does give people more control over data through the privacy management frameworks. By taking a less strict, tick-box approach to data-handling practices, there is the opportunity for core sectors or interest groups such as trade unions to put forward what their ideal data-handling practice should be for a company. As long as that complies with the broad guardrails that the ICO sets out, you can see a range of different handling practices adopted, depending on which sector you are in. That flexibility gives some power back to consumers and other interest groups.

The Chair

Gentlemen, you have been brilliant. Thank you very much indeed for your time this morning. We will now move on to the fourth panel.

Examination of Witnesses

10:50
Dr Jeni Tennison, Anna Thomas and Michael Birtwistle gave evidence.

The Chair

Q We will now hear oral evidence from Dr Jeni Tennison, founder and executive director of Connected by Data; Anna Thomas, co-founder and director at the Institute for the Future of Work; and Michael Birtwistle, associate director of AI law and regulation at the Ada Lovelace Institute. For this session we have until 11.25 am. Will the witnesses, from right to left, please be kind enough to introduce themselves to the Committee for the record?

Dr Tennison: Thank you very much for inviting me here today. My name is Dr Jeni Tennison. I am the executive director of Connected by Data, which is a campaign to give communities a powerful say in decisions about data. Prior to that I was the CEO of the Open Data Institute. I am also the co-chair of the data governance working group in the Global Partnership on Artificial Intelligence.

Anna Thomas: Good morning and thank you for having me. I am Anna Thomas, a founding director of the Institute for the Future of Work, a research and development institute exploring the impact of new technologies on work and working lives. I was formerly an employment barrister at Devereux Chambers. The institute is also the strategic research partner for the all-party parliamentary group on the future of work.

Michael Birtwistle: Good morning. I am Michael Birtwistle, an associate director at the Ada Lovelace Institute, responsible for law and policy. The Ada Lovelace Institute is an independent research institute with a mission to make sure that data and AI work for people and society. I was previously a policy adviser at the Centre for Data Ethics and Innovation.

The Chair

Welcome. Stephanie Peacock will start the questions.

Stephanie Peacock

Q Good morning. To go first to Dr Jeni Tennison, do you think the general public and workers have a good level of trust and understanding in terms of how their data is being used? What does the Bill do, if anything, to help build or improve on that trust and understanding?

Dr Tennison: Surveys and public attitudes polling show that when you ask people about their opinions around the use of data, they have a good understanding about the ways in which it is going wrong, and they have a good understanding about the kinds of protections that they would like to see. The levels of trust are not really there.

A poll from the Open Data Institute, for example, shows that only 30% trust the Government to use data ethically. CDEI has described this as “tenuous trust” and highlighted that about 70% of the public think that the tech sector is insufficiently regulated. I do not think that the Bill addresses those issues of trust very well; in fact, it reduces the power individuals have and also the level of collective representation people can have, particularly in the work context. I think this will diminish trust in the way in which data is used.

Stephanie Peacock

Q Do you believe the Government have consulted the public and data subjects such as workers appropriately during the process of formulating the Bill?

Dr Tennison: Obviously, there was a strong consultation exercise around the data reform Bill, as it was then characterised. However, there are elements of this Bill, in particular the recognised legitimate interests that are listed, that have not had detailed public consultation or scrutiny. There are also not the kinds of provisions that we would like to see on ongoing consultation with the public on specific questions around data processing in the future.

Stephanie Peacock

Q What value do subject access requests hold for citizens, and how will changing the threshold for refusing or charging for a request to “vexatious or excessive” impact citizens' ability to exercise their rights?

Dr Tennison: Subject access requests are an important way in which citizens can work out what is happening within organisations with the data that is being held about them. There are already protections under UK GDPR against vexatious or excessive requests, and strengthening those as the Bill is doing is, I think, going to put off more citizens from making these kinds of requests.

It is worth noting that this is a specific design of the Bill. If you look at the impact assessment, this is where most of the cost to business is being saved; that is being done by refusing subject access requests. So I think we should be suspicious about what that looks like. Where we have been looking at the role of subject access requests in people exercising their rights, it is clear that that is a necessary step, and delays to or refusals of subject access requests would prevent people from exercising their rights.

We think that a better way of reducing subject access requests would be to have publication of things like the risk assessments that organisations have to do when there is high-risk processing—so that there is less suspicion on the part of data subjects and they do not make those requests in the first place.

Stephanie Peacock

Q Thank you. I have a couple of questions for Anna Thomas now. Do the current laws around automated decision making do enough to protect workers and citizens from harm?

Anna Thomas: Referring partly to our work in “Mind the gap” and “The Amazonian Era”, as well as the report by the all-party parliamentary group on the future of work about the use of AI in the workplace, we would say no. The aim of the Bill—to simplify—is very good, but particular protections in the Bill as it stands are eroded somewhat, which is especially problematic in the workplace. The automated decision making provisions that you ask about are really important with regard to the reduction of human involvement. But in addition to that are the need to assess in advance what the risks and impacts are, the requirement for consultation, and access to relevant information. Those are all relevant and overlap with the automated decision making requirement.

Stephanie Peacock

Q Linked to that, do you believe that the safeguards outlined in the Bill—having a right to human review, for example—are enough to protect workers from the potential harm of automated decision making?

Anna Thomas: Not in themselves. There is potential to correct or improve those areas in the course of the Bill's proceedings, so that the opportunities, as well as the risks, of putting this new Bill through Parliament are seized. But no: because of the transformation of work and the extent of the impact, as well as the risks, that new and automated technologies are having across work—not just on access to work, but on terms, conditions, nature, quality and models of work—the safeguards should be moving in the other direction in those areas. There is, I think, increasing cross-party consensus about this.

Stephanie Peacock

Q My final question is to Michael. Do you believe that the current regulation does enough to govern the use of biometric technologies?

Michael Birtwistle: No, we would say that it does not. The Ada Lovelace Institute published a couple of reports last year on the use of biometric data, arguing for a much stronger and more coherent regulatory governance framework for biometric technologies. These are a set of technologies that are incredibly personal. We are used to their being talked about in terms of our faces or fingerprints, but they actually cover a much wider range, involving any measurement to do with the human body, and they can be used in emotional analysis—walking style or gait, your tone of voice or even your typing style. There is also a set of incoming, next-generation AI technologies that rely quite heavily on biometrics, so there is a question about future-proofing the Bill.

We have made two broad proposals. One is to increase the capability of the Information Commissioner’s Office to look specifically at biometrics—for example, to create and maintain a public register of private entities engaging in processing of biometric data, to have a proper complaints procedure, to publish annual reports and so on. There is a set of issues around increasing the capability of our institutions to deal with that.

Then there is a second question about scope. First, the current definition of biometric data focuses on the identifiability of personal data. There are many potentially problematic use cases of biometric data that do not need to know who you are in order to make a decision about you. We think it would be wise, and would future-proof the regulation of this powerful technology, to also include classification or categorisation as a purpose of those biometric technologies.

Damian Collins

Q You make a very interesting point there, Mr Birtwistle. With automated decision making, a lot of that could be done anonymously. The user is just the end product. They are being targeted through systems and do not need to be identified; the systems just need to know what their data profile is like in order to make a decision.

I am interested in the views of the other members of the panel as well. Do you think there needs to be a greater onus on data controllers to make clear to regulators what data they are gathering, how they are processing it and what decisions are being made based on that data, so that, particularly in an automated environment, while there may not be a human looking at every step in the chain, ultimately a human has designed the system and is responsible for how that system is working?

Michael Birtwistle: I think that is a really important point that is going to be very relevant as we read this Bill alongside the AI White Paper proposals. Yes, there is definitely a need for transparency towards regulators, but if we are thinking about automated decision making, you also want a lot of the safeguards and the thinking to be happening within firms on a proactive basis. That is why the provisions for automated decision making within the Bill are so important. We have concerns about whether the more permissive approach to automated decision making in the Bill will actually lead to greater harms occurring: effectively, it turns the making of those automated decisions from a prohibition with exceptions into something that, for anything other than special category data, is permitted with some safeguards—and there are questions around those safeguards too.

Damian Collins

Q On that point, just to be clear, as long as what someone is doing is not clearly and purely illegal, legitimate interest means you can do whatever you want.

Michael Birtwistle: Legitimate interest still has a balancing test within it, so you would not necessarily always be able to show that you had passed that test and then do whatever you want. But certainly, the provisions in the Bill around automated decisions bring legitimate interest into scope as a basis on which it is okay to do automated processing.

Damian Collins

Dr Tennison?

Dr Tennison: On your first point, around the targets of decisions, one of the things that we would really argue for is changing the set of people who have rights around automated decision making to those who are the subject of the decisions, not necessarily those whose data is known for those decisions. In data governance practice, we talk about these people as decision subjects, and we think it is they who should have the rights to be informed when automated decision making is happening, along with other rights of objection and so forth. That is because, in some circumstances, as you said, there might be issues where you do not have information about someone and nevertheless you are making decisions about them, or you have information about a subset of people, which you are then using to make a decision that affects a wider group. In those circumstances, which we can detail more in written evidence, we really need the decision subjects' rights to be exercised, rather than the data subjects' rights—those whose data is known.

On the legitimate interest point you raised, there is this balancing test that Michael talked about, that balances the interests of data subjects as well. We think that there should also be some tests in there that balance public interests, which may be a positive thing for using data, but also may be a negative thing. We know that there are collective harms that arise from the processing of data as well.
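Dr Tennison's distinction between data subjects and decision subjects can be made concrete with a small sketch; the types and field names are illustrative assumptions.

```ts
// Sketch of the distinction drawn above: the people a decision affects
// (decision subjects) need not be the people whose data fed it
// (data subjects). Types and names are illustrative assumptions.
interface AutomatedDecision {
  description: string;
  dataSubjects: Set<string>; // people whose data was processed
  decisionSubjects: Set<string>; // people the decision is about
}

// The argument: notification and objection rights should follow the
// decision subjects, even those who appear nowhere in the data.
function peopleToNotify(decision: AutomatedDecision): Set<string> {
  return decision.decisionSubjects;
}
```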

Damian Collins

Q I just want to make sure I have understood that point correctly. Let us say that someone is a recipient of an advert, not because they have been personally targeted, but because they have been targeted through data-matching tools such as lookalike audiences on Facebook. Would that be the sort of thing you are referring to?

Dr Tennison: Yes, it could be, or because they are using a specific browser, or they are in a particular area based on their IP address, or something like that. There are various ways in which people can be targeted and affected by those decisions. But we are not just talking about targeted advertising; we are talking about automated decisions in the workplace, or automated decisions about energy bills and energy tariffs. There are lots of these decisions being made all the time.

Damian Collins

Q Is the gig economy an example of where the systems are biased towards workers who are always available for jobs, or biased towards people based on their proximity to a particular location for work?

Dr Tennison: Yes. Or they may be subject to things like robo-dismissal, where their performance is assessed and they get dismissed from the job, or they are no longer given jobs in a gig economy situation.

Damian Collins

Q Effectively a form of constructive dismissal.

Dr Tennison: Yes.

The Chair

I can see Anna Thomas chomping at the bit.

Anna Thomas: I would back up what Jeni is saying about group impacts in the workplace context. It is very important that individuals know how systems are used, why and where they have significant effects, and that risks and impacts are ascertained in advance. If it is just individuals and not groups or representatives, it may well not be possible to know, ascertain or respond to impacts in a way that will improve and maximise good outcomes for everybody—at an individual level and a firm level, as well as at a societal level.

I can give a few examples from work. Our research covers people being told about the rates that they should hit in order to keep their job, but not about the factors that are being taken into account. They are simply told that if they do not hit that rate, they will lose their job. Another example is that customer interaction is often not taken into account, because it is not something that can be captured, broken down and assessed in an automated way by an algorithmic system. Similarly, older workers—who are very important at the moment, given that we need to fill vacancies and so on—are feeling that they are being “designed out”.

Our research suggests that if we think about the risks and impacts in advance and take proportionate and reasonable steps to address them, we will get better outcomes and we will get innovation, because innovation should be more than simply the value extraction in the scenarios that I have set out. We will improve productivity as well. There is increasing evidence from machine learning experts, economists and organisational management research that higher levels of involvement will result in better outcomes.

The Chair

Mr Birtwistle?

Michael Birtwistle: I very much agree with my fellow panellists on those points. If you are thinking about concrete ways to improve what is in the Bill, the high level of protection around automated decision making currently sits in article 22B, which looks at decisions using special category data as an input. You could also add in there, looking at the output, decisions that involve high-risk processing—terminology already used throughout the Bill. That would mean that, where automated decision making is used for decisions that involve high-risk processing, you would need meaningful human involvement, explicit consent or substantial public interest.
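That proposed extension can be expressed as a simple gate. The sketch encodes Mr Birtwistle's suggestion rather than the Bill's current text, and the field names are assumptions.

```ts
// Sketch of the proposed safeguard gate: extend the higher-protection
// tier from special category data (the input) to any decision involving
// high-risk processing (the output). Field names are assumptions.
interface DecisionContext {
  usesSpecialCategoryData: boolean; // already in the higher tier
  involvesHighRiskProcessing: boolean; // the proposed addition
  meaningfulHumanInvolvement: boolean;
  explicitConsent: boolean;
  substantialPublicInterest: boolean;
}

function automatedDecisionPermitted(ctx: DecisionContext): boolean {
  const higherTier =
    ctx.usesSpecialCategoryData || ctx.involvesHighRiskProcessing;
  if (!higherTier) {
    return true; // lower tier: permitted, subject to general safeguards
  }
  return (
    ctx.meaningfulHumanInvolvement ||
    ctx.explicitConsent ||
    ctx.substantialPublicInterest
  );
}
```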

Carol Monaghan

Q Jeni, can I come back to you on automated decision making? You have suggested that a requirement to notify people when an automated decision is made about them would be a useful inclusion in the Bill. Do you think enough consideration has been given to that?

Dr Tennison: The main thing that we have been arguing for is that it should be the wider set of decision subjects, rather than data subjects, who get rights relating to notification, or who can have a review. It is really important that there be notification of automated decision making, and as much transparency as possible about the details of it, and the process that an organisation has gone through in making an impact assessment of what that might mean for all individuals, groups and collective interests that might be affected by that automated decision making.

Carol Monaghan

Q We can probably broadly split these decisions into two categories. Decisions are already being made by algorithms online, according to what we are looking at. If I look up a paint colour online, and then start getting adverts for different paint companies, I am not too worried about that. I am more concerned that decisions could be made in the workplace about me, or about energy tariffs, as we have heard. That is more serious. Is there a danger that if we notify individuals of all the automated decisions that are made, it will end up like the cookie scenario—we will just ignore it all?

Dr Tennison: I do not think it is a matter of notifying people about all automated decision making. The Bill suggests limiting that to legally or otherwise significant decisions, so that we have those additional rights only as regards things that will really have an impact on people’s lives.

Carol Monaghan

Q And you are not comfortable that those have been considered properly in the Bill.

Dr Tennison: I am not comfortable that they are directed to the right people.

Carol Monaghan

Q The subject, rather than the decision maker.

Dr Tennison: Yes.

Carol Monaghan

Anna, did you want to come in on that?

Anna Thomas: The last question, about the threshold, is really important, and it tends to suggest that work should have separate consideration, which is happening all over the world. Last week, Canada introduced its automated decision-making directive and extended it to work; we have been working with them on that. Japan has a strategy that deals expressly with work. In the United States there are various examples, including the California Privacy Rights Act, of rules that give work special attention in this context. Our proposal for addressing the issue of the threshold is that you should always notify, assess, and do your best to promote positive impacts and reduce negative ones where the decision making affects access to work, termination, pay, or contractual status or terms—and, for the rest, where there is a significant impact.

Carol Monaghan

Q Is there a danger that automated decisions could impact the Equality Act, if biases are not properly accounted for?

Anna Thomas: Yes, absolutely. In our model, we suggest that the impact assessment should incorporate not just the data protection elements, which we say remain essential, but equality of opportunity and disparity of outcome—for example, equal opportunity to promotion, or access to benefits. That should be incorporated in a model that forefronts and considers impacts on work.

Mike Amesbury

Q Anna, how would you strengthen the Bill? If you were to table an amendment around employees and AI, what would it be?

Anna Thomas: I would advise very clear additional rights, and a duty to notify in advance what, how and why AI is being used where it has these impacts, and where it meets the threshold that I was just asked about. I would also advise having more consultation throughout design, development and deployment, and ongoing monitoring, because AI changes, and there are impacts that we have not thought about or cannot ascertain in advance.

There should also be a separate obligation to conduct an algorithmic impact assessment. The Bill does nudge in that direction, but it says only that there should be an assessment, rather than a data protection impact assessment. We suggest that the opportunity be taken to clarify—at least in the workplace context, though arguably there are lessons more widely—that the assessment ought to cover these fundamental aspects, and impacts at work.

Dr Huq

Q It is good to see the Ada Lovelace Institute represented; she was a pioneering woman computer scientist who lived in my constituency, so it is a bit ironic that the one man here is representing the institute.

Michael Birtwistle: My colleagues could not be here, unfortunately, but they would have been better representatives in that sense.

Dr Huq

I want to touch on the equality issue again. A 2019 UN report on the digital welfare state made the point that algorithms repeat existing biases and entrench inequalities. How do we get around that? There are a lot of issues around trust and people’s rights and protections when it comes to this data. On top of those, there is this issue. Does the legislation address that? How can we overcome it?

Dr Tennison: As I have mentioned, there need to be more points in the Bill where explicit consideration of the public interest, including equality, is written into the sets of considerations that organisations, the ICO and the Secretary of State need to take into account when they are exercising their powers and duties. That includes ensuring that public interest and equality are an explicit part of assessments of high-risk processing. That will help us to make sure that, in the assessment process, organisations are made to look beyond the impacts on individuals and data subjects, and to look at the whole societal and economic impacts—even the environmental impacts—of the processing that they are looking to carry out.

Anna Thomas: I agree. To add to what I said before, it would help to require a technical bias audit as well as a wider equality impact assessment. One idea that you may wish to consider is this: in the same way that the public sector sometimes has an obligation to consider the reduction of wider inequalities, you could have something similar for the private sector. Not a full model requiring that—that may need to be built up over time—but we could, at the very least, require consideration of the desirability of reducing inequalities of opportunity and outcome as part of determining reasonable and proportionate mitigations in the circumstances. That would be easy to do.

Michael Birtwistle: I agree. There is also a question about institutional capability—ensuring that the institutions involved have the capability to react to the use of these technologies as they evolve. Specifically, it would be great to see the ICO asked in the Bill to produce guidance on how the safeguards in article 22C are to be implemented, as that will have a large effect on how automated decision making works in practice and is built into firms. The powers reserved for Ministers around interpreting “meaningful human involvement” and “legal and similarly significant effect” will also have a big impact. It would make more sense for those to sit with the ICO.

Dr Huq

Can I add one yes/no question?

The Chair

Yes.

Dr Huq

Q If we have an already overburdened regulatory framework, and we put AI on top of it, will it just fall through the cracks? Is there a danger that AI gets forgotten?

Michael Birtwistle: Yes, if regulators are not properly empowered.

Anna Thomas: I strongly agree, but they could be properly empowered and resourced, and in some instances given extra powers to interrogate or to redress what they have found. We advised in 2020 that there should be a forum, and we are delighted to see the Digital Regulation Cooperation Forum. It could be given additional resources and additional bite, and we would certainly like to see impacts on work brought to the fore in its activities. The forum would be well placed, for example, to provide dedicated cross-cutting guidance on impacts at work.

Dr Tennison: I agree with the other panellists. The only thing I would add is that I think that the involvement of the public will be absolutely essential for moving trust forward in those circumstances.

The Chair

The last question is from Chi Onwurah.

Chi Onwurah

Q Dr Tennison, could you give an example of the kind of abuse that you are most concerned about taking place if this Bill is passed unchanged, so that we can better understand your concern? And do I have time to ask—

The Chair

You have four minutes.

Chi Onwurah

Great. Ms Thomas, presumably all the automated decisions will be subject to employment law. Would employees have the information they need to appeal decisions and take them to an industrial tribunal?

Dr Tennison: You asked what kind of abuse I am particularly concerned about. I echo some of Anna’s concerns around the work context and what that looks like. We have recently been doing some case studies, which again I can share, and they really bring home the kinds of issues that workers are subject to as automated decision making is rolled out in organisations.

More broadly, though, I am concerned about the gradual erosion of trust in the public sphere when it comes to the use of data by Governments and organisations. In some ways, I am more concerned about this leading to people not adopting technology and opting out of data collection because they are worried about what might happen. That would hold us back from the progress and the good uses of data that I would really like to see.

Michael Birtwistle: I agree with that very much. We need to think about past public concern around GP data sharing, contact tracing and the Ofqual exams algorithm. When people see their data being used in unexpected ways, or in ways that make them feel uncomfortable, they withdraw their consent and support for that use, and we as a society lose the benefits that data-driven technology can bring.

Anna Thomas: Employment law and the other laws in that context certainly help in some areas; for example, there is unfair dismissal protection, and redundancy protection under the information and consultation regulations. However, it is a patchwork, and it is not clear. Clarity is needed for businesses, to reassure people at work that the principles in the AI White Paper ultimately apply to their data, and to promote prosperity and wellbeing as widely as possible.

The Chair

I thank our three witnesses very much indeed; you have all been fantastic. We are very grateful to you for being here. That brings us to the end of our morning session. The Committee will meet again at 2 o’clock, here in the Boothroyd Room, to continue taking oral evidence. We heard from 10 witnesses this morning and will hear from 13 this afternoon.

Ordered, That further consideration be now adjourned. (Steve Double.)

11:23
Adjourned till this day at Two o’clock.

Data Protection and Digital Information (No. 2) Bill (Second sitting)

Committee stage
Wednesday 10th May 2023


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 10 May 2023
The Committee consisted of the following Members:
Chairs: † Mr Philip Hollobone, Ian Paisley
Amesbury, Mike (Weaver Vale) (Lab)
† Bristow, Paul (Peterborough) (Con)
† Clarke, Theo (Stafford) (Con)
† Collins, Damian (Folkestone and Hythe) (Con)
† Double, Steve (Lord Commissioner of His Majesty's Treasury)
† Eastwood, Mark (Dewsbury) (Con)
Henry, Darren (Broxtowe) (Con)
Hunt, Jane (Loughborough) (Con)
† Huq, Dr Rupa (Ealing Central and Acton) (Lab)
† Long Bailey, Rebecca (Salford and Eccles) (Lab)
† Monaghan, Carol (Glasgow North West) (SNP)
† Onwurah, Chi (Newcastle upon Tyne Central) (Lab)
† Peacock, Stephanie (Barnsley East) (Lab)
† Richards, Nicola (West Bromwich East) (Con)
Simmonds, David (Ruislip, Northwood and Pinner) (Con)
† Wakeford, Christian (Bury South) (Lab)
† Whittingdale, Sir John (Minister for Data and Digital Infrastructure)
Huw Yardley, Bradley Albrow, Committee Clerks
† attended the Committee
Witnesses
Tom Schumacher, Chief Privacy Officer, Medtronic
Jonathan Sellors MBE, Legal Counsel and Company Secretary, UK Biobank
Harry Weber-Brown, Chief Engagement Officer, ZILO
Phillip Mind, Director, Digital Technology and Innovation, UK Finance
Keith Rosser, Chair, Better Hiring Institute
Helen Hitching, Deputy Director and Chief Data Officer, National Crime Agency
Aimee Reed, Director of Data, Metropolitan Police
Andrew Pakes, Director of Communications and Research, Prospect
Mary Towers, Policy Officer, TUC
Alexandra Sinclair, Research Fellow, Public Law Project
Ms Laura Irvine, convener of the Privacy Law sub-committee, Law Society of Scotland
Jacob Smith, UK Accountability Team Leader, Rights and Security International
Alex Lawrence-Archer, Solicitor for AWO (a data rights agency)
Public Bill Committee
Wednesday 10 May 2023
(Afternoon)
[Mr Philip Hollobone in the Chair]
Data Protection and Digital Information (No. 2) Bill
Examination of Witnesses
Tom Schumacher and Jonathan Sellors gave evidence.
14:00
The Chair

Welcome back. We are now on to our fifth witness panel and we will hear from Tom Schumacher, chief privacy officer at Medtronic, who has kindly joined via Zoom, and Jonathan Sellors, legal counsel and company secretary at UK Biobank, who is in the room. We have until 2.25 pm for this panel. Could the witnesses please introduce themselves for the record?

Jonathan Sellors: Good afternoon. I am Jonathan Sellors, general counsel of UK Biobank. For those who may not know, we are the largest globally accessible clinical research resource in the world. We comprise 500,000 UK-based participants, and we make de-identified data available to researchers to conduct clinical research in the public interest.

Tom Schumacher: Thank you so much for inviting me. I am Tom Schumacher, and I work for Medtronic as the chief data and privacy counsel. Medtronic is the world’s largest medical device maker, with 90,000 employees around the world and three manufacturing sites in the UK. We are headquartered in Ireland.

The Chair

Thank you both for joining us. Stephanie Peacock.

Stephanie Peacock (Barnsley East) (Lab)

Q 82 Welcome to you both. My first question is to both witnesses. How easy is it currently for service users and care teams to access and share all of their relevant health and care data?

Jonathan Sellors: I am not sure I am the expert on this particular topic, because my experience is more research-based than in IT systems embedded in clinical care.

Tom Schumacher: I am also not intimately familiar with that issue, but I would say that interoperability is absolutely critical. One of the challenges we experience with our technologies—I assume this is also the case for your health providers—is the ability to have high-quality data that means the same thing in different systems. That is a challenge that can be improved, but it is really a data challenge more than a privacy challenge. That is how I see it.

Stephanie Peacock

Q Will the new definition in the Bill of what constitutes scientific research help people in your field to conduct more or better research? If so, what impact would this research have on citizens and healthcare?

Jonathan Sellors: I think it is a thoroughly useful clarification of what constitutes research. It is essentially welcome, because it was not entirely clear under the provisions of the General Data Protection Regulation what the parameters of research were, so this is a helpful clarification.

Tom Schumacher: I completely concur: it is very useful. I would say that a couple of things really stand out. One is that it makes it clear that private industry and other companies can participate in research. That is really important, particularly for a company like Medtronic because, in order to bring our products through to help patients, we need to conduct research, have real-world data and be able to present that to regulators for approval. It will be extremely helpful to have that broader definition.

The other component of the definition that is quite helpful is that it makes it explicit that technology development and other applied research constitutes research. I know there is a lot of administrative churn trying to figure out what constitutes research and what does not, and I think this is a really helpful piece of clarification.

The Minister for Data and Digital Infrastructure (Sir John Whittingdale)

Q Perhaps I could ask you both to elaborate on how the existing definition and the current lack of clarity have impeded you in carrying out the research you would like to do and how this will change as a result of the Bill.

Tom Schumacher: Maybe I can give an example. One of the businesses we purchased is a UK-based business called Digital Surgery. It uses inter-body videos to try to improve the surgery process and create technologies to aid surgeons in prevention and care. One of the challenges has been: to what extent does the use of surgery videos to create artificial intelligence and better outcomes for patients constitute research? Ultimately, it was often the case that a particular site or hospital would agree, but it created a lot of churn, activity and work back and forth to explain exactly what was to be done. I think this will make it much clearer and easier for a hospital to say, “We understand this is an appropriate research use”, and to be in a position to share that data according to all the protections that the GDPR provides around securing and de-identifying the data and so on.

Jonathan Sellors: I think our access test, which we apply to all our 35,000 users, is to ensure they are bona fide researchers conducting health-related research in the public interest. We quite often get asked whether the research they are planning to conduct is legitimate research. For example, a lot of genetic research, rather than being based on a particular hypothesis, is hypothesis-generating—they look at the data first and then decide what they want to investigate. This definition definitely helps clear up quite a few—not major, but minor—confusions that we have. They arise quite regularly, so I think it is a thoroughly helpful development to be able to point to something with this sort of clarity.

Sir John Whittingdale

Q Can you say a little about the extent to which you have been a contributor to the design of the new provisions in the Bill and whether you are happy with the outcome of that?

Jonathan Sellors: The short answer would be yes. I was contacted by NHS England about the wording of some of the consent aspects, some of the research aspects and particularly some of the pseudonymisation aspects, because that is an important area. Most research conducted is essentially on pseudonymised rather than identifiable data. The way it has been worded and clarified is very useful, because it makes an incremental improvement on what is already there in the GDPR. I think it is a good job.

Tom Schumacher: Yes, I would say the same. NHS Transformation and the Department for Culture, Media and Sport, particularly Owen Rowland and Elisabeth Stafford, have been very willing to hear points of view from industry and very proactive in reaching out for our feedback. I feel like the result reflects that good co-ordination.

Damian Collins (Folkestone and Hythe) (Con)

Q Do you think the definition of what public health means in the context of the Bill is clear?

Jonathan Sellors: Yes, I think it is reasonably clear.

Damian Collins

What do you mean by that?

Jonathan Sellors: Like any lawyer, if I were asked to draft something, I would probably always look at it and say I could possibly improve it. However, I would actually look at this and say it is probably good enough.

Damian Collins

Q What do you think it means? What is the scope of it?

Jonathan Sellors: If I may, can I come back to you on that with a written response, when I have given it slightly further consideration? Would that be okay?

Damian Collins

Q Yes. What I would be interested in is that there could be medical research linked to physical ailments. It could also include mental health, which could, in this context, open up quite a wide range of different fields of research for commercial application as well—understanding people’s stimulus response to fear, anxiety and so on, some of which could have medical application and some of which could be purely commercial.

Jonathan Sellors: I think that, with health-related research that is in the public interest, it is relatively straightforward to spot what it is. Most research is going to have some commercial application, because most pharma molecules and medical devices are going to be commercially devised and developed. I do not think that the fact that something has a commercial interest should rule it out in any way; it is just about looking at what the predominant interest is.

Damian Collins

Q I think that is right. I would welcome it if you were able to write to the Committee with some further thoughts on that. My point, I suppose, is that we have a pretty good idea of what we think public health research could be in this context, whether it is for commercial or non-commercial reasons. However, we want to be certain about whether that opens up other channels of research that others may regard as being not about solving public health problems, but just about the commercial exploitation of data.

Jonathan Sellors: Right, thank you. I understand.

Tom Schumacher: I concur with what the previous speaker said. In the medical device industry, we really focus on what is considered more traditional research, which fits well within the refined research definition that the Bill contains.

Damian Collins

Q I have a final question. We have this legislation, and then different tech companies and operating systems have separate guidelines that they work to as well. One of the issues the Government faced with, for instance, the covid vaccine app, was that it had to comply with the operating rules for Google and iOS, regardless of what the Government wanted it to do. Thinking of the work that your organisation has been involved in, are there still significant restrictions that go beyond the legal thresholds because different operating systems set different requirements?

Jonathan Sellors: I do not think I am really the best qualified person to talk about the different Android and Apple operating systems, although we did a lot of covid-related work during the pandemic, which we were not restricted from doing.

Tom Schumacher: I would say that this comes up quite a lot for Medtronic and the broader medtech industry. I would say a couple of things. First, this is an implementation issue more than a Bill issue, but the harmonisation of technical standards is absolutely critical. One of the challenges that we, and I am sure NHS trusts, experience is variability in technical and IT security standards. One of the real opportunities to streamline is to harmonise those standards, so that each trust does not have to decide for itself which international standard and which local standard to use.

I would also say that there is a lot of work globally to try to reach international standards, and the more that there can be consistency in standards, the less bureaucracy there will be and the better the protection will be, particularly for medical device companies. We need to build those standards into our product portfolio and design requirements and have them approved by notified bodies, so it is important that the UK does not create a new and different set of standards but participates in setting great international standards.

Rebecca Long Bailey (Salford and Eccles) (Lab)

Q In relation to medical research, concerns have been raised that the Bill might risk a divergence from current EU adequacy and that that might have quite a significant detrimental impact on collaboration, which often happens across the EU on medical research. Are you concerned about that, and what should the Government do to mitigate it?

Jonathan Sellors: I think that it is absolutely right to be concerned about whether there will be issues with adequacy, but my evaluation, and all the analysis that I have read from third parties, particularly some third-party lawyers, suggests that the Bill does not or should not have any impact on the adequacy decision at all—broadly because it takes the sensible approach of starting from the existing GDPR and then making incremental explanations of what certain things actually mean. There are various provisions of the GDPR—for example, on genetic data and pseudonymisation—that consist of just one sentence. It is quite a complicated topic, so having clarification is thoroughly useful, and I do not think that should have any impact on the adequacy side of it. It is a very important point.

Tom Schumacher: I agree that it is a critical point. I also feel that the real value here is in clarifying what is already permitted in the European GDPR, but doing so in a way that preserves adequacy, streamlines the process and makes it easier for all stakeholders to reach a quick and accurate decision. Adequacy will be critical; I just do not think that the language of the text today affects its ability to be found adequate.

Chi Onwurah (Newcastle upon Tyne Central) (Lab)

Q I know that you are very supportive of the Bill, but I wonder whether you see risks to patients and service users from facilitating a greater sharing of health and care data. Could you each answer that question?

Jonathan Sellors: I think that data sharing, of one sort or another, absolutely underpins medical research. You need to be able to do it internationally as well; it is not purely a UK-centric activity. The key is in making sure that the data that you are using is properly de-identified, so that research can be conducted on patients, participants and resources in a way that does not then link back to their health data and other data.

Chi Onwurah

Q So it has to be de-identified. We will return to that. But you do not see any other risks?

Jonathan Sellors: Let me put it this way: poor-quality research, undertaken in an unfortunate way, is always going to be a problem, but good-quality research, which has proper ethical approval and which is done on data that is suitably managed and collated, is an essential thing to be able to do.

Chi Onwurah

Q I agree with you. Sorry, I did not quite hear what you said—approval by whom?

Jonathan Sellors: Approval by the relevant ethics committee.

Chi Onwurah

Q Right. Is it a requirement of the Bill that the research should have the approval of the relevant ethics committee?

Jonathan Sellors: I do not think that it is a requirement of this Bill, but it is a requirement of pretty much most research that takes place in the UK.

Chi Onwurah

Q But not all research, surely, because the definition of research is something that can

“reasonably be described as scientific”

research. You would see concerns, then, if data was to be shared for research that was carried out outside of ethics committee approvals. I do not want to put words into your mouth, but I am just trying to understand.

Jonathan Sellors: Sure. I think it depends on the nature of the data that you are trying to evaluate. In other words, if you are looking at aggregated or summary datasets, I do not think there is any particular issue, but when you are looking at individual-level data, that has to be suitably de-identified in order for research to be safely conducted.
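
[Editorial illustration, not part of the evidence: pseudonymisation of the kind the witness describes is often implemented with a keyed hash, so that a direct identifier is replaced by a stable token that cannot feasibly be reversed without a secret key held back by the data custodian. A minimal Python sketch, with an invented key and participant identifier:

import hmac
import hashlib

SECRET_KEY = b"held-by-the-data-custodian-only"  # hypothetical key, never released

def pseudonymise(participant_id: str) -> str:
    # The same input always yields the same pseudonym, so records can still
    # be linked for research; reversal is infeasible without SECRET_KEY.
    return hmac.new(SECRET_KEY, participant_id.encode(), hashlib.sha256).hexdigest()[:16]

print(pseudonymise("participant-0001"))  # stable, non-identifying token

As the exchange above makes clear, this protects direct identifiers only; indirect identifiers within the data itself can still allow re-identification.]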

Chi Onwurah

Q On the point of de-identification or pseudonymisation, do you recognise that there have been examples of pseudonymised data being re-identified, and that, particularly given the rise of huge datasets, artificial intelligence and so on, there is a risk of re-identifying pseudonymised data?

Jonathan Sellors: There is always a risk, but I think the way it is expressed in the Bill is actually quite measured. In other words, it takes a reasonable approach to what steps can constitute re-identification. There are certain police-related examples whereby samples are found at crime scenes. The individuals can certainly be identified if they are on the police database, but if they are not on a reference database, it is extremely difficult to re-identify them, other than with millions of pounds-worth of police work. For all practical purposes, the data is de-identified. Saying that something is completely de-identified is quite difficult.

Chi Onwurah

Q Yes, I certainly agree with that—it is almost impossible—but I do think it is possible to re-identify data without spending millions of pounds, especially when it is correlated with other large datasets. Would you recognise that?

Jonathan Sellors: I definitely recognise that. That is one of our principal concerns, but usually the identifiers are the relatively simple ones. In other words, you can re-identify me quite easily from my seven-character postcode, my age and my gender. Obviously, when we release data, we make sure not to do that. Releasing quite a big part of my genetic sequence does not make me re-identifiable.

Chi Onwurah

Currently.

Jonathan Sellors: Currently—I accept that.
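
[Editorial illustration, not part of the evidence: the witness's point that a postcode, age and gender together can single a person out is the standard quasi-identifier observation behind k-anonymity. A minimal Python sketch over an invented toy dataset:

from collections import Counter

# Invented records; any resemblance to real people is coincidental.
records = [
    {"postcode": "NE1 4XD", "age": 58, "gender": "F"},
    {"postcode": "NE1 4XD", "age": 58, "gender": "M"},
    {"postcode": "EC1A 1BB", "age": 41, "gender": "F"},
    {"postcode": "EC1A 1BB", "age": 41, "gender": "F"},
]

def k_anonymity(rows, quasi_identifiers):
    # Smallest group size when rows are grouped by the chosen attributes;
    # k == 1 means at least one row is uniquely identifiable.
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in rows)
    return min(groups.values())

print(k_anonymity(records, ["postcode", "age", "gender"]))  # 1: someone is unique
print(k_anonymity(records, ["postcode"]))                   # 2: no one is unique

Correlating a release with other datasets effectively adds quasi-identifiers, which is why k tends to fall as datasets are combined.]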

Tom Schumacher: I would say a couple of things. It is important to note that the Bill preserves the full array of safeguards in the GDPR around data minimisation, access controls and making sure that you have de-identified the data as far as possible for the purpose you are going to use it for. The concern for our company is that, without some elements of real-world data, we will not be able to eliminate the bias that we see in the system, we will not be able to personalise medicine, and we will not be able to get our products approved, because our regulating bodies are now mandating that the technology we use is tested across the different attributes that are relevant for that technology.

As an example, there are very few pieces of data that we need for our digital surgery business, but we might need gender, weight and age. The Bill will allow customisation to say, “Okay, what are you going to do to make sure that only two or three data scientists see that data? How are you going to house it in a secure, separate environment? How are you going to make sure that you have security controls around it?” I think the Bill allows that flexibility to try to create personalised medicine, but I do not believe that it opens up a new area of re-identification risk, provided that the GDPR safeguards remain.

Chi Onwurah

Q Let me ask a follow-up question. I recognise that your intent in research is ethical—there are ethics committees involved. Given that the definition of scientific research is anything that can reasonably be described as scientific, what is to stop data being shared for the purposes of, for example, justifying anti-covid-vaccination conspiracy theories? Do you recognise that there are purposes that could be described as research but that many people would not want their data to be used for?

Tom Schumacher: In isolation, that would be a risk, but in the full context of the relationship between the data owner and controller and the manufacturer, there would be a process by which you would define the legitimate use to which the data will be put, and that would be documented and recorded in your system. I do not believe that using data for political purposes would constitute research in the way that this Bill contemplates. Certainly the UK ICO is well regarded for providing useful interpretation guidance, and I think that office would be able to issue appropriate guardrails to limit those sorts of abuses.

Jonathan Sellors: