Data Protection and Digital Information (No. 2) Bill Debate

Department: Department for Science, Innovation & Technology

Data Protection and Digital Information (No. 2) Bill

Daniel Zeichner Excerpts
2nd reading
Monday 17th April 2023

Commons Chamber
Daniel Zeichner (Cambridge) (Lab)

My interest in this debate comes from my representing a science and research city, where data, and transferring it, is key, and from my long-term background in information technology. Perhaps as a consequence of both, back in 2018 I was on the Bill Committee that had the interesting task of implementing GDPR, even though, as my hon. Friend the Member for Bristol North West (Darren Jones)—my good friend—pointed out at the time, none of us had the text in front of us. I think he perhaps had special access to it. In those long and complicated discussions, there were times when I was not entirely sure that anyone in the room fully gripped the complexity of the issues.

I recall that my right hon. Friend the Member for Birmingham, Hodge Hill (Liam Byrne) persistently called for a longer-term vision that would meet the fast-changing challenges of the digital world, and Labour Members constantly noted the paucity of resources available to the Information Commissioner’s Office to deal with those challenges, notwithstanding yellow-vested people entering offices. Five years on, I am not sure that much has changed, because the Bill before us is still highly technical and detailed, and once again the key issues of the moment are being dodged.

I was struck by the interesting conversations on the Conservative Benches, which were as much about what was not being tackled by the Bill as about what was—about the really hot issues that my hon. Friend the Member for Manchester Central (Lucy Powell) mentioned in her Front-Bench speech, such as ChatGPT and artificial intelligence. Those are the issues of the moment, and I am afraid that they are not addressed in the Bill. I make the exact point I made five years ago: there is a risk of hard-coding previous prejudice into future decision making. Those are the issues that we should be tackling.

I chair the all-party parliamentary group on data analytics, which is carrying out a timely review of AI governance. I draw Members’ attention to a report made by that group, with the help of my hon. Friend the Member for Bristol North West, called “Trust, Transparency and Technology”. It called for, among other things, a public services licence to operate, and transparent, standardised ethics and rules for public service providers such as universities, police, and health and care services, so that we can try to build the public confidence that we so need. We also called for a tough parliamentary scrutiny Committee, set up like the Public Accounts Committee or the Environmental Audit Committee, to make sure the public are properly protected. That idea still has strong resonance today.

I absolutely admit that none of this is easy, but there are two particular areas that I would like to touch on briefly. One, which has already been raised, is the obvious one of data adequacy. Again, I do not feel that the argument has really moved on that much over the years. Many of the organisations producing briefings for this debate highlight the risks, and back in 2018—as I think the right hon. Member for Maldon (Sir John Whittingdale) pointed out—there were genuine concerns that we would not necessarily achieve an adequacy agreement with the European Union. Frankly, it was always obvious that this was going to be a key point in future trade negotiations with the EU and others, and I am afraid that that is the way it has played out.

It is no surprise that adequacy is often a top issue, because it is so fundamentally important, but that of course means that we are weakened when negotiation turns to other areas. Put crudely, to get the data adequacy agreements we need, we are always going to be trading away something else, and while, in my opinion, the EU is unlikely ever to withhold adequacy at the very end, the truth is that it could. That is a pretty powerful weapon. On the research issues, I would just like to ask the Minister whether, in summing up, he could comment on the concerns that were raised back in 2018 about the uncertainty for the research sector, and whether he is confident that what is proposed now—in my view, it should have been done then—can provide the clarity that is needed.

On a more general note, one of the key Cambridge organisations has pointed out to me that, in its view, it is quite hard to see the point of this Bill for organisations that are operating globally because, as the EU GDPR has extraterritorial effect, they are still going to need to meet those standards for much of what they do. It would simply be too complicated to try to apply different legal regimes to different situations and people. That is the basic problem with divergence: when organisations span multiple jurisdictions, taking back control is frankly meaningless. Effectively, it cedes control to others without having any influence—the worst of all worlds. That organisation also tells me that it has been led to believe by the Government, as I think was echoed in some of the introductory points, that any organisation wishing to carry on applying current legal standards will, by default, meet those in the new Bill. It is sceptical about that claim, and it would like some confirmation, because it rightly wonders how that can be the case when new concepts and requirements are introduced and existing ones amended.

There is much, much more that could be said, has been said and will be said by others, including genuine concerns about the weakening of rights around subject access requests and some of the protections against algorithmic unfairness. Those need to be tested and scrutinised in Committee; frankly, so much of this cannot simply be left to ministerial judgment. Huge amounts of data are now held about all of us, and the suspicion is rightly held that decisions are sometimes made without our knowledge, decisions that can have a direct impact on our lives. I think we can all agree that data used well can be transformative and a power for good, but that absolutely relies on confidence and trust, which in turn requires a strong regulatory framework that engenders that trust. It feels to me that this Bill fails to meet some of those challenges. It needs to be strengthened and improved.