Data Protection and Digital Information (No. 2) Bill

2nd reading
Monday 17th April 2023

Commons Chamber
Damian Collins (Folkestone and Hythe) (Con)

I am delighted to speak in support of this long-awaited Bill. It is a necessary piece of legislation to learn the lessons from GDPR and look at how we can improve the system, both to make it easier for businesses to work with and to give users and citizens the certainty they need about how their data will be processed and used.

In bringing forward new measures, the Bill in no way suggests that we are looking to move away from our data adequacy agreements with the European Union. Around the world, in North America, Europe, Australia and elsewhere in the Far East, we see Governments developing trusted systems for sharing and using data and for allowing businesses to process data across international borders, knowing that, while those systems may not be exactly the same, they work to the same standards and with similar levels of integrity. That is clearly the direction in which the whole world wants to move, and we should play a leading role in it.

I want to talk briefly about an important area of the Bill: striking the balance between data rights and data safety on the one hand, and what the Bill refers to as the “legitimate interest” of a particular business on the other. I should also note that this Bill, while important in its own right, sits alongside other legislation—some of it to be introduced in this Session and some of it already well on its way through the parliamentary process—dealing with other aspects of the digital world. The regulation of data is an aspect of digital regulation; data is in some ways the fuel that powers the digital experience and is relevant to other areas of digital life as well.

To take one example, we have already established and implemented the age-appropriate design code for children, which principally addresses the way data is gathered from children online and used to design services and products that they use. As this Bill goes through its parliamentary stages, it is important that we understand how the age-appropriate design code is applied as part of the new data regime, and that the safeguards set out in that code are guaranteed through the Bill as well.

There has been a lot of debate, as has already been mentioned, about companies such as TikTok. There is a concern that engineers who work for TikTok in China, some of whom may be members of the Chinese Communist party, have access to UK user data that may not be stored in China, but is accessed from China, and are using that data to develop products. There is legitimate concern about oversight of that process and what that data might be used for, particularly in a country such as China.

However, there is also a wider question about data acquisition, because one reason the TikTok app is being withdrawn from Government devices around the world is that it is incredibly data-acquisitive. It does not just analyse how people use TikTok and from that create data profiles of users to determine what content to recommend to them, although that is a fundamental part of the experience of using it; it is also gathering, as other big apps do, data from what people do on other apps on the same device. People may not realise that they have given consent, and it is certainly not informed consent, for companies such as TikTok to access data about what they do on other apps, not just what they do on TikTok itself.

It is a question of having trusted systems for how data can be gathered, and giving users the right to opt out of such data systems more easily. Some users might say, “I’m quite happy for TikTok or Meta to have that data gathered about what I do across a range of services.” Others may say, “No, I only want them to see data about what I do when I am using their particular service, not other people’s.”

The Online Safety Bill is one of the principal ways in which we are seeking to regulate AI now. There is debate within the tech sector; a letter was published recently, co-signed by a number of tech executives including Elon Musk, calling for a six-month pause in the development of AI systems, particularly large language models. That suggests a problem in the near future of very sophisticated data systems that can make decisions faster than a human can analyse them.

People such as Eric Schmidt have raised concerns about AI in defence systems, where an aggressive system could make decisions faster than a human could respond to them, to which we would need an AI system to respond and where there is potentially no human oversight. That is a frightening scenario in which we might want to consider moratoriums and agreements, as we have in other areas of warfare such as the use of chemical weapons, that we will not allow such systems to be developed because they are so difficult to control.

If we look at the application of that sort of technology closer to home and some of the cases most referenced in the Online Safety Bill, for example the tragic death of the teenager Molly Russell, we see that what was driving the behaviour of concern was data gathered about a user to make recommendations to that person that were endangering their life. The Online Safety Bill seeks to regulate that practice by creating codes and responsibilities for businesses, but that behaviour is only possible because of the collection of data and decisions made by the company on how the data is processed.

This is where the Bill also links to the Government’s White Paper on AI, and this is particularly important: there must be an onus on companies to demonstrate that their systems are safe. The onus must not just be on the user to demonstrate that they have somehow suffered as a consequence of a system’s design. Companies should have to demonstrate that they are designing systems with people’s safety and their rights in mind—be that their rights as workers and citizens, or their rights to certain safeguards and protections over how their data is used.

Companies creating datasets should be able to demonstrate to the regulator what data they have gathered, how their systems are being trained on that data and what it is being used for. That should be easy for the regulator to see and, if the regulator has concerns up front, it should be able to raise them with the company. We must try to create that shift, particularly on AI systems, in how systems are tested before they are deployed, with both safety and the principles set out in the legislation in mind.

Kit Malthouse

My hon. Friend makes a strong point about safety being designed in, but a secondary area of concern for many people is discrimination—that is, the more data companies acquire, the greater their ability to discriminate. For example, in an insurance context, we allow companies to discriminate on the basis of experience or behaviour; if someone has had a lot of crashes or speeding fines, we allow discrimination. However, for companies that process large amounts of data, whether making automated decisions or otherwise, no line of acceptability is openly drawn. In the future, datasets may come together that allow extreme levels of discrimination. For example, if data science, psychometrics and genetic data were linked, there is the possibility of significant levels of discrimination in society. Does he think that, as well as safety, we should be emphasising that line in the sand?

--- Later in debate ---
Damian Collins

My right hon. Friend makes an extremely important point. In some ways, we have already seen evidence of that at work: there was a much-talked-about case in which Amazon was using an AI system to aid its recruitment for a particular role. The system noticed that men tended to be hired for that role and therefore largely discarded applications from women, because that was what the data had trained it to do. That was clear discrimination.

There are very big companies that have access to a very large amount of data across a series of different platforms. What sort of decisions or presumptions can they make about people based on that data? On insurance, for example, we would want safeguards in place, and I think that users would want to know that safeguards are in place. What does data analysis of the way in which someone plays a game such as Fortnite—where the company is taking data all the time to create new stimuli and prompts to encourage lengthy play and the spending of money on the game—tell us about someone’s attitude towards risk? Someone who is a risk taker might be a bad risk in the eyes of an insurance company. Someone who plays a video game such as Fortnite a lot and sees their insurance premiums affected as a consequence would think, I am sure, that that is a breach of their data rights and something to which they have not given any informed consent. But who has the right to check? It is very difficult for the user to see. That is why I think the system has to be based on the idea that the onus must rest on the companies to demonstrate that what they are doing is ethical and within the law and the established guidelines, and that it is not for individual users always to demonstrate that they have somehow suffered, go through the onerous process of proving how that has been done, and then seek redress at the end. There has to be more up-front responsibility as well.

Finally, competition is also relevant. We need to guard against a walled garden for data, in which companies that already hold massive amounts of data, such as Google, Amazon and Meta, hang on to what they have, while other companies find it difficult to build up meaningful, working datasets. When I was Chairman of the then Digital, Culture, Media and Sport Committee, we considered the way in which Facebook, as it then was, kicked Vine—a short-form video sharing app—off its platform principally because it thought that that app was collecting too much Facebook user data and was a threat to the company. Facebook decided to deny that particular business access to the Facebook platform. [Interruption.] I see that the Under-Secretary of State for Science, Innovation and Technology, my hon. Friend the Member for Sutton and Cheam (Paul Scully), is nodding in an approving way. I hope that he is saying silently that that is exactly what the Bill will address, to ensure that we do not allow companies with strategic market status to abuse their market power to the detriment of competitive businesses.

Data Protection and Digital Information (No. 2) Bill (First sitting)

Committee stage
Wednesday 10th May 2023

Public Bill Committees
Damian Collins (Folkestone and Hythe) (Con)

I am not sure whether this is directly relevant to the Bill or adjacent to it, but I am an unpaid member of the board of the Centre for Countering Digital Hate, which does a lot of work looking at hate speech in the online world.

Mark Eastwood (Dewsbury) (Con)

Given that one of today’s witnesses is from Prospect, I wish to declare that I am a member of that union.

--- Later in debate ---
Stephanie Peacock

That is useful. Thank you.

Damian Collins

Q Continuing with that theme, the Bill uses a broader definition of “recognised legitimate interests” for data controllers. How do you think the Bill will change the regime for businesses? What sort of things might they argue they should be able to do under the Bill that they cannot do now?

John Edwards: There is an argument that there is nothing under the Bill that they cannot do now, but it does respond to a perception that there is a lack of clarity and certainty about the scope of legitimate interests, and it is a legitimate activity of lawmakers to respond to such perceptions. The provision will allow doubt to be taken out of the economy in respect of aspects such as, “Is maintaining the security of my system a legitimate interest in using this data?” Uncertainty in law is very inefficient—it causes people to seek legal opinions and expend resources away from their primary activity—so the more uncertainty we can take out of the legislation, the greater the efficiency of the regulation. We have a role in that at the Information Commissioner’s Office and you as lawmakers have just as important a role.

Damian Collins

Q How would you define that clarity that the Bill is seeking? If a data controller thinks, “Well, if I have legitimate business interests, I can make an excuse for doing whatever I like,” that surely is not what the Bill intends. How would you define the clarity that you say the Bill seeks?

John Edwards: You are right that it is the controller’s assessment and that they are entitled to make that assessment, but they need to be able to justify and be accountable for it. If we investigate a matter where a legitimate interest is asserted, we would be able to test that.

Damian Collins

Q How would you test it?

John Edwards: Well, through the normal process of investigation, in the same way as we do now. We would ask whether this was in the reasonable contemplation of the individual who has contributed their data as a necessary adjunct to the primary business activity that is being undertaken.

Damian Collins

Q Does this change things very much? It sounds like you are saying that business may assert it has a legitimate interest, but if you think it does not, you can investigate and take action as the law stands currently, effectively.

John Edwards: Yes, that is right. But the clarity will be where specific categories of legitimate interest are specified in the legislation. Again, that will just take out the doubt, if there is doubt as to whether a particular activity falls within scope.

Damian Collins

Q Is more clarity needed about the use of inferred data? Major social media platforms rely on inferred data to drive their recommendation tools and systems. There are then questions about whether inferred data draws on protected data characteristics without user permission. A platform might say that that is part of its recognised legitimate business interests, but users might say that it is an infringement of their data rights. Is that clear enough?

John Edwards: I am afraid that I have to revert to the standard, which is, “It depends.” These are questions that need to be determined on a case-by-case basis after examination ex post. It is a very general question that you ask. It depends on what the inferred data is being used for and what it is. For example, my office has taken regulatory action against a company that inferred health status based on purchasing practices. We found that that was unlawful and a breach of the General Data Protection Regulation, and we issued a fine for the practice. Again, the law is capable of regulating inferred data, and there is no kind of carte blanche for controllers to make assumptions about people based on data points, whether collected from or supplied by the individual or not.

Damian Collins

Q Your predecessor raised the issue of the use of inferred data among users’ protected data characteristics—political opinions, religious beliefs, sexual orientation—and said that, without the user’s informed consent, that could not be legal. Do you agree with that?

John Edwards: I am not aware of the statement she made or the context in which she made it, so it is difficult for me to say whether I agree with it. Certainly, informed consent is not the only lawful basis for a data processing activity, and it may be that data about protected characteristics can be inferred and used in some circumstances. I would be happy to come back to you having checked that quote and to give you my views as to whether I agree with it in the context in which it was made.

Damian Collins

Q These are quite important matters because inferred data is such an important part of data processing for major platforms, be it a company assessing someone’s attitude to risk and how that affects the way they might use a gambling product, versus taking someone’s personal, private information, inferring things from it and making them open to suggestions they may not want to receive without their informed consent. That is a grey area, and I wonder whether you think the Bill provides greater clarity, or you think there needs to be more clarity still.

John Edwards: I think there is sufficient clarity. I am not sure whether the Bill speaks to the point you have just made, but for me the overarching obligation to use data fairly enables us to make assessments about the legitimacy of the kinds of practices you are describing.

The Chair

It is a really tight timetable this morning and we have nine minutes left. The Minister wants to ask some questions and there are three Members from the Opposition. I will call the Minister now. Perhaps you would be kind enough, Minister, to leave time for one question each from our three Members of the Opposition.

--- Later in debate ---
The Chair

Sorry. It must be one quick question and one quick answer. We must finish at 10.25 am. Damian Collins.

Damian Collins

Q Ms Artz, one of the complaints about the current GDPR regime has been, for example, that oligarchs use it aggressively to target investigative journalists conducting legitimate investigations into their business activities, to bombard them with data access requests. Do you think that the provisions in the Bill around vexatious requests will help in that situation? Do you think that it will make any difference?

Vivienne Artz: I think it will help a little bit in terms of the threshold of “vexatious”. The other piece that will help is the broadening of the provisions around legitimate interests, because there is now an explicit legitimate interest for fraud detection and prevention. At the moment, it is articulated mostly as preventing a crime. I would suggest that it could be broadened in the context of financial crime, which encompasses anti-money laundering, sanctions screening and related activities, so that firms can actually process data in that way.

Those are two different things: one is processing data around sanctioned individuals and the like in the context of suspicious activities, and the other is the right of a data subject to access or remove their data. Even if they make that subject access request, the ability now to balance it against broader obligations where there is a legitimate interest is incredibly helpful.

The Chair

I thank all three witnesses for their time this morning and their extremely informative answers to the questions. Our apologies from Parliament for the tech issues that our two Zoom participants had to endure. Thank you very much indeed. We will now move on to our third panel.

Examination of Witnesses

Neil Ross and Chris Combemale gave evidence.

--- Later in debate ---
The Chair

There are five minutes left and there are two Members seeking to ask questions.

Damian Collins

Q With regards to children’s data rights, do you think the Bill will have any implications for the way in which the age-appropriate design code has been implemented by companies working within it at the moment? It is not expressly written into the Bill, but do you expect there to be change?

Neil Ross: No, I do not expect so. Given some of the exemptions for further processing, it might help improve compliance with the law, because compliance with the law in the public interest is then a basis on which you could process data further. It might make it easier for companies to implement the age-appropriate design code.

Damian Collins

Q Can you give any examples of that?

Neil Ross: It just gives additional clarity on when and where you can use data on various grounds. There are a wide range of circumstances that you can run into in implementing the age-appropriate design code, so having more flexibility in the law to know that you can process data to meet a legal objective, or for a public interest, would be helpful. The best example I can give is from the pandemic: the Government were requesting data from telecoms companies and others, and those companies were unsure of the legal basis for sharing that data and processing it further in compliance with a Government or regulator request. The Bill takes significant steps to try to improve that process.

Damian Collins

Q Could you give an example more directly related to children?

Neil Ross: I do not have one to hand, but we could certainly follow up.

Mike Amesbury

Q The Bill enables the commissioner to impose a fine of £1,000. Is that a reasonable deterrent?

Neil Ross: That is in relation to clause 85?

--- Later in debate ---
Damian Collins

Q You make a very interesting point there, Mr Birtwistle. With automated decision making, a lot of that could be done anonymously. The user is just the end product. They are being targeted through systems and do not need to be identified; the systems just need to know what their data profile is like in order to make a decision.

I am interested in the views of the other members of the panel as well. Do you think there needs to be a greater onus on data controllers to make clear to regulators what data they are gathering, how they are processing it and what decisions are being made based on that data, so that, particularly in an automated environment, while there may not be a human looking at every step in the chain, ultimately a human has designed the system and is responsible for how that system is working?

Michael Birtwistle: I think that is a really important point that is going to be very relevant as we read this Bill alongside the AI White Paper provisions that have been provided. Yes, there is definitely a need for transparency towards regulators, but if we are thinking about automated decision making, you also want a lot of the safeguards and the thinking to be happening within the firms on a proactive basis. That is why the provisions for automated decision making within the Bill are so important. We have concerns around whether the more permissive automated decision making approach in the Bill is actually going to lead to greater harms occurring as, effectively, it turns the making of those automated decisions from a sort of prohibition with exceptions into something that, for anything other than special category data, is permitted with some safeguards, which again there are questions around.

Damian Collins

Q On that point, just to be clear, as long as what someone is doing is not clearly and purely illegal, legitimate interest means you can do whatever you want.

Michael Birtwistle: Legitimate interest still has a balancing test within it, so you would not necessarily always pass that test and be able to do whatever you want; but, certainly, the provisions in the Bill around automated decisions bring legitimate interest into scope as something that it is okay to do automated processing around.

Damian Collins

Dr Tennison?

Dr Tennison: On your first point, around the targets of decisions, one of the things that we would really argue for is changing the set of people who have rights around automated decision making to those who are the subjects of the decisions, not necessarily those about whom data is known for those decisions. In data governance practice, we talk about these people as decision subjects, and we think it is they who should have the right to be informed when automated decision making is happening, and other rights of objection and so forth. That is because, in some circumstances, as you said, there might be issues where you do not have information about someone and nevertheless you are making decisions about them, or you have information about a subset of people, which you are then using to make a decision that affects a wider group. In those circumstances, which we can detail more in written evidence, we really need the decision subjects’ rights to be exercised, rather than the data subjects’ rights—those about whom the data is known.

On the legitimate interest point you raised, there is this balancing test that Michael talked about, that balances the interests of data subjects as well. We think that there should also be some tests in there that balance public interests, which may be a positive thing for using data, but also may be a negative thing. We know that there are collective harms that arise from the processing of data as well.

Damian Collins

Q I just want to make sure I have understood that point correctly. Let us say that someone is a recipient of an advert, not because they have been personally targeted, but because they have been targeted through data-matching tools such as lookalike audiences on Facebook. Would that be the sort of thing you are referring to?

Dr Tennison: Yes, it could be, or because they are using a specific browser, they are in a particular area from their IP or something like that. There are various ways in which people can be targeted and affected by those decisions. But we are not just talking about targeted advertising; we are talking about automated decisions in the workplace or automated decisions about energy bills and energy tariffs. There are lots of these decisions being made all the time.

Damian Collins

Q Is the gig economy an example of where the systems are biased towards workers who are always available for jobs, or biased towards people based on their proximity to a particular location for work?

Dr Tennison: Yes. Or they may be subject to things like robo-dismissal, where their performance is assessed and they get dismissed from the job, or they are no longer given jobs in a gig economy situation.

Damian Collins

Q Effectively a form of constructive dismissal.

Dr Tennison: Yes.

The Chair

I can see Anna Thomas chomping at the bit.

Anna Thomas: I would back up what Jeni is saying about group impacts in the workplace context. It is very important that individuals know how systems are used, why and where they have significant effects, and that risks and impacts are ascertained in advance. If it is just individuals and not groups or representatives, it may well not be possible to know, ascertain or respond to impacts in a way that will improve and maximise good outcomes for everybody—at an individual level and a firm level, as well as at a societal level.

I can give a few examples from work. Our research covers people being told about the rates that they should hit in order to keep their job, but not about the factors that are being taken into account. They are simply told that if they do not hit that rate, they will lose their job. Another example is that customer interaction is often not taken into account, because it is not something that can be captured, broken down and assessed in an automated way by an algorithmic system. Similarly, older workers—who are very important at the moment, given that we need to fill vacancies and so on—are feeling that they are being “designed out”.

Our research suggests that if we think about the risks and impacts in advance and we take proportionate and reasonable steps to address them, we will get better outcomes and we will get innovation, because innovation should be more than simply value extraction in the scenarios that I have set out. We will improve productivity as well. There is increasing evidence from machine learning experts, economists and organisational management that higher levels of involvement will result in better outcomes.

Data Protection and Digital Information (No. 2) Bill (Second sitting)

Committee stage
Wednesday 10th May 2023

Public Bill Committees
Sir John Whittingdale

Q Can you say a little about the extent to which you have been a contributor to the design of the new provisions in the Bill and whether you are happy with the outcome of that?

Jonathan Sellors: The short answer would be yes. I was contacted by NHS England about the wording of some of the consent aspects, some of the research aspects and particularly some of the pseudonymisation aspects, because that is an important area: most research conducted is essentially on pseudonymised rather than identifiable data. The way it has been worded and clarified is very useful, because it makes an incremental improvement on what is already there in the GDPR. I think it is a good job.

Tom Schumacher: Yes, I would say the same. NHS Transformation and the Department for Culture, Media and Sport, particularly Owen Rowland and Elisabeth Stafford, have been very willing to hear points of view from industry and very proactive in reaching out for our feedback. I feel like the result reflects that good co-ordination.

Damian Collins (Folkestone and Hythe) (Con)

Q Do you think the definition of what public health means in the context of the Bill is clear?

Jonathan Sellors: Yes, I think it is reasonably clear.

Damian Collins

What do you mean by that?

Jonathan Sellors: Like any lawyer, if I were asked to draft something, I would probably always look at it and say I could possibly improve it. However, I would actually look at this and say it is probably good enough.

Damian Collins

Q What do you think it means? What is the scope of it?

Jonathan Sellors: If I may, can I come back to you on that with a written response, when I have given it slightly further consideration? Would that be okay?

Damian Collins

Q Yes. What I would be interested in is that there could be medical research linked to physical ailments. It could also include mental health, which could, in this context, open up quite a wide range of different fields of research for commercial application as well—understanding people’s stimulus response to fear, anxiety and so on, some of which could have medical application and some of which could be purely commercial.

Jonathan Sellors: I think that, with health-related research that is in the public interest, it is relatively straightforward to spot what it is. Most research is going to have some commercial application, because most pharmaceuticals, molecules and medical devices are going to be commercially devised and developed. I do not think that the fact that something has a commercial interest should rule it out in any way; it is just about looking at what the predominant interest is.

Damian Collins

Q I think that is right. I would welcome it if you were able to write to the Committee with some further thoughts on that. My point, I suppose, is that we have a pretty good idea of what we think public health research could be in this context, whether it is for commercial or non-commercial reasons. However, we want to be certain about whether that opens up other channels of research that others may regard as being not about solving public health problems, but just about the commercial exploitation of data.

Jonathan Sellors: Right, thank you. I understand.

Tom Schumacher: I concur with what the previous speaker said. In the medical device industry, we really focus on what is considered more traditional research, which fits well within the refined research definition that the Bill contains.

Damian Collins

Q I have a final question. We have this legislation, and then different tech companies and operating systems have separate guidelines that they work to as well. One of the issues the Government faced with, for instance, the covid vaccine app was that it had to comply with the operating rules for Google and iOS, regardless of what the Government wanted it to do. Thinking of the work that your organisation has been involved in, are there still significant restrictions that go beyond the legal thresholds because different operating systems set different requirements?

Jonathan Sellors: I do not think I am really the best qualified person to talk about the different Android and Apple operating systems, although we did a lot of covid-related work during the pandemic, which we were not restricted from doing.

Tom Schumacher: I would say that this comes up quite a lot for Medtronic in the broader medtech industry. I would say a couple of things. First, this is an implementation issue more than a Bill issue, but the harmonisation of technical standards is absolutely critical. One of the challenges that we, and I am sure NHS trusts, experience is variability in technical and IT security standards. One of the real opportunities to streamline is to harmonise those standards, so that each trust does not have to decide for itself which international standard to use and which local standard to use.

I would also say that there is a lot of work globally to try to reach international standards, and the more that there can be consistency in standards, the less bureaucracy there will be and the better the protection will be, particularly for medical device companies. We need to build those standards into our product portfolio and design requirements and have them approved by notified bodies, so it is important that the UK does not create a new and different set of standards but participates in setting great international standards.

Rebecca Long Bailey (Salford and Eccles) (Lab)

Q In relation to medical research, concerns have been raised that the Bill might risk a divergence from current EU adequacy and that that might have quite a significant detrimental impact on collaboration, which often happens across the EU on medical research. Are you concerned about that, and what should the Government do to mitigate it?

Jonathan Sellors: I think that it is absolutely right to be concerned about whether there will be issues with adequacy, but my evaluation, and all the analysis that I have read from third parties, particularly some third-party lawyers, suggests that the Bill does not or should not have any impact on the adequacy decision at all—broadly because it takes the sensible approach of taking the existing GDPR and then making incremental explanations of what certain things actually mean. There are various provisions of GDPR—for example, on genetic data and pseudonymisation—that are there in just one sentence. It is quite a complicated topic, so having clarification is thoroughly useful, and I do not think that that should have any impact on the adequacy side of it. I think it is a very important point.

Tom Schumacher: I agree that it is a critical point. I also feel as though the real value here is in clarifying what is already permitted in the European GDPR but doing it in a way that preserves adequacy, streamlines and makes it easier for all stakeholders to reach a quick and accurate decision. I think that adequacy will be critical. I just do not think that the language of the text today impacts the ability of it to be adequate.

--- Later in debate ---
Sir John Whittingdale

Q Moving on to the digital identity provisions, clearly some people are already familiar with this, but there is still a degree of suspicion. To what extent do you think that the consumer needs persuasion about the security and the benefits of digital identity services? Do you see that as being addressed by the provisions in the Bill?

Harry Weber-Brown: That is a very good question. I did quite a lot of consumer research in my previous capacity, and consumers are initially quite sceptical, asking “Why are you asking me for identity details and things?” You have to explain fully why you are doing that. Certainly having Government support and things like the trust framework and a certification regime to make sure that the consumer knows whom they are dealing with when they are passing over sensitive data will help to build the trust to ensure that consumers will utilise this.

The second part to that is what types of services are built on top of the identity system. If I have the identity verified to an AML—anti-money laundering—standard for financial services, I could use it for a whole suite of other types of activity. That could be the purchase of age-restricted products, or sharing data with my independent financial adviser; it could reduce fraud in push payments, and so on. There is a whole suite of different types of services; you would not be using it just for onboarding. I think the Government support of this under digital verification services, part 2 of the Bill, is critical to make sure it happens.

It is opt-in. We are not saying to people that they have to get an identity card, which obviously is not hugely popular; but if we can demonstrate the value of having a digital identity, with support and trust—with the trust framework and certification with Government—we will not necessarily need to run a full marketing campaign to make sure that consumers use this.

Look at other territories—for example, Norway with Vipps, or Sweden’s BankID. I think about 98% of the population now use ID in a digital format; it is very commonplace. It is really a question of looking at the use cases—examples of how the consumer could utilise this—and making sure they receive utility and value from the setting up and the utilisation of the ID. The ID by itself is not necessarily compelling enough; the point is what you can use it for.

Phillip Mind: Trust and acceptance are key issues, and the Bill lays the legislative foundations for that. We already assert our identity digitally when we open accounts, but we do so on a one-off basis. The challenge is to go from doing so on a one-off basis to creating a digital token that is safe and secure and that allows us to reuse that digital identity. For that to work, that token has to be widely accepted, and that is a really complex strategic challenge, but the Bill lays the foundations.

We will transact digitally more and more; that is for sure. At the moment, we have a consultation, from the Treasury and the Bank of England, on a central bank digital currency. Arguably, that would benefit hugely from a reusable digital identity, but we need to be able to create the token in the right way. It could be enabling for people who have access to a smartphone but do not have a passport or driving licence; it could also build inclusion, in terms of identity. So we are very supportive of a reusable digital identity, but it is a big challenge, and the challenge is gaining trust and acceptance.

Damian Collins

Q Mr Weber-Brown, you in particular have spoken about the consumer benefits of data sharing—having a wider choice of products and services. What do you see as the principal business benefits for financial service providers? How wide would you like the scope of their access to data to be?

Harry Weber-Brown: Financial services obviously rely heavily on data to be able to fashion their products accordingly and make them personal, so I think it is critical to have a smart data regime where everything is collected in a single format—what is known as an API, an application programming interface, which is a common way of securely sharing data.
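The single-format, API-based smart data sharing described above can be sketched as follows. This is a purely illustrative sketch: the field names, provider names and values are invented for the example, not drawn from any real scheme.

```python
import json

# Hypothetical smart data exchange: every provider returns account data in
# one agreed JSON shape, so a consumer of the API can validate and use it
# without provider-specific parsing.
def fetch_account_summary(provider_response: str) -> dict:
    """Parse a provider's smart data payload and check the required fields."""
    payload = json.loads(provider_response)
    required = {"provider", "account_id", "balance", "currency"}
    missing = required - payload.keys()
    if missing:
        raise ValueError(f"payload missing fields: {sorted(missing)}")
    return payload

# Two different providers, one common format -- the point of the scheme.
bank_a = '{"provider": "BankA", "account_id": "123", "balance": 250.0, "currency": "GBP"}'
summary = fetch_account_summary(bank_a)
```

Because each payload is validated against the same schema, a second provider's response would be handled by exactly the same code path.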

Some of the other use cases from smart data that would benefit business would be things like sharing data around fact find. For example, if someone wants to instruct an independent financial adviser, could they not use this as a way of speeding up the process, rather than having to wait on letters of authority, which are written and take time? Similarly, with pension providers, if I wanted to move from one pension to another or to consolidate things, could we use the smart data to get an illustration of what impact that might have, so that before I ported it over I could see that?

For big financial services firms—well, for all of them—efficiencies are delivered because, as my colleague said, we are using digital as opposed to having to rely on manual processing. As long as the safeguards are put in place, that spawns a whole array of different types of use case, such as with regulatory reporting. If I need to report things to the regulator, could I use smart data provision to do that? That would benefit businesses. A lot of the financial services industry still relies on reporting on Excel spreadsheets and CSV files, so if we can digitise that, it would certainly make it a much more efficient economy.

Damian Collins

Q Can you understand that there might also be concerns on the consumer side about data profiling consumers based on risk? That would make a lot of sense for financial services. You have described certain financial products, but equally there are people offering loans, mortgages, insurance and things like that who will be very keen to understand more about their customers before pricing their products accordingly.

Phillip Mind: A digital identity gives customers more control. One of the issues that we face at the moment when we present a passport or driving licence is that we cannot minimise the data there. There is a data minimisation opportunity and benefit.

For businesses and customers, too, identity is a key issue when we transact digitally. There are risks around profiling, but there are real opportunities around anti-fraud as well. Being absolutely clear about who we are transacting with and being able to prove incontrovertibly who we are through a safe and secure token will deliver huge benefits to the economy.

Damian Collins

We talked in the previous session about the undoubted benefits, which you have set out clearly. Equally, however, consumers will still want to know what sort of data about them is being used and who has access to it. For example, if a video games maker is profiling the attitudes of players to risk, in order to stimulate them with risk-and-reward opportunities within a game like Fortnite, consumers might understand how that makes their gameplay more interesting. They might consent to that, but they might not necessarily want a financial services provider to have access to that information, because it could create a picture of them that is not flattering.

Harry Weber-Brown: That is a perfectly good challenge. There is a spawning part of the industry around consent dashboards. The idea there is that we put much more control in the hands of the consumer, so that they can see where they have given consent to share data and what data has been shared, while also having the right of revocation and so on. There are technical workarounds to ensure that consumers are much more empowered to control their data. Certainly the legislation supports that, but there will be the technical implementation that sits behind it to ensure that the GDPR is abided by and that the smart data will facilitate better services to consumers. The technology is the answer, but the smart data will open up the opportunity to make sure that the consumer is protected, while with things like consent dashboards they can take better control of where their data is being shared.

Phillip Mind: The interesting thing about digital identity is that it creates a tether. In the future, you will be able to tether digitalised tokens such as securities or deeds to an identity in a safe way, but you could also tether consent to a digital identity, giving a customer or citizen a more holistic view of what they have consented to and where. As Harry says, for those who have real data literacy issues, we will see intermediaries offering services around consent. Those services exist in other jurisdictions.
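A minimal sketch of the consent dashboard idea discussed above, with every grant tethered to a single digital identity and revocable in one place. All class names, fields and recipients here are invented for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical consent dashboard: each consent grant is tethered to one
# digital identity, so the holder can see everything shared and revoke it.
@dataclass
class ConsentRecord:
    recipient: str       # who receives the data
    data_scope: str      # what is shared
    granted_at: datetime
    revoked: bool = False

@dataclass
class ConsentDashboard:
    identity_id: str
    records: list = field(default_factory=list)

    def grant(self, recipient: str, data_scope: str) -> ConsentRecord:
        record = ConsentRecord(recipient, data_scope, datetime.now(timezone.utc))
        self.records.append(record)
        return record

    def revoke(self, recipient: str) -> None:
        for record in self.records:
            if record.recipient == recipient:
                record.revoked = True

    def active(self) -> list:
        """The holistic view: every consent still in force, in one place."""
        return [r for r in self.records if not r.revoked]

dashboard = ConsentDashboard("user-001")
dashboard.grant("IFA Ltd", "pension valuations")
dashboard.grant("LoanCo", "transaction history")
dashboard.revoke("LoanCo")
```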

Damian Collins

I think the Estonian digital ID model works in a very similar way.

Chi Onwurah

Q You have both spoken very passionately, if I may say so, about the importance of citizens being in control of their data, particularly with open banking. We all take very seriously our financial data and the importance of trust and empowerment in these services. Can you say how the Bill will improve trust and control for citizens, or how it should do so?

Harry Weber-Brown: Part 2 of the Bill sets out the trust framework, which was being developed by the then Department for Digital, Culture, Media and Sport and which now comes under the Department for Science, Innovation and Technology. It will give certainty to the marketplace that any firm that wishes to store data—what is commonly known as an identity provider—will have to go through a certification regime. It will have to be certified against a register, which means that as a consumer I will know that I can trust that organisation because it will be following the trust framework and the policies that sit within it. That is critical.
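As a purely illustrative sketch of the certification regime described above: a relying party (or consumer) checks an identity provider against the certified register before passing over any data. The register contents and provider names are invented.

```python
# Hypothetical certification register for the trust framework. In practice
# this would be an official, maintained register; these entries are made up.
CERTIFIED_REGISTER = {
    "TrustID Ltd": {"certified": True, "scheme": "UK trust framework"},
    "CheapVerify": {"certified": False, "scheme": None},
}

def is_certified(provider: str) -> bool:
    """Only share data with a provider that appears, certified, on the register."""
    entry = CERTIFIED_REGISTER.get(provider)
    return bool(entry and entry["certified"])

ok = is_certified("TrustID Ltd")    # certified under the framework
bad = is_certified("CheapVerify")   # on the register but not certified
unknown = is_certified("NoSuchCo")  # not on the register at all
```

The useful property is that an unknown provider and a decertified provider both fail the same check, so the consumer-facing rule stays simple.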

Similarly, if we are setting up schemes with smart data we will need to make sure that the consumer is protected. That will come through in secondary legislation and the devil will be in the detail of the policies underpinning it, in a similar way to open banking and the pensions dashboard.

Further to the previous session, the other thing I would say is that we are talking on behalf of financial services, but parts 2 and 3 of the Bill also refer to other sectors: they apply equally to health, education and so on. If as a consumer I want to take more control of my data, I will want to be able to use it across multiple services and get a much more holistic view not just of my finances, but of my health information and so on.

One area that is particularly developing at the moment is the concept of self-sovereign identity, which enables me as a consumer to control my identity and take the identity provider out of the equation. I do not want to get too technical, but it involves storing my information on a blockchain and sharing my data credentials only when I need to do so—obviously it follows data minimisation. There are evolving schemes that we need to ensure the Bill caters for.
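The data minimisation behind self-sovereign identity, releasing only the credential claims a verifier actually needs, can be sketched as follows. The credential contents and claim names are invented for illustration; real schemes (verifiable credentials, selective disclosure) involve cryptographic proofs that are omitted here.

```python
# Hypothetical credential held in the user's own wallet. The holder, not an
# identity provider, decides which claims to disclose.
FULL_CREDENTIAL = {
    "name": "A. Person",
    "date_of_birth": "1990-05-01",
    "address": "1 Example Street",
    "over_18": True,
}

def present(credential: dict, requested_claims: set) -> dict:
    """Release only the requested claims; everything else stays with the holder."""
    return {k: v for k, v in credential.items() if k in requested_claims}

# An age-restricted retailer needs only proof of age, not the birth date.
presentation = present(FULL_CREDENTIAL, {"over_18"})
```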

--- Later in debate ---
Sir John Whittingdale

Q So would you expect that as a result of the Bill, the bar to obtain certification will be higher?

Keith Rosser: Yes. The requirement on DVSs to tackle fraud should be higher than it currently is.

Damian Collins

Q I want to follow on from the Minister’s questions. Looking at other legislation that is going through Parliament, particularly the anti-fraud provisions in the Online Safety Bill, one of the important areas is the extent to which regulators should expect companies to have good upstream solutions in place to combat fraud. Rather than chasing every example that they come across, they need things that block it in the first place. Do you see the provisions in this Bill as being helpful? Would you expect regulators to act on that and to direct companies to use systems that are known to be safe?

Keith Rosser: Absolutely. I will give a quick example relating to the Online Safety Bill and hiring, which I am talking about. If you look at people getting work online by applying through job boards or platforms, that is an uncertified, unregulated space. Ofcom recently did research, ahead of the Online Safety Bill, that found that 30% of UK adults have experienced employment scams when applying for work online, which has a major impact on access to and participation in the labour market, for many reasons.

Turning the question the other way around, we can also use that example to show that where we do have uncertified spaces, the risks are huge, and we are seeing the evidence of that. Specifically, yes, I would expect the governance body or the certification regime, or both, to really put a requirement on DVSs to do all the things you said—to have better upstream processes and better technology.

Also, I think there is a big missing space, given that we have been live with this in hiring for eight months, to provide better information to the public. At the moment, if I am a member of the public applying for a job and I need to use my digital identity, there is no information for me to look at, unless the employer—the end user—is providing me with something up front. Many do not, so I go through this process without any information about what I am doing. It is a real missed opportunity so far, but now we can right that to make sure that DVSs are providing at least basic information to the public about what to do, what not to do, what questions to ask and where to get help.

Chi Onwurah

Q Thank you very much for your evidence so far. It is going to be informative about the use of digital ID in recruitment. You said earlier that it helps to separate away from geography, which implied that the digital ID did not reference the location or the home address of the person who was being ID’d. What does the digital ID ID? Part of the reason behind that question is this: is it simply providing identification, or could it also be used as part of the triage process? Can that be done algorithmically, with some of the dangers that we see in algorithmic, automated decision making?

Keith Rosser: Those are several really good questions. I will use an example about location from the other perspective, first of all. At the moment, Home Office policy has not caught up with digital identity, and we are addressing that. There is a real opportunity to right that. It means that one in five work seekers right now cannot use digital identity to get a job, because they do not have an in-date British or Irish passport. If you have a visa or an in-date British or Irish passport, that is fine, but if you are among the one in five people in the country who do not have an in-date passport, you cannot. Those people have to visit the premises of the employer face to face to show their documents, or post their original documents across the UK.

This has really created a second-class work seeker. There are real dangers here, such as that an employer might decide to choose person one because they can hire them a week faster than person two. There is a real issue about this location problem. Digital identity could sever location to allow people more opportunities to work remotely across the UK.

There were really good questions about other information. The Bill has a provision for other data sharing. Again, there is the potential and the opportunity here to make UK hiring the fastest globally by linking other datasets such as HMRC payroll data. Rather than looking at a CV and wondering whether the person really worked in those places, the HMRC data could just confirm that they were employed by those companies.
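The kind of payroll cross-check described above could, very roughly, look like this. The records and identifiers are invented stand-ins; this is not real HMRC data or any real API, just the shape of the idea.

```python
# Hypothetical payroll dataset: employer -> set of payroll reference IDs.
# Stands in for the HMRC data linking discussed above.
PAYROLL_RECORDS = {
    "Acme Ltd": {"AB123"},
    "Widgets plc": {"AB123", "CD456"},
}

def verify_employment(candidate_id: str, claimed_employers: list) -> dict:
    """For each employer claimed on a CV, say whether payroll data confirms it."""
    return {
        employer: candidate_id in PAYROLL_RECORDS.get(employer, set())
        for employer in claimed_employers
    }

# Two claims confirmed, one employer with no matching payroll record.
result = verify_employment("AB123", ["Acme Ltd", "Widgets plc", "Globex"])
```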

There is a real opportunity to speed up the verification but, as I want to acknowledge and as you have referred to, there is certainly also a risk. Part of our mission is to make UK hiring fairer, not just faster and safer. I want to caution against going to a degree of artificial intelligence algorithmic-based hiring, where someone is not actually ever in front of a human, whether by Teams video or in person, and a robot is basically assessing their suitability for a job. We have those risks and would have them anyway without this Bill. It is really important as we go forward that we make sure we build in provisions somewhere to ensure that hiring remains a human-on-human activity in some respects, not a completely AI-based process.

--- Later in debate ---
Sir John Whittingdale

Q On a separate issue, at the moment we have a range of bodies responsible for different aspects of surveillance, such as the Biometrics Commissioner, the Investigatory Powers Commissioner and the Surveillance Camera Commissioner. Those are being brought together into either the Information Commissioner or the Investigatory Powers Commissioner. To what extent do you think that will improve the overall oversight of surveillance?

Aimee Reed: Policing thinks that that will significantly simplify things. It will not reduce the level of oversight and scrutiny that will be placed upon us, which is the right thing to do. In terms of the simplicity of that and the regimes that we are under, we are very supportive of that change.

Helen Hitching: Likewise, we are supportive and welcome the simplification. We do note, however, that the Biometrics Commissioner currently has a keen focus on developing technology in a legal manner and consults with the public. We would ask that there remains a focus on that oversight of biometrics, to assure the public that that work remains a priority once the regulation of biometrics transfers to the Information Commissioner’s Office and to make sure that that focus is retained.

Damian Collins

Q How easy do you find it to gather data as part of investigations at the moment, particularly if you are working with companies that provide services to individuals? Do you think the provisions in the Bill will make that any easier?

Aimee Reed: On balance, it will make things easier. We are retaining the very different sections of the Act under which different organisations operate, and the sections that look to improve joint working across part 3 and part 4 agencies are very welcome. At the moment that is not about simplifying the relationships between those in, say, part 2 and part 3, albeit data sharing is entirely possible. In essence, it is going to get simpler and easier to share data, but without losing any of the safeguards.

Damian Collins

Q In terms of criminal investigations, practically how easy is it to get hold of data and information that you consider to be important, particularly if it is from private companies?

Aimee Reed: It is not as easy as we would like it to be, and provision is not made in the Bill to make that easier. There are some discussions about it going into the Online Safety Bill and other areas. It could be easier. We would push harder in the future, but at the moment, getting parity across the other areas and around national security is a focus that we welcome.

Helen Hitching: I want to pick up on the point that safeguards are not being reduced. For the agency, it is key to note that our safeguards are not being lowered because of this.

Mark Eastwood (Dewsbury) (Con)

Q I have been on the parliamentary police and fire service scheme, so I have spent a lot of time with the police. One of the big frustrations from the police’s point of view is the lack of free flow of information, particularly when it concerns charging decisions, along with redaction, which potentially causes some antagonism between the two. I know this is not strictly covered in the Bill, but would it be beneficial to both parties if you were able to share unredacted information before a charging decision is made?

Aimee Reed: I will answer that in respect of where we are now in national policing. It would be of considerable benefit if the guidance was clearer that we could share information without having to redact it, certainly pre-charge, to enable better and easier charging decisions—to be honest—within the Crown Prosecution Service. It would also reduce the current burden on officers: you can think about the volume of data they have to hand over, and it can be video, audio, transcripts—it is not just witness statements, as it used to be 20 or 30 years ago. Reducing that burden would be significant for frontline officers and unleash them to be able to do other things.

--- Later in debate ---
Sir John Whittingdale

Q May I ask a relatively simple question? Obviously your concern is the protection of workers’ rights, and safeguards against discrimination and other potential adverse consequences of technology. We will debate the provisions of the Bill in those areas in the coming weeks—I suspect at some length—but would you nevertheless accept that the overall impact of the legislation, if we get this right, will be beneficial to your members in terms of the promotion of growth and potential future job opportunities?

Andrew Pakes: “If we get this right” is doing a lot of heavy lifting there; I will leave it to Members to decide the balance. That should be the goal. There is a wonderful phrase from the Swedish trade union movement that I have cited before: “Workers should not be scared of the new machines; they should be scared of the old ones.” There are no jobs, there is no prosperity and there is no future for the kind of society that our members want Britain to be that does not involve innovation and the use of new technology.

The speed at which technology is now changing, and the power of this technology compared with previous periods of economic change, make us believe that there has to be a good, robust discussion about the checks and balances in the process. We have seen in wider society—whether through A-level results, the Post Office or other things—that the detriment to the individuals affected is significant if legislators get that balance wrong. I agree with the big principle and I will leave you to debate it, but we would certainly urge that the checks and balances be exactly that: balanced, not one-sided.

Mary Towers: Why does respect for fundamental rights have to be in direct conflict with growth and innovation? There is not necessarily any conflict there. Indeed, in a workplace where people are respected, have dignity at work and are working in a healthy way, that can only be beneficial for productivity and growth.

Damian Collins

Q I have been listening carefully to what you have been saying, and it strikes me that there are two issues: the use of technology in the general workplace, and the rights of workers who work through technology to do their jobs. In the workplace itself, data gathering and analysis have always existed to some extent. If we were having this conversation in the 1960s, we would have been talking about time-and-motion studies of people in factories to work out what efficiency looked like. Is your concern, in respect of a general working environment, that employers should be transparent about what sort of data they gather and how they use it?

Andrew Pakes: That is the first base. The power of technology is changing so quickly, and the informal conversations we have every day with employers suggest that many of them are wrestling with the same questions that we are. If we get this legislation right, it is a win-win when it comes to the question of how we introduce technology in workspaces.

You are right to identify the changing nature of work. We would also identify people analytics, or the use of digital technology to manage people. How we get that right is about the balance: how do you do it without micromanaging, without invading privacy, without using technology to make decisions without—this is a horrible phrase, but it is essentially about accountability—humans in the loop? Good legislation in this area should promote innovation, but it should also have due regard to balancing how you manage risks and reduce harms. That is the element that we want to make sure comes through in the legislation in its final form.

Damian Collins

Q So you do not have an in-principle objection to the use of technology to monitor the efficiency, output and performance of employees within a working environment, but you think it needs to be based on agreed criteria—that employers need to be transparent about how they are gathering data and what they are using it for.

Andrew Pakes: Absolutely. Let me give you a quick example of one piece of technology that we have negotiated in some areas: GPS tracking. It might be old technology, compared with many things that you are looking at. We represent frontline workers who often work alone, outside, or in spaces where their work could be risky. If those people cannot answer their radio or phone, it is in the legitimate interests of all of us to see where they are, in case they have had an accident or are in a dangerous situation. We can see a purpose to that technology. In negotiation with employers, we have often said, “This is good technology for keeping people safe, but we are not happy with it being used in performance reviews.” We are not happy with people saying, “I am sorry, Mr Collins, but you seem to spend a lot of time in the same café each lunch time.”

The issue is not the technology, but its application. Technology that is used to increase safety is very good, but the risk is that it will be used to performance-manage people; employers may say, “You are not doing enough visits,” “You aren’t working fast enough,” or, “You don’t drive fast enough between jobs.” We need balance and control, as opposed to ruling out technology that can keep people safe and well.
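The negotiated position described above, location readable for safety but not for performance management, amounts to a purpose-limitation check. A sketch, with invented purposes and coordinates:

```python
# Hypothetical purpose-limitation gate on GPS data: the same data store may
# be read for worker safety but not for performance management, mirroring
# the negotiated position described above.
ALLOWED_PURPOSES = {"safety_check", "emergency_response"}

def read_location(purpose: str, last_known: tuple) -> tuple:
    """Return a worker's last known position only for an allowed purpose."""
    if purpose not in ALLOWED_PURPOSES:
        raise PermissionError(f"GPS access refused for purpose: {purpose}")
    return last_known

# A lone worker has not answered the radio: safety access is permitted.
position = read_location("safety_check", (51.08, 1.18))

# The same request for performance review is refused at the gate.
try:
    read_location("performance_review", (51.08, 1.18))
    refused = False
except PermissionError:
    refused = True
```

The point of the design is that the restriction sits in front of the data, not in a policy document alone, so every read carries a declared purpose that can be logged and audited.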

Damian Collins

Q For some people, their job is done through technology. Take a gig economy worker working for a delivery company. Do you have concerns about how app developers design their systems and their relationship to the worker? For example, you may work for a company that does not pay you for your waiting time. You are not working contracted hours; you are working in the gig economy, on a “turn up and get paid” basis. The system may have been designed to favour people who are always on the app and always ready for work, even if they are not being paid for that, over people who log on only at particular times. The app developer may not be very transparent about that, because they do not want to be named and shamed for treating their workers that way. Good and bad employers would say that there are people working to different standards, but do you feel that there is still a lack of transparency in the gig economy about how different apps process and use data, and the impact that has on the day-to-day working life of the people who use those apps?

Andrew Pakes: From my perspective, yes.

Mary Towers: The TUC has red lines relating to the use of these types of technologies. One is that we simply should not have technologies at work that are not transparent and that operate in a way that people do not understand. The principle of explainability is really important to us. People need to understand when the technologies are operating, and how they operate in relation to them. On top of that, it is absolutely vital that discriminatory data processing does not take place. The example that you gave from the gig economy is potentially of a discriminatory pay calculation—of an algorithm that might be calculating different rates of pay for individuals who are carrying out exactly the same work. The algorithm is potentially replicating existing inequalities in pay that are rooted in gender or race.

Damian Collins

Q The issue is not different rates of pay per task, but the amount of paid work that someone might get within a period.

Mary Towers: Yes. Drivers are a good example. People drive a certain distance to pick people up or deliver items. Even when the driving time is exactly the same, people may be paid different rates, because the algorithm will have worked out how long certain groups of people are likely to wait before they accept a gig, for example. I emphasise that, in our view, those sorts of issues are not restricted to the gig economy; they spread way beyond it, into what one might consider to be the far more traditional professions. That is where our red lines are. They relate to transparency, explainability, non-discrimination and, critically, worker and union involvement at each stage of the AI value chain, including in the development of that type of app—you mentioned development. Unless the worker voice is heard at development stage, the likelihood is that worker concerns, needs and interests will not be met by the technology. It is a vital principle to us that there be involvement of workers and unions at each stage of the AI value chain—in development, application and use.

Chi Onwurah

Q Welcome to both of you. Apologies for my misuse of my own technology earlier.

The Minister talked about the need for growth, which has been sadly lacking in our economy for the last 13 years. Obviously, technology can make huge improvements to productivity for those in the workforce. Mr Pakes, as someone whose members are involved in technology, scientific and IT organisations, I wonder whether you would agree with this, which comes from my experience in the diffusion of technology. Is it possible to get the best from technology in an organisation or company without the people who will be using it, or the people on whom it will be used, being an active part of that diffusion of technology, and understanding and participating in its use?

Andrew Pakes: Absolutely. That has always been how productivity has improved or changed, in effect, the shop floor. If you are asking, “What problems are you using technology to solve?”, that may well be a question better asked by the people delivering the product or service than by the vendor selling the software, whether that is old or new technology. I encourage the Committee to look at the strong evidence from our competitors that outperform the UK on productivity and innovation, where higher levels of automation in the economy are matched by higher levels of worker participation. Unions are the most common form, but it can also be works councils or, in small businesses, co-design and collaboration. In that social partnership model, the doers, the people who identify and solve problems, are the ones driving the change.

We have good examples. We represent members in the nuclear sector who are involved in fusion, small modular reactors or other technology, where the employer-union relationship is critical to the UK’s intellectual property and the drive to make those successful industries. In the motor industry and other places where the UK has been successful, we can see that that sense of social partnership has been there. We have examples around using AI or the monitoring of conversations or voices. Again, I mentioned GPS tracking, but in safety-critical environments, where our members want to be kept safe, they know that technology can help them. Having that conversation between the workforce and the employer can come up with a solution that is not only good for our members, because they stay safe and understand what the safety regime is, but good for the employer, because days are not lost through illness or accidents. For me, that sense of using legislation like this to underpin good work conversations in the data setting is what the mission of this Bill should be about.

--- Later in debate ---
Stephanie Peacock

Q A final question: do you think that the definitions of “vexatious” and “excessive” are clear enough not to be abused by controllers who simply do not want to carry out subject access requests?

Alex Lawrence-Archer: The new definitions, particularly the list of factors to be taken into consideration in determining whether the test is met, provide a lot of breathing room for controllers, whether or not they have good intentions, to make arguments that they do not need to comply with the right of access. If you are looking not to comply or if you have an incentive not to, as many controllers do, that does not necessarily mean that you are acting in bad faith; you might just not want to hand over the data and think that you are entitled not to do so. If you are looking not to comply, you will look at the Act and see lots of hooks that you can hang arguments on. Ultimately, that will come back to individuals who are just trying to exercise their rights and who will be engaged in big arguments with big companies and their lawyers.

Damian Collins

Q The age-appropriate design code for children was mentioned in our session this morning. Do you have any thoughts on what the Bill could mean for the application of that design code, which was obviously prepared for an environment in which GDPR was enshrined in UK data law?

Alex Lawrence-Archer: The age-appropriate design code was a real success for the UK in terms of its regulation and its reputation internationally. It clarified the rights that children have in relation to the processing of their personal data. However, those rights are only helpful if you know what is happening to your personal data, and if and when you find out that you can exercise your rights in relation to that processing.

As I have said, what the Bill does—again, perhaps inadvertently—is undermine in a whole host of ways your ability to know what is happening with your personal data and to do something about it when you find out that things have gone wrong. It seems to me that on the back of a notable success in relation to the AADC, we are now, with this Bill, moving in rather a different direction in terms of that argument for protection of personal data.

Looking at the even longer term, there will be some slightly more nuanced changes if and when the AADC comes to be amended or redrafted, because of the role of the ICO and the factors that it has to take into account in its independence, which again you have already heard about. So you could, in the long term, see a new version of the AADC that is more business-friendly, potentially, because of this Bill.

Damian Collins

Q In terms of access to personal data, a lot of what we are talking about, certainly when we are talking about children, relates to what we generally call big-tech companies. A lot of the age-appropriate design code is focused on children’s interface with services like Instagram, YouTube, TikTok and so on, of which they are heavy users. Are you concerned that because data may be stored in such a way that it is difficult for an external person to link it to an individual user, companies may use that as an excuse to be much looser in their application of the protections for children?

Alex Lawrence-Archer: There are a bunch of different ways in which companies will take advantage of the new grey areas that the Bill opens up to carry out processing with less transparency and less respecting of the rights of the people whose data they are processing. If we take just the definition of research, for example, it will be much easier to carry out research for a large platform that already has lots of personal data. The GDPR already provides for a lot of exemptions when you are carrying out research; the Bill dramatically expands that definition. If you are a Google or a YouTube, then yes, you are much freer to carry out processing that you consider to be research without necessarily being transparent about it to the users affected, those whose data it concerns.

Damian Collins

Q The project that triggered the initial Cambridge Analytica scandal was in theory academic research on personality profiling, so there are lots of ways in which the definition can be stretched, for sure. Earlier, I asked the Information Commissioner about the definition of legitimate interests for companies. He seemed to think that if he thought that someone did not have a legitimate interest, he could still investigate it and therefore the Bill did not make much difference, but are you reassured by what he said?

Alex Lawrence-Archer: We need to distinguish between two things: one is the introduction of some examples of what may be legitimate interests, which is not a particular concern because they replicate what is already in a recital; and, separately and of much greater concern, the introduction of recognised legitimate interests. I think that that is quite a radical departure from legitimate interests under the current regime. The Bill possibly misleads people, because it uses the language of legitimate interests, but it works in a very different way.

If you have a legitimate interest under the current regime, you must balance your interests against those of data subjects, and that is not something that is required if you can rely on a recognised legitimate interest under the new regime. The recognised legitimate interests are very broad—prevention of crime, for example, does not mean that that has to be done by the police. That is about opening up such processing for any kind of controller, which could be your neighbour or local corner shop, who can rely on that recognised legitimate interest with no requirement to consider the data subject’s interest at all. That is a radical departure, because the concept of balancing the interests of the data subject and of the controller is absolutely fundamental to our current regime.

Damian Collins

Q In that case, on recognised legitimate interests, if someone says that their legitimate interest is the prevention of crime, they can define that in any way that they like in how they might seek to process or analyse behaviour patterns in their systems?

Alex Lawrence-Archer: I do not want to overstate the case. You must be able to demonstrate that the processing is necessary for a recognised legitimate interest; it has got to make sense—but you do not have to consider anyone else’s interests.

For example, in some recent cases, neighbours were operating CCTV that captured lots of the personal data of their neighbours. An important argument to show that that was unlawful was that yes, the processing was necessary for the detection of crime—that is what the CCTV was for—but the interests of the neighbours, views of whose gardens and front windows were being captured, overrode the legitimate interests of the controller. That is how it works under the current regime. Under the new regime, you would not have to consider the interests of the neighbours in the use of that CCTV system. You would be able to rely on the recognised legitimate interest.

Damian Collins

Q Effectively, you would not need to consider whether the use of that technology in that case was disproportionate to the risk.

Alex Lawrence-Archer: Yes.

Chi Onwurah

Q We heard from some witnesses today that greater ease of access to data will increase competition for those such as Google and Meta that have large amounts of data as it is. What do you think the impact of this Bill will be for big tech?

Alex Lawrence-Archer: I think the Bill is quite big tech-friendly, and the way that it deals with research illustrates that well. One of the objectives of the Bill is obviously to boost the use of personal data for academic research, which is a really laudable objective. However, the main change—in fact the only change I can think of off the top of my head—that it makes is to broaden the definition of academic research. That helps people who already have lots of personal data they might do research with; it does not help you if you do not have personal data. That is one of the major barriers for academics at the moment: they cannot get access to the data they need.

The Bill does nothing to incentivise or compel data controllers such as online platforms to actually share data and get it moving around the system for the purposes of academic research. This is in stark contrast to the approach being taken elsewhere. It is an issue the EU is starting to grapple with in a particular domain of research with article 40 of the Digital Services Act. There is a sense that we are falling behind a little bit on that key barrier to academic research with personal data.

Data Protection and Digital Information (No. 2) Bill (Third sitting)

Damian Collins Excerpts
Committee stage
Tuesday 16th May 2023

Public Bill Committees

I would like to finish by asking the Minister whether his Department has considered the impact of the new legislation on commercial scientific processing on children specifically. If so, what measures have been taken to ensure that the Bill does not put children’s personal data at risk of exploitation?
Damian Collins (Folkestone and Hythe) (Con)

I wish to pose a couple of questions, after two thoughtful and well-presented amendments from those on the Opposition Front Bench. With regard to children and the use of apps such as TikTok, what assurance will the Government seek to ensure that companies that process and store data abroad are abiding by the principles of our domestic legislation? I mention TikTok directly because it stores data from UK users, including children, in Singapore, and it has made clear in evidence to the Joint Committee on the Online Safety Bill that that data is accessed by engineers in China who are working on it.

We all know that when data is taken from a store and used for product development, it can be returned in its original state but a huge amount of information is gathered and inferred from it that is then in the hands of engineers and product developers working in countries such as China and under very different jurisdictions. I am interested to know what approach we would take to companies that store data in a country where we feel we have a data equivalence regime but then process the data from a third location where we do not have such a data agreement.

Sir John Whittingdale

I welcome the recognition of the importance of allowing genuine research and the benefits that can flow from it. Such research may well be dependent on using data and the clause is intended to provide clarity as to exactly how that can be done and in what circumstances.

I will address the amendments immediately. I am grateful to the hon. Member for Barnsley East for setting out her arguments and we understand her concerns. However, I think that the amendments go beyond what the clause proposes and, in addition, I do not think that there is a foundation for those concerns. As we have set out, clause 2 inserts in legislation a definition for processing for scientific research, historical research and statistical purposes. The definition of scientific research purposes is set out as

“any research that can be reasonably described as scientific”

and I am not sure that some of the examples that the hon. Lady gave would meet that definition.

The definitions inserted by the clause are based on the wording in the recitals to the UK GDPR. We are not changing the scope of these definitions, only their status in the legislation. They will already be very familiar to people using them, but setting them out in the Bill will provide more clarity and legal certainty. We have maintained a broad scope as to what is allowed to be included in scientific research, with the view that the regulator can add more nuance and context through guidance, as is currently the case. The power to require codes of practice provides a route for the Secretary of State to require the Information Commissioner to prepare any code of practice that gives guidance on good practice in processing personal data.

There will be situations where non-statutory guidance, which can be produced without being requested under regulations made by the Secretary of State, may be more appropriate than a statutory code of practice. Examples of the types of activity that are considered scientific research and the indicative criteria that a researcher should demonstrate are best placed in non-statutory guidance produced by the Information Commissioner’s Office. That will give flexibility to amend and change the examples when necessary, so I believe that the process does not change the provision. However, putting it in the legislation, rather than in the recitals, will impose stronger safeguards and make things clearer. Once the Bill has come into effect, the Government will continue to work with the ICO to update its already detailed and helpful guidance on the definition of scientific research as necessary.

Amendment 66 would prohibit the use of children’s data for commercial purposes under the definition of scientific research. The definition inserted by clause 2 includes the clarification that processing for scientific research carried out as a commercial activity can be considered processing for scientific research purposes. Parts of the research community asked for that clarification in response to our consultation. It reflects the existing scope, as is already clear from the ICO’s guidance, and we have seen that research by commercial bodies can have immense societal value. For instance, research into vaccines and life-saving treatments is clearly in the public interest. I entirely understand the hon. Lady’s concern for children’s privacy, but we think that her amendment could obstruct important research by commercial organisations, such as research into children’s diseases. I think that the Information Commissioner would make it clear as to whether or not the kind of example that the hon. Lady gave would fall within the definition of research for scientific purposes.

I also entirely understand the concern expressed by my hon. Friend the Member for Folkestone and Hythe. I suspect that the question about the sharing of data internationally, particularly, perhaps, by TikTok, may recur during the course of our debates. As he knows, we would share data internationally only if we were confident that it would still be protected in the same way that it is here, which would include considering the possibility of whether or not it could then be passed on to a third country, such as China.

I hope that I can reassure the hon. Lady that the safeguards that researchers must comply with under clause 22 to protect individuals relate to all data used for these purposes, including children’s data and the protections afforded to children under the UK GDPR. For those reasons, I hope that she will be willing to withdraw her amendment.

--- Later in debate ---
The regulation-making powers in the clause will also be subject to the new requirements in clause 44. They provide that any regulations made under the UK GDPR are subject to consultation with the commissioner and such other persons as the Secretary of State considers appropriate.
Damian Collins

Will my right hon. Friend confirm whether the Information Commissioner’s advice will be published, either by the commissioner, the Minister or Parliament—perhaps through the relevant Select Committee?

Sir John Whittingdale

I am not sure it would necessarily be published. I want to confirm that, but I am happy to give a clear response to the Committee in due course if my hon. Friend will allow me.

As well as the advice that the Information Commissioner supplies, the proposal is also subject to the affirmative procedure, as the hon. Member for Barnsley East recognised, so Parliament could refuse to approve any additions to the list that do not respect the rights of data subjects. She suggested that it is rare for an affirmative resolution to be rejected by Parliament; nevertheless, it is part of our democratic proceedings, and every member of the Committee considering it will have the opportunity to reach their own view and vote accordingly. I hope that reassures the hon. Lady that there are already adequate safeguards in place in relation to the exercise of powers to add new activities to the list of recognised legitimate interests.

Amendment 67, which the hon. Lady also tabled, would require data controllers to publish a statement if they are relying on the new recognised legitimate interests lawful ground. The statement would have to explain what processing would be carried out in reliance on the new lawful ground and why the processing is proportionate and necessary for the intended purpose. In our view, the amendment would significantly weaken the clause. It would reintroduce something similar to the legitimate interests assessment, which, as we have heard, can unnecessarily delay some very important processing activities. In scenarios involving national security or child protection, for example, the whole point of the clause is to make sure that relevant and necessary personal data can be shared without hesitation to protect vulnerable individuals or society more generally.

I hope the hon. Lady is reassured by my response and agrees to withdraw her amendments. I commend clause 5 to the Committee.

--- Later in debate ---
Damian Collins

The principle that underpinned what happened in the Cambridge Analytica scandal was the connection of Facebook profiles to the electoral register. If I understand my right hon. Friend the Minister correctly, what he is talking about would not necessarily change that situation. This could be information that the political campaign has gained anyway from a voter profile or from information that already exists in accounts it has access to on platforms such as Facebook; it would simply be attaching that, for the purposes of targeting, to people who voted in an election. The sort of personal data that Members of Parliament hold for the purposes of completing casework would not have been processed in that way. These proposals would not change in any way the ability to safeguard people’s data, and companies such as Cambridge Analytica will still seek other sources of open public data to complete their work.

Sir John Whittingdale

I think my hon. Friend is right. I have no doubt that we will go into these matters in more detail when we get to those provisions. As the hon. Member for Barnsley East knows, this measure makes a very narrow change to simply extend the existing time limit within which there is protection for elected representatives to conclude casework following a general election. As we will have opportunity in due course to look at the democratic engagement exemption, I hope she will be willing to support these narrow provisions.

--- Later in debate ---
thereby emphasising the importance of proportionality when considering whether a request is vexatious or excessive.
Damian Collins

Does my right hon. Friend agree that the provisions will be helpful and important for organisations that gather data about public persons, and particularly oligarchs, who are very adept at using subject access requests to bombard and overwhelm a journalist or a small investigatory team that is doing important work looking into their business activities?

Sir John Whittingdale

I completely agree with my hon. Friend. That is an issue that both he and I regard as very serious, and is perhaps another example of the kind of legal tactic that SLAPPs—strategic lawsuits against public participation—represent, whereby oligarchs can frustrate genuine journalism or investigation. He is absolutely right to emphasise that.

It is important to highlight that controllers can already consider resource when refusing or charging a reasonable fee for a request. The Government do not wish to change that situation. Current ICO guidance sets out that controllers can consider resources as a factor when determining if a request is excessive.

The new parameters are not intended to be reasons for refusal. The Government expect that the new parameters will be considered individually as well as in relation to one another, and a controller should consider which parameters may be relevant when deciding how to respond to a request. For example, when the resource impact of responding would be minimal even if a large amount of information was requested—such as for a large organisation—that should be taken into account. Additionally, the current rights of appeal allow a data subject to contest a refusal and ultimately raise a complaint with the ICO. Those rights will not change with regard to individual rights requests.

Amendment 74 proposes adding more detail on the obligations of a controller who refuses or charges for a request from a data subject. The current legislation sets out that any request from a data subject, including subject access requests, is to be responded to. The Government are retaining that approach and controllers will be expected to demonstrate why the provision applies each time it is relied on. The current ICO guidance sets out those obligations on controllers and the Government do not plan to suggest a move away from that approach.

The clause also states that it is for the controller to show that a request is vexatious or excessive in circumstances where that might be in doubt. Thus, the Government believe that the existing legislation provides the necessary protections. Following the passage of the Bill, the Government will work with the ICO to update guidance on subject access requests, which we believe plays an important role and is the best way to achieve the intended effect of the amendments. For those reasons, I will not accept this group of amendments; I hope that the hon. Member for Barnsley East will be willing to withdraw them.

I turn to clause 7 itself. As I said, the UK’s data protection framework sets out key data subject rights, including the right of access—the right for a person to obtain a copy of their personal data. A subject access request is used when an individual requests their personal data from an organisation. The Government absolutely recognise the importance of the right of access and do not want to restrict that right for reasonable requests.

The existing legislation enables organisations to refuse or charge a reasonable fee for a request when they deem it to be “manifestly unfounded or excessive”. Some organisations, however, struggle to rely on that in cases where it may be appropriate to do so, which as a consequence impacts their ability to respond to reasonable requests.

The clause changes the legislation to allow controllers to refuse or charge a reasonable fee for a request that is “vexatious or excessive”. The clause adds parameters for controllers to consider when relying on the “vexatious or excessive” exemption, such as the nature of the request and the relationship between the data subject and the controller. The clause also includes examples of the types of request that may be vexatious, such as those intended to cause distress, those not made in good faith or those that are an abuse of process.

We believe that the changes will give organisations much-needed clarity over when they can refuse or charge a reasonable fee for a request. That will ensure that controllers can focus on responding to reasonable requests, as well as other important data and organisational needs. I commend the clause to the Committee.

Data Protection and Digital Information (No. 2) Bill (Fourth sitting)

Damian Collins Excerpts
Committee stage
Tuesday 16th May 2023

Public Bill Committees

Sir John Whittingdale

I am very much aware of the concern about automated decision making. The Government share the wish of the hon. Member for Barnsley East for all those who may be affected to be given protection. Where I think we differ is that we do not recognise the distinction that she tries to make between data subjects and decision subjects, which forms the basis of her amendments.

The hon. Lady’s amendments would introduce to the UK GDPR a definition of the term “decision subject”, which would refer to an identifiable individual subject to data-based and automated decision making, to be distinguished from the existing term “data subject”. The intended effect is to extend the requirements associated with provisions related to decisions taken about an individual using personal data to those about whom decisions are taken, even though personal information about them is not held or used to take a decision. It would hence apply to the safeguards available to individuals where significant decisions are taken about them solely through automated means, as amendments 78 to 101 call for, and to the duties of the Information Commissioner to have due regard to decision subjects in addition to data subjects, as part of the obligations imposed under amendment 106.

I suggest to the hon. Lady, however, that the existing reference to data subjects already covers decision subjects, which are, if you like, a sub-group of data subjects. That is because even if an individual’s personal data is not used to inform the decision taken about them, the fact that they are identifiable through the personal data that is held makes them data subjects. The term “data subject” is broad and already captures the decision subjects described in the hon. Lady’s amendment, as the identification of a decision subject would make them a data subject.

I will not, at this point, go on to set out the Government’s wider approach to the use of artificial intelligence, because that is somewhat outside the scope of the Bill and has already been set out in the White Paper, which is currently under consultation. Nevertheless, it is within that framework that we need to address all these issues.

Damian Collins (Folkestone and Hythe) (Con)

I have been closely following the speeches of the Minister and the hon. Member for Barnsley East. The closest example that I can think of for this scenario is the use of advertising tools such as lookalike audiences on Facebook and customer match on YouTube, where a company holding data about users looks to identify other customers who are the closest possible match. It does not hold any personal data about those people, but the platform forms the intermediary to connect them. Is the Minister saying that in that situation, as far as the Bill is concerned, someone contacted through a lookalike audience has the same rights as someone who is contacted directly by an advertiser that holds their data?

--- Later in debate ---
Sir John Whittingdale

It is certainly our view that anybody who is affected by an automated decision made on the basis of data held about individuals themselves becomes a data subject, so I think the answer to the hon. Lady’s question is no. As I said, the Information Commissioner’s Office will provide guidance in this area; if such a situation does arise, obviously it will need to be considered. The hon. Members for Barnsley East and for Glasgow North West asked about making information available to all those affected, and about safeguards, which we think are contained within the requirements under article 22C.

Damian Collins

Further to the point that was made earlier, let us say that a Facebook user was targeted with an advert that was based on their protected characteristics data—data relevant to their sexual orientation, for example—but that user said that they had never shared that information with the platform. Would they have the right to make a complaint, either to the advertiser or to the platform, for inferring that data about them and making it available to a commercial organisation without their informed consent?

Sir John Whittingdale

They would obviously have that right, and indeed they would ultimately have the right to appeal to the Information Commissioner if they felt that they had been subjected unfairly to a decision where they had not been properly informed of the fact. On the basis of what I have said, I hope the hon. Member for Barnsley East might withdraw her amendment.

Data Protection and Digital Information (No. 2) Bill (Fifth sitting) Debate

Committee stage
Thursday 18th May 2023

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 18 May 2023
Sir John Whittingdale

Clause 43 is a technical measure that creates a presumption that our data protection laws should not be overridden by future laws that relate to the processing of personal data, but it respects parliamentary sovereignty by ensuring that Parliament can depart from this presumption in particular cases if it deems it appropriate to do so. For example, if new legislation permitted or required an organisation to share personal data with another for a particular purpose, the default position in the absence of any specific indication to the contrary would be that the data protection legislation would apply to the new arrangement.

Damian Collins (Folkestone and Hythe) (Con)

Will my right hon. Friend confirm that the provision will also apply to trade agreements? Certainly in the early stages of the negotiations for a UK-US trade agreement, the United States Government sought to include various provisions relating to tech policy. In such a scenario, would this legislation take precedence over anything written into a trade agreement?

Sir John Whittingdale

That would certainly be my interpretation. I do not see that a trade agreement could possibly overturn an Act of Parliament unless Parliament specifically sets out that it intends that that should be the case. This is a general protection, essentially saying that in all future cases data protection legislation applies unless Parliament specifically indicates that that should not be the case.

Until now, ensuring that new data protection measures are read consistently with the data protection legislation has relied either on the inclusion of express provision to that effect in new data processing measures, or on general rules of interpretation. There are risks in that approach. Including a relevant provision in each and every new data processing measure is onerous, and it could inadvertently be omitted. General rules of interpretation can be applied differently by the courts, particularly in the light of legal challenges following our exit from the European Union. That creates the potential for legal uncertainty and, as a result, could lead to a less effective and comprehensive data protection legislative framework.

Clause 43 creates a presumption that any future legislation permitting the processing of personal data will be subject to the key requirements of the UK’s data protection legislation unless clear provisions are made to the contrary. This is a technical but necessary measure and I commend it to the Committee.

Data Protection and Digital Information (No. 2) Bill (Sixth sitting) Debate

Committee stage
Thursday 18th May 2023

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 18 May 2023
Stephanie Peacock

The clause defines key terms in this part of the Bill, such as business data, customer data and data holder, as well as data regulations, customer and trader. These are key to the regulation-making powers on smart data in part 3, and I have no specific concerns to raise about them at this point.

I note the clarification made by the Minister in his amendment to the example given. As he outlined, that will ensure consistency in the definition and understanding of business data. It is good to see areas such as that being cleaned up so that the Bill can be interpreted as easily as possible, given how complex many will find it. I am therefore happy to proceed with the Bill.

Damian Collins (Folkestone and Hythe) (Con)

I rise to ask the Minister a specific question about the use of smart data in this way. A lot of users will be giving away data at a device level, rather than just through the individual accounts they access. People go to a particular account they are signed into and make transactions, or do whatever they are doing in that application, on a particular device, but there will be much more gathering of data at the device level. We know that many companies—certainly some of the bigger tech companies—use their apps to gather data not just about what their users do on their particular app, but across the whole device. One of the complaints of Facebook customers is that if they seek to remove their data from Facebook and get it back, the company’s policy is to give them back data only for things they have done while using its applications—Instagram, Facebook or whatever. It retains any device-level data that it has gathered, which could be quite significant, on the basis of privacy—it says that it does not know whether someone else was using the device, so it is not right to hand that data back. Companies are exploiting this anomaly to retain as much data as possible about things that people are doing across a whole range of apps, even when the customer has made a clear request for deletion.

I would be grateful if the Minister could say something about that; if he cannot do so now, will he write to me or say something in future? When considering the way these regulations work, particularly in an era of smart data when it will be far more likely that data is gathered across multiple applications, it should be clear what rights customers have to have all that data deleted if they request it.

Sir John Whittingdale

I share my hon. Friend’s general view. Customers can authorise that their data be shared through devices with other providers, so they should equally have the right to take back that data if they so wish. He invites me to come back to him with greater detail on that point, and we would be very happy to do so.

Amendment 46 agreed to.

Clause 61, as amended, ordered to stand part of the Bill.

Clause 62

Power to make provision in connection with customer data

Data Protection and Digital Information (No. 2) Bill (Seventh sitting) Debate

Committee stage
Tuesday 23rd May 2023

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 23 May 2023
Sir John Whittingdale

As the hon. Lady sets out, amendment 117 would remove new regulation 6B from the Bill, but we see this as an important tool for reducing frequent cookie consent banners and pop-ups that can, as we have debated already, interfere with people’s use of the internet. Members will be aware, as has already been set out, that clause 79 removes the need for organisations to seek consent to place cookies for certain non-intrusive purposes. One way of further reducing the need for repeated cookie pop-up notices is by blocking them at source—in other words, allowing web users to select which cookies they are willing to accept and which they are not comfortable with by using browser-level settings or similar technologies. These technologies should allow users to set their online preferences once and be confident that those choices will be respected throughout their use of the internet.

We will continue to work with industry and the Information Commissioner to improve the take-up and effectiveness of browser-based and similar solutions. Retaining the regulation-making powers in new regulation 6B is important to this work, because it will allow the Secretary of State to require relevant technologies to meet certain standards or specifications.

Without regulations, there could be an increased risk of companies developing technologies that did not give web users sufficient choice and control over the types of cookies they are willing to accept. We will consult widely before making any new regulations under 6B, and any new regulations will be subject to the affirmative resolution procedure. We have listened to stakeholders and intend to amend 6B to provide an explicit requirement for the Secretary of State to consult the Competition and Markets Authority before making new regulations.

Damian Collins (Folkestone and Hythe) (Con)

Is this something the Department has considered? Google Chrome, for example, has a 77% share of the web browser market on desktop computers, and over 60% across all devices, including mobile. Although we want to improve the use of the internet for users and get rid of unwanted cookies, the consequence of browser-level controls could be to consolidate power, and all that data, in the hands of one or two companies.

Sir John Whittingdale

I entirely agree with my hon. Friend. He accurately sums up the reason that the Government decided it was important that the Competition and Markets Authority would have an input into the development of any facility to allow browser users to set their preferences at the browser level. We will see whether, with the advent of other browsers, AI-generated search engines and so on, the dominance is maintained, but I think he is absolutely right that this will remain an issue that the Competition and Markets Authority needs to keep under review.

That is the purpose of Government amendment 54, which will ensure that any competition impacts are considered properly. For example, we want any review of regulations to be relevant and fair to both smaller publishers and big tech. On that basis, I hope that the hon. Member for Barnsley East will consider withdrawing her amendment.

--- Later in debate ---
Damian Collins

Does my right hon. Friend agree that, if amendment 118 were made, it could be used as a general get-out-of-jail-free card by companies? Let us consider, for example, a situation where a company could easily and obviously have spotted a likely breach of the regulations and should have intervened. When the commissioner discovered that the company had failed in its duty to do so, the company could turn around and say, “Well, yes, we missed that, but we were not under any obligation to monitor.” It is therefore important that there is a requirement for companies to use their best endeavours to monitor where possible.

Sir John Whittingdale

I completely agree; my hon. Friend is right to make that distinction. Companies should use their best endeavours, but it is worth repeating that the guidance does not expect service and network providers to monitor the content of individual calls and messages to comply with the duty. There is more interest in patterns of activity on networks, such as where a rogue direct marketing firm behaves in the manner that I set out. On that basis, I ask the hon. Lady not to press her amendment to a vote.