All 6 Lord Kamall contributions to the Data Protection and Digital Information Bill 2022-23


Wed 20th Mar 2024
Data Protection and Digital Information Bill
Grand Committee

Committee stage: Minutes of Proceedings

Lord Kamall (Con)

My Lords, it is a pleasure to follow the previous speakers, including my noble friend the Minister, the other Front-Benchers and the noble Baroness, Lady Kidron.

I start by thanking the House of Lords Library for its briefing—it was excellent, as usual—and the number of organisations that wrote to noble Lords so that we could understand and drill down into some of the difficulties and trade-offs we are going to have to look at. As with most legislation, we want to get the balance right between, for example, a wonderful environment for commerce and the right to privacy and security. I think that we in this House will be able to tease out some of those issues and, I hope, get a more appropriate balance.

I refer noble Lords to my interests as set out in the register. They include the fact that I am an unpaid adviser to the Startup Coalition and have worked with a number of think tanks that have written about tech and privacy issues in the past.

When I look at the Bill at this stage, I think that there are bits to be welcomed, bits that need to be clarified and bits that raise concern. I want to touch on a few of them before drilling down—I will not drill down into all of them, because I am sure that noble Lords have spoken or will speak on them, and we will have much opportunity for further debate.

I welcome Clause 129, which requires social media companies to retain information linked to a child suicide. However, I understand and share the concern of the noble Baroness, Lady Kidron, that this seems to be the breaking of a promise. The fact is that this was supposed to be about much more data and harms to children and how we can protect our children. In some ways, we must remember the analogy about online spaces: when we were younger, before the online age, our parents were always concerned about us when we went beyond the garden gate; nowadays, we must look at the internet and the computers on our mobile devices as that garden gate. When children leave that virtual garden gate and go through into the online world, we must ask whether they are safe, in the same way that my parents worried about us when, as children, we went through our garden gate to go out and play with others.

Clauses 138 to 141, on a national underground asset register, are obviously very sensible; that proposal is probably long overdue. I have questions about the open electoral register, in particular the impact on the direct marketing industry. Once again, we want to get the balance right between commerce and ease of doing business, as my noble friend the Minister said, and the right to privacy.

I have concerns about Clauses 147 and 148 on abolishing the offices of the Biometrics Commissioner and the Surveillance Camera Commissioner. I understand that the responsibilities will be transferred, but, in thinking about the legislation that we have been talking about in this place—such as the Online Safety Act—I wonder about the amount of power that we are giving to these regulators and whether they will have the bandwidth for it. Is there really a good reason for abolishing these two commissioners?

I share the concerns of the noble Lord, Lord Knight, about access to bank accounts. Surely people should have the right to know why their bank account has been accessed and to have some protection so that not just anyone can access it. I know that it is not just anyone, but there are concerns about this, and the rules need to be clearer for people.

I have talked to the direct marketing industry. It sees the open electoral register as a valuable resource for businesses in understanding and targeting customers. However, it tells me that a recent court case between Experian and the ICO has introduced some confusion on the use of the register for business purposes. It is concerned that the Information Commissioner’s Office’s interpretation, requiring notification to every individual for every issue, presents challenges that could cost the industry millions and make the open electoral register unusable for it, perhaps pushing businesses to rely more on large tech companies. However, I understand that, at the same time, this may well be an issue where there are clear concerns about privacy.

Where there is no harm, I would like to understand the Government’s thinking on some of that—whether it is going too far or whether some clarification is needed in this area. Companies say they will be unable to target prospective customers; some of us may like that, but we should also remember that there is Clause 116 on unlawful direct marketing. The concern for many of us is that while it is junk if we do not want it, sometimes we do respond to someone’s direct marketing. I wonder how we get that balance right; I hope we can tease some of that out. If the Government agree with the interpretation and restrictions on the direct marketing industry, I wonder whether they can explain some of the reasons behind it. There may very well be good reasons.

I also want to look at transparency and data usage, not just for AI but more generally. The Government’s own AI White Paper makes it clear that we want a pro-innovation approach to regulation, but we are also calling for transparency at a number of levels: of datasets and of algorithms. To be honest, even if we are given that transparency, do we have the ability to understand those algorithms and datasets? We still need that transparency. I am concerned about undermining that principle, and particularly about the weakening of subject access requests.

I am also interested in companies that, say, have used your data but have refused an application and then tell you that they do not have to tell you why they refused that application. Perhaps this is too much of a burden on companies, but I wonder whether we have a right to know which data was accessed when that decision was made. I will give a personal example: about a year ago, I applied for an account with a very clever online bank and was rejected. It told me I would have a decision within 48 hours; I did not. Two weeks later, I got a message on the app that said I had been rejected and that under the law it did not have to tell me why. I wrote to it and said, “Okay, you don’t have to tell me why, but could you delete all the data you have on me—what I put in?”. It said, “Oh, we don’t have to delete it until a certain time”. If we really own that data, I wonder whether there should be more of an expectation on companies to explain what data and information they have used to make those decisions, which can be life changing for many people. We have heard all sorts of stories about access to bank accounts and concerns about digital exclusion.

We really have to think about how much access individuals can have to the data that is used to refuse them, but also the data when they leave a service or stop being a user. I also want to make sure that there is accountability. I want to know, in Clause 12, about “reasonable and proportionate search”; what does that mean, particularly when it is processed by law enforcement and intelligence services? I think we need further clarification on some of this for our assurance.

We also have to recognise that, if we look at the online environment of the last 10, 15 or 20 years, at first we were very happy to give our data away to social media companies because we thought we were getting a free service, connecting with friends across the world et cetera. Only later did we realise that the companies were using this data and monetising it for commercial purposes. There is nothing wrong with that in itself, but we have to ask whose data it is. Is it my data? Does the company own it? For those companies that think they own it, why do they think that? We need some more accountability, to make sure that we understand which data we own and which we give away. Once again, the same thing might happen—you might stop being a user or customer of a service, or you might be rejected, but it is not there.

As an academic, I recognise the need for greater access to data, particularly for online research. I welcome some of the mechanisms in the Online Safety Act that we debated. Does my noble friend the Minister believe that the Bill sufficiently addresses the requirements and incentives for large data holders to hold data for academic research with all the appropriate safeguards in place? I wonder whether the Minister has looked at some of the proposals to allow this to happen more, perhaps with the information commission acting as an intermediary for datasets et cetera. Once again, I am concerned about giving even more power to the information commission and the bandwidth to do all this stuff, including all the powers we are giving.

On cookie consent, I understand the annoyance of cookies. I remember the debates about cookie consent when I was in the European Parliament, but at the time we supported it because we thought it was important for users to be told what was being done with their information. It has become annoying, just like those text messages when we go roaming; I supported that during the roaming debates in the European Parliament because I did not want users to say they were not warned about the cost of roaming. The problem is that they become annoying; people ignore them and tick things on terms and conditions without having read them because they are too long.

When it comes to some of the cookies, I like the idea about exemptions for prior consent—a certain opt-out where there is no real harm—but I wonder whether it could be extended, for example so that cookies to understand the performance of advertising and to help companies understand the effectiveness of advertisements are exempt from the consent requirements. I do not think this would fundamentally change the structure of the Bill, but I wonder whether we have the right balance here on harm, safety and the ability of companies to test the effectiveness of some of their direct marketing. Again, I am just interested in the Government’s thinking about the balance between privacy and commerce.

Like other noble Lords, I share concerns about the powers granted to the Secretary of State. I think they lack the necessary scrutiny and safeguards, and that there is a risk of undermining the operations of online content and service providers that rely on these technologies. We need to see some strengthening here and more assurances.

I have one or two other concerns. The Information Commissioner has powers to require people to attend interviews as part of an investigation; that seems rather Big Brother-ish to me, and I am not sure whether the Information Commissioner would want these abilities, but there might be good reasons. I just want to understand the Government’s thinking on this.

I know that on Report in the other place, both Dawn Butler MP and David Davis MP raised concerns about retaining the right to use non-digital verification systems. We all welcome verification systems, but the committee I sit on—the Communications and Digital Committee—recently wrote a report on digital exclusion. We are increasingly concerned about digital exclusion and people having a different level of service because they are digitally excluded. I wonder what additional assurances the Minister can give us on some of those issues. The Minister in the other place said:

“Individual choice is integral … digital verification services can be provided only at the request of the individual”.—[Official Report, Commons, 29/11/23; col. 913.]


I think that any further verification would be really important.

The last point I turn to is EU adequacy. Let me be quite clear: I do not believe in divergence for the sake of divergence, but at the same time I do not believe in convergence or harmonisation for the sake of convergence and harmonisation. We used to have these debates in the European Parliament all the time. There are those expressing concerns about EU data adequacy, and we have to split them into two groups—one is those people who really still wish we were members of the EU, but there are also those for whom this is irrelevant, and for whom this really is about the privacy and security of our users. If the EU is raising these issues in its agreements, we can thank it for doing that.

I obviously was involved in debates on the safe harbour and the privacy shield. As noble Lords have said, we thought we had the right answer; the Commission thought we had the answer, but it was challenged by courts. I think this will have to be challenged more. Are we diverging just for the sake of divergence, or is there a good reason to diverge here, particularly when concerns have already been raised about security and privacy?

I end by saying that I look forward to the maiden speech of the noble Lord, Lord de Clifford. I thank noble Lords for listening to me, and I look forward to working with noble Lords across the House on some of the issues I have raised.

Data Protection and Digital Information Bill

Lord Clement-Jones (LD)

My Lords, we are beginning rather a long journey—at least, it feels a bit like that. I will speak to Amendments 1, 5 and 288, and the Clause 1 stand part notice.

I will give a little context about Clause 1. In a recent speech, the Secretary of State said something that Julia Lopez repeated this morning at a conference I was at:

“The Data Bill that I am currently steering through Parliament with my wonderful team of ministers”—


I invite the Minister to take a bow—

“is just one step in the making of this a reality—on its own it will add £10 billion to our economy and most crucially—we designed it so that the greatest benefit would be felt by small businesses across our country. Cashing in on a Brexit opportunity that only we were prepared to take, and now those rewards are going to be felt by the next generation of founders and business owners in local communities”.

In contrast, a coalition of 25 civil society organisations wrote to the Secretary of State, calling for the Bill to be dropped. The signatories included trade unions as well as human rights, healthcare, racial justice and other organisations. On these Benches, we share the concerns about the government proposals. They will seriously weaken data protection rights in the UK and will particularly harm people from marginalised communities.

So that I do not have to acknowledge them at every stage of the Bill, I will now thank a number of organisations. I am slightly taking advantage of the fact that our speeches are not limited but will be extremely limited from Monday onwards—the Minister will have 20 minutes; I, the noble Baroness, Lady Jones, and colleagues will have 15; and Back-Benchers will have 10. I suspect we are into a new era of brevity, but I will take advantage today, believe me. I thank Bates Wells, Big Brother Watch, Defend Digital Me, the Public Law Project, Open Rights Group, Justice, medConfidential, Chris Pounder, the Data & Marketing Association, CACI, Preiskel & Co, AWO, Rights and Security International, the Advertising Association, the National AIDS Trust, Connected by Data and the British Retail Consortium. That is a fair range of organisations that see flaws in the Bill. We on these Benches agree with them and believe that it greatly weakens the existing data protection framework. Our preference, as we expressed at Second Reading, is that the Bill is either completely revised on a massive scale or withdrawn in the course of its passage through the Lords.

I will mention one thing; I do not think the Government are making any great secret of it. The noble Baroness, Lady Kidron, drew my attention to the Keeling schedule, which gives the game away, and Section 2(2). The Information Commissioner will no longer have to pay regard to certain aspects of the protection of personal data—all the words have been deleted, which is quite extraordinary. It is clear that the Bill will dilute protections around personal data processing, reducing the scope of data protected by the safeguards within the existing law. In fact, the Bill gives more power to data users and takes it away from the people the data is about.

I am particularly concerned about the provisions that change the definition of personal data and the purposes for which it can be processed. There is no need to redraft the definitions of personal data, research or the boundaries of legitimate interests. We have made it very clear over a period of time that guidance from the ICO would have been adequate in these circumstances, rather than a whole piece of primary legislation. The recitals are readily available for guidance, and the Government should have used them. More data will be processed, with fewer safeguards than currently permitted, as it will no longer meet the threshold of personal data, or it will be permitted under the new recognised legitimate interest provision, which we will debate later. That combination is a serious threat to privacy rights in the UK, and that is the context of a couple of our probing amendments to Clause 1— I will come on to the clause stand part notice.

As a result of these government changes, data in one organisation’s hands may be anonymous, while that same information in another organisation’s hands can be personal data. The factor that determines whether personal data can be reidentified is whether the appropriate organisational measures and technical safeguards exist to keep the data in question separate from the identity of specific individuals. That is a very clear decision by the CJEU; the case is SRB v EDPS, if the Minister is interested.

The ability to identify an individual indirectly with the use of additional information is due to the lack of appropriate organisational and technical measures. If the organisation had such measures in place, separating data into different silos, it would not be able to use the additional information to identify such an individual. The language of technical and organisational measures is used in the definition of pseudonymisation in Clause 1(3)(d), which refers to “indirectly identifiable” information. If such measures existed, the data would be properly pseudonymised, in which case it would no longer be indirectly identifiable.

A lot of this depends on how data savvy organisations are, so those that are not well organised and do not have the right technology will get a free pass. That cannot be right, so I hope the Minister will respond to that. We need to make sure that personal data remains personal data, even if some may claim it is not.

Regarding my Amendment 5, can the Government explicitly confirm that personal data that is

“pseudonymised in part, but in which other indirect identifiers remain unaltered”

will remain personal data after this clause is passed? Can the Government also confirm that if an assessment is made that some data is not personal data, but that assessment is later shown to be incorrect, the data will have been personal data at all times and should be treated as such by controllers, processors and the Information Commissioner, about whom we will talk when we come to the relevant future clauses?

Amendment 288 simply asks the Government for an impact assessment. If they are so convinced that the definition of personal data will change, they should be prepared to submit to some kind of impact assessment after the Bill comes into effect. Those are probing amendments, and it would be useful to know whether the Government have any intention to assess what the impact of their changes to the Bill would be if they were passed. More importantly, we believe broadly that Clause 1 is not fit for purpose, and that is why we have tabled the clause stand part notice.

As we said, this change will erode people’s privacy en masse. The impacts could include more widespread use of facial recognition and an increase in data processing with minimal safeguards, as the threshold for personal data would be met only if the data subject is on a watchlist and therefore identified. If an individual is not on a watchlist and images are deleted after checking against it, the data may not be considered personal and so would not qualify for data protection obligations.

People’s information could be used to train AI without their knowledge or consent. Personal photos scraped from the internet and stored to train an algorithm would no longer be seen as personal data, as long as the controller does not recognise the individual, is not trying to identify them and will not process the data in such a way that would identify them. The police would have increased access to personal information. Police and security services will no longer have to go to court if they want access to genetic databases; they will be able to access the public’s genetic information as a matter of routine.

Personal data should be defined by what type of data it is, not by how easy it is for a third party to identify an individual from it. That is the bottom line. Replacing a stable, objective definition that grants rights to the individual with an unstable, subjective definition that determines the rights an individual has over their data according to the capabilities of the processor is illogical, complex, bad law-making. It is contrary to the very premise of data protection law, which is founded upon personal data rights. We start on the wrong foot in Clause 1, and it continues. I beg to move.

Lord Kamall (Con)

My Lords, I rise to speak in favour of Amendments 1 and 5 in this group and with sympathy towards Amendment 4. The noble Lord, Lord Clement-Jones, will remember when I was briefly Minister for Health. We had lots of conversations about health data. One of the things we looked at was a digitised NHS. It was essential if we were to solve many problems of the future and have a world-class NHS, but the problem was that we had to make sure that patients were comfortable with the use of their data and the contexts in which it could be used.

When we were looking to train AI, it was important that we made sure that the data was as anonymous as possible. For example, we looked at things such as synthetic and pseudonymised data. There is another point: having done the analysis and looked at the dataset, if you see an identifiable group of people who may well be at risk, how can you reverse-engineer that data perhaps to notify those patients that they should be contacted for further medical interventions?

I know that that makes it far too complicated; I just wanted to rise briefly to support the noble Lord, Lord Clement-Jones, on this issue, before the new rules come in next week. It is essential that the users, the patients—in other spheres as well—have absolute confidence that their data is theirs and are given the opportunity to give permission or opt out as much as possible.

One of the things that I said when I was briefed as a Health Minister was that we can have the best digital health system in the world, but it is no good if people choose to opt out or do not have confidence. We need to make sure that the Bill gives those patients that confidence where their data is used in other areas. We need to toughen this bit up. That is why I support Amendments 1 and 5 in the name of the noble Lord, Lord Clement-Jones.

Lord Davies of Brixton (Lab)

My Lords, anonymisation of data is crucially important in this debate. I want to see, through the Bill, a requirement for personal data, particularly medical data, to be held within trusted research environments. This is a well-developed technique and Britain is the leader. It should be a legal requirement. I am not quite sure that we have got that far in the Bill; maybe we will need to return to the issue on Report.

The extent to which pseudonymisation—I cannot say it—is possible is vastly overrated. There is a sport among data scientists of being able to spot people within generally available datasets. For example, the data available to TfL through people’s use of Oyster cards and so on tells you an immense amount of information about individuals. Medical data is particularly susceptible to this, although it is not restricted to medical data. I will cite a simple example from publicly available data.

Data Protection and Digital Information Bill

Viscount Camrose (Con)

I am not aware one way or the other, but I will happily look into that to see what further safeguards we can add so that we are not bombarding people who are too young with this material.

Lord Kamall (Con)

May I make a suggestion to my noble friend the Minister? It might be worth asking the legal people to get the right wording, but if there are different ages at which people can vote in different parts of the United Kingdom, surely it would be easier just to relate it to the age at which they are able to vote in those elections. That would address a lot of the concerns that many noble Lords are expressing here today.

Lord Clement-Jones (LD)

My Lords, this whole area of democratic engagement is one that the Minister will need to explain in some detail. This is an Alice in Wonderland schedule: “These words mean what I want them to mean”. If, for instance, you are engaging with the children of a voter—at 14, they are children—is that democratic engagement? You could drive a coach and horses through Schedule 1. The Minister used the word “necessary”, but he must give us rather more than that. It was not very reassuring.

--- Later in debate ---
Viscount Camrose (Con)

My Lords, I rise to speak to a series of minor and technical, yet necessary, government amendments which, overall, improve the functionality of the Bill. I hope the Committee will be content if I address them together. Amendments 20, 42, 61 and 63 are minor technical amendments to references to special category data in Clauses 6 and 14. All are intended to clarify that references to special category data mean references to the scope of Article 9(1) of the UK GDPR. They are simply designed to improve the clarity of the drafting.

I turn now to the series of amendments that clarify how time periods within the data protection legal framework are calculated. For the record, these are Amendments 136, 139, 141, 149, 151, 152, 176, 198, 206 to 208, 212 to 214, 216, 217, 253 and 285. Noble Lords will be aware that the data protection legislation sets a number of time periods or deadlines for certain things to happen, such as responding to subject access requests; in other words, at what day, minute or hour the clock starts and stops ticking in relation to a particular procedure. The Data Protection Act 2018 expressly applies the EU-derived rules on how these time periods should be calculated, except in a few instances where it is more appropriate for the UK domestic approach to apply, for example time periods related to parliamentary procedures. I shall refer to these EU-derived rules as the time periods regulation.

In response to the Retained EU Law (Revocation and Reform) Act 2023, we are making it clear that the time periods regulation continues to apply to the UK GDPR and other regulations that form part of the UK’s data protection and privacy framework, for example, the Privacy and Electronic Communications (EC Directive) Regulations 2003. By making such express provision, our aim is to ensure consistency and continuity and to provide certainty for organisations, individuals and the regulator. We have also made some minor changes to existing clauses in the Bill to ensure that application of the time periods regulation achieves the correct effect.

Secondly, Amendment 197 clarifies that the requirement to consult before making regulations that introduce smart data schemes may be satisfied by a consultation before the Bill comes into force. The regulations must also be subject to affirmative parliamentary scrutiny to allow Members of both Houses to scrutinise legislation. This will facilitate the rapid implementation of smart data schemes, so that consumers and businesses can start benefiting as soon as possible. The Government are committed to working closely with business and wider stakeholders in the development of smart data.

Furthermore, Clause 96(3) protects data holders from the levy that may be imposed to meet the expenses of persons and bodies performing functions under smart data regulations. This levy cannot be imposed on data holders that do not appear capable of being directly affected by the exercise of those functions.

Amendment 196 extends that protection to authorised persons and third-party recipients on whom the levy may also be imposed. Customers will not have to pay to access their data, only for the innovative services offered by third parties. We expect that smart data schemes will deliver significant time and cost savings for customers.

The Government are committed to balancing the incentives for businesses to innovate and provide smart data services with ensuring that all customers are empowered through their data use and do not face undue financial barriers or digital exclusion. Any regulations providing for payment of the levy or fees will be subject to consultation and to the affirmative resolution procedure in Parliament.

Amendments 283 and 285 to Schedule 15 confer a general incidental power on the information commission. It will have the implied power to do things incidental to or consequential upon the exercise of its functions, for example, to hold land and enter into agreements. This amendment makes those implicit powers explicit for the avoidance of doubt and in line with standard practice. It does not give the commission substantive new powers. I beg to move.

Lord Kamall (Con)

My Lords, I know that these amendments were said to be technical amendments, so I thought I would just accept them, but when I saw the wording of Amendment 283 some alarm bells started ringing. It says:

“The Commission may do anything it thinks appropriate for the purposes of, or in connection with, its functions”.


I know that the Minister said that this is stating what the commission is already able to do, but I am concerned whenever I see those words anywhere. They give a blank cheque to any authority or organisation.

Many noble Lords will know that I have previously spoken about the principal-agent theory in politics, in which certain powers are delegated to an agency or regulator, but what accountability does it have? I worry when I see that it “may do anything … appropriate” to fulfil its tasks. I would like some assurance from the Minister that there is a limit to what the information commission can do and some accountability. At a time when many of us are asking who regulates the regulators and when we are looking at some of the arm’s-length bodies—need I mention the Post Office?—there is some real concern about accountability.

I understand the reason for wanting to clarify or formalise what the Minister believes the information commission is doing already, but I worry about this form of words. I would like some reassurance that it is not wide-ranging and that there is some limit and accountability to future Governments. I have seen this sentiment across the House; people are asking who regulates the regulators and to whom are they accountable.

Lord Clement-Jones (LD)

My Lords, I must congratulate the noble Lord, Lord Kamall. Amid a blizzard of technical and minor amendments from the Minister, he forensically spotted one to raise in that way. He is absolutely right. The Industry and Regulators Committee has certainly been examining the accountability and scrutiny devoted to regulators, so we need to be careful in the language that we use. I think we have to take a lot on trust from the Minister, particularly in Grand Committee.

I apparently failed to declare an interest at Second Reading. I forgot to state that I am a consultant to DLA Piper and the Whips have reminded me today that I failed to do so on the first day in Committee, so I apologise to the Committee for that. I am not quite sure why my consultancy with DLA Piper is relevant to the data protection Bill, but there it is. I declare it.

Lord Kamall (Con)

I should also declare an interest. I apologise that I did not do so earlier. I worked with a think tank and wrote a series of papers on who regulates the regulators. I still have a relationship with that think tank.

Lord Bassam of Brighton (Lab)

My Lords, I have been through this large group and, apart from my natural suspicion that there might be something dastardly hidden away in it, I am broadly content, but I have a few questions.

On Amendment 20, can the Minister confirm that the new words “further processing” have the same meaning as the reuse of personal data? Can he confirm that Article 5(1)(b) will prohibit this further processing when it is not in line with the original purpose for which the data was collected? How will the data subject know that is the case?

On Amendment 196, to my untutored eye it looks like the regulation-making power is being extended away from the data holder to include authorised persons and third-party recipients. My questions are simple enough: was this an oversight on the part of the original drafters of that clause? Is the amendment an extension of those captured by the effect of the clause? Is it designed to achieve consistency across the Bill? Finally, can I assume that an authorised person or third party would usually be someone acting on behalf of, or as an agent of, the data holder?

I presume that Amendments 198, 212 and 213 are needed because of a glitch in the drafting—similarly with Amendment 206. I can see that Amendments 208, 216 and 217 clarify when time periods begin, but why are the Government seeking to disapply time periods in Amendment 253 when surely some consistency is required?

Finally—I am sure the Minister will be happy about this—I am all in favour of flexibility, but Amendment 283 states that the Information Commissioner has the power to do things to facilitate the exercise of his functions. The noble Lord, Lord Kamall, picked up on this. We need to understand what those limits are. On the face of it, one might say that the amendment is sensible, but it seems rather general and broad in its application. As the noble Lord, Lord Kamall, rightly said, we need to see what the limits of accountability are. This is one of those occasions.

Viscount Camrose (Con)

I thank the noble Lords, Lord Kamall and Lord Bassam, for their engagement with this group. On the questions from the noble Lord, Lord Kamall, these are powers that the ICO would already have in common law. In line with what I understand is now best practice with all Bills, they are put on a statutory footing in this Bill. The amendment does not confer substantive new powers but clarifies the powers that the regulator already has. I can also confirm that the ICO was and remains accountable to Parliament.

Lord Kamall (Con)

I am sorry to intervene as I know that noble Lords want to move on to other groups, but the Minister said that the ICO remains accountable to Parliament. Will he clarify how it is accountable to Parliament for the record?

Viscount Camrose (Con)

The Information Commissioner is directly accountable to Parliament in that he makes regular appearances in front of Select Committees that scrutinise the regulator’s work, including progress against objectives.

The noble Lord, Lord Bassam, made multiple important and interesting points. I hope he will forgive me if I undertake to write to him about those; there is quite a range of topics to cover. If there are any on which he requires answers right away, he is welcome to intervene.

Data Protection and Digital Information Bill

Baroness Harding of Winscombe (Con)

My Lords, as is so often the case on these issues, it is daunting to follow the noble Baroness as she has addressed the issues so comprehensively. I speak in support of Amendment 57, to which I have added my name, and register my support for my noble friend Lord Holmes’s Amendment 59A, but I will begin by talking about the Clause 14 stand part notice.

Unfortunately, I was not able to stay for the end of our previous Committee session so I missed the last group on automated decision-making; I apologise if I cover ground that the Committee has already covered. It is important to start by saying clearly that I am in favour of automated decision-making and the benefits that it will bring to society in the round. I see from all the nodding heads that we are all in the same place—interestingly, my Whip is shaking his head. We are trying to make sure that automated decision-making is a force for good and to recognise that anything involving human beings—even automated decision-making does, because human beings create it—has the potential for harm as well. Creating the right guard-rails is really important.

Like the noble Baroness, Lady Kidron, until I understood the Bill a bit better, I mistakenly thought that the Government’s position was not to regulate AI. But that is exactly what we are doing in the Bill, in the sense that we are loosening regulation around automated decision-making and expanding the ability to make use of it. While that may be the right answer, I do not think we have thought about it in enough depth or scrutinised it in enough detail. There are so few of us here; I do not think we quite realise the scale of the impact of this Bill and this clause.

I too feel that the clause should be removed from the Bill—not because it might not ultimately be the right answer but because this is something that society needs to debate fully and comprehensively, rather than it sneaking into a Bill that not enough people, either in this House or the other place, have really scrutinised.

I assume I am going to lose that argument, so I will briefly talk about Amendment 57. Even if the Government remain firm that there is “nothing to see here” in Clause 14, we know that automated decision-making can do irreparable harm to children. Those of us who have worked on child internet safety—most of us for at least a decade—regret that we failed to get in greater protections earlier. We know of the harm done to children because there have not been the right guard-rails in the digital world. We must have debated together for hours and hours why the harms in the algorithms of social media were not expressly set out in the Online Safety Act. This is the same debate.

It is really clear to me that it should not be possible to amend the use of automated decision-making to in any way reduce protections for children. Those protections have been hard fought and ensure a higher bar for children’s data. This is a classic example of where the Bill reduces that, unless we are absolutely explicit. If we are unable to persuade the Government to remove Clause 14, it is essential that the Bill is explicit that the Secretary of State does not have the power to reduce data protection for children.

Lord Kamall (Con)

My Lords, I speak in favour of the clause stand part notice in my name and that of the noble Lord, Lord Clement-Jones.

Lord Harlech (Con)

The noble Lord missed the start of the debate.

Lord Kamall (Con)

I apologise and thank the noble Lord for his collegiate approach.

Baroness Jones of Whitchurch (Lab)

My Lords, I thank all noble Lords who have contributed to this debate. We have had a major common theme, which is that any powers exercised by the Secretary of State in Clause 14 should be to enhance, rather than diminish, the protections for a data subject affected by automated decision-making. We have heard some stark and painful examples of the way in which this can go wrong if it is not properly regulated. As noble Lords have said, this seems to be regulation on automated decision-making by the backdoor, but with none of the protections and promises that have been made on this subject.

Our Amendment 59 goes back to our earlier debate about rights at work when automated decision-making is solely or partly in operation. It provides an essential underpinning of the Secretary of State’s powers. The Minister has argued that ADM is a new development and that it would be wrong to be too explicit about the rules that should apply as it becomes more commonplace, but our amendment cuts through those concerns by putting key principles in the Bill. They are timeless principles that should apply regardless of advances in the adoption of these new technologies. They address the many concerns raised by workers and their representatives, about how they might be disfranchised or exploited by machines, and put human contact at the heart of any new processes being developed. I hope that the Minister sees the sense of this amendment, which will provide considerable reassurance for the many people who fear the impact of ADM in their working lives.

I draw attention to my Amendments 58 and 73, which implement the recommendations of the Delegated Powers and Regulatory Reform Committee. In the Bill, the new Articles 22A to 22D enable the Secretary of State to make further provisions about safeguards when automated decision-making is in place. The current wording of new Article 22D makes it clear that regulations can be amended

“by adding or varying safeguards”.

The Delegated Powers Committee quotes the department saying that

“it does not include a power to remove safeguards provided in new Article 22C and therefore cannot be exercised to weaken the protections”

afforded to data subjects. The committee is not convinced that the department is right about this, and we agree with its analysis. Surely “vary” means that the safeguards can move in either direction—to improve or reduce protection.

The committee also flags up concerns that the Bill’s amendments to Sections 49 and 50 of the Data Protection Act make specific provision about the use of automated decision-making in the context of law enforcement processing. In this new clause, there is an equivalent wording, which is that the regulations may add or vary safeguards. Again, we agree with its concerns about the application of these powers to the Secretary of State. It is not enough to say that these powers are subject to the affirmative procedure because, as we know and have discussed, the limits on effective scrutiny of secondary legislation are manifest.

We have therefore tabled Amendments 58 and 73, which make it much clearer that the safeguards cannot be reduced by the Secretary of State. The noble Lord, Lord Clement-Jones, has a number of amendments with a similar intent, which is to ensure that the Secretary of State can add new safeguards but not remove them. I hope the Minister is able to commit to taking on board the recommendations of the Delegated Powers Committee in this respect.

The noble Baroness, Lady Kidron, once again made the powerful point that the Secretary of State’s powers to amend the Data Protection Act should not be used to reduce the hard-won standards and protections for children’s data. As she says, safeguards do not constitute a right, and having regard to the issues is a poor substitute for putting those rights back into the Bill. So I hope the Minister is able to provide some reassurance that these hard-won rights will be put back into the Bill, where they belong.

I am sorry that the noble Lord, Lord Holmes, is not here. His amendment raises an important point about the need to build in the views of the Information Commissioner, which is a running theme throughout the Bill. He makes the point that we need to ensure, in addition, that a proper consultation of a range of stakeholders goes into the Secretary of State’s deliberations on safeguards. We agree that full consultation should be the hallmark of the powers that the Secretary of State is seeking, and I hope the Minister can commit to taking those amendments on board.

I echo the specific concerns of the noble Lord, Lord Clement-Jones, about the impact assessment and the supposed savings from changing the rules on subject access requests. This is not specifically an issue for today’s debate but, since it has been raised, I would like to know whether he is right that the savings are estimated to be 50% and not the 1% that the Minister suggested when we last debated this. I hope the Minister can clarify this discrepancy on the record, and I look forward to his response.

--- Later in debate ---
It is worth stressing that this amendment does not say how the Government should do this; it simply sets out the principle that the Government should do it. It is not open to the argument that the amendment should be drafted differently or approached in another way; it simply says that we should free the postal address file. I very much hope to hear positive words and a positive direction from the Minister on Amendment 252.
Lord Kamall (Con)

My Lords, I apologise for not being here on Monday, when I wanted to speak about automated decision-making. I was not sure which group to speak on today; I am thankful that my noble friend Lord Harlech intervened to ensure that I spoke on this group and made my choice much easier.

I want to speak on Amendments 74 to 77 because transparency is essential. However, one of the challenges about transparency is to ensure you understand what you are reading. I will give noble Lords a quick example: when I was in the Department of Health and Social Care, we had a scheme called the voluntary pricing mechanism for medicines. Companies would ask whether that could be changed and there could be a different relationship because they felt that they were not getting enough value from it. I said to the responsible person in the department, “I did engineering and maths, so can you send me a copy of the algorithm?” He sent it to me, and it was 100 pages long. I said, “Does anyone understand this algorithm?”, and he said, “Oh yes, the analysts do”. I was about to get a meeting, but then I was moved to another department. That shows that even if we ask for transparency, we have to make sure that we understand what we are being given. As the noble Lord, Lord Clement-Jones, has worded this, we have to make sure that we understand the functionality and what it does at a high enough level.

My noble friend Lady Harding often illustrates her points well with short stories. I am going to do that briefly with two very short stories. I promise to keep well within the time limit.

A few years ago, I was on my way to catch a flight to Strasbourg because I was a Member of the European Parliament. My train got stuck, and I missed my flight. My staff booked me a new ticket and sent me the boarding pass. I got to the airport, which was fantastic, and got through the gate and was waiting for my flight in a waiting area. They called to start boarding and, when I went to go on, they scanned my pass again and I was denied boarding. I asked why I was denied, having been let into the gate area in the first place, but no one could explain why. To cut a long story short, over two hours, four or five people from that company gaslighted me. Eventually, when I got back to the check-in desk, which the technology was supposed to avoid in the first place, it was explained that they had sent me an email the day before. In fact, they had not sent me an email the day before, which they admitted the day after, but no one ever explained why I was not allowed on that flight.

Imagine that in the public sector. I can accept it, although it was awful behaviour by that company, but imagine that happening for a critical operation that had been automated to cut down on paperwork. Imagine turning up for your operation when you are supposed to scan your barcode to be let into the operating theatre. What happens if there is no accountability or transparency in that case? This is why the amendments tabled by the noble Lord, Lord Clement-Jones, are essential.

Here is another quick story. A few years ago, someone asked me whether I was going to apply for one of these new fintech banks. I submitted the application and the bank said that it would get back to me within 48 hours. It did not. Two weeks later, I got a message on the app saying that I had been rejected, that I would not be given an account and that “by law, we do not have to explain why”.

Can you imagine that same technology being used in the public sector, with a WYSIWYG on the fantastic NHS app that we have now? Imagine booking an appointment then suddenly getting a message back saying, “Your appointment has been denied but we do not have to explain why”. These Amendments 74 to 78 must be given due consideration by the Government because it is absolutely essential that citizens have full transparency on decisions made through automated decision-making. We should not allow the sort of technology that was used by easyJet and Monzo in this case to permeate the public sector. We need more transparency—it is absolutely essential—which is why I support the amendments in the name of the noble Lord, Lord Clement-Jones.

Baroness Harding of Winscombe (Con)

My Lords, I associate myself with the comments that my noble friend Lord Kamall just made. I have nothing to add on those amendments, as he eloquently set out why they are so important.

In the spirit of transparency, my intervention enables me to point out, were there any doubt, who I am as opposed to the noble Baroness, Lady Bennett, who was not here earlier but who I was mistaken for. Obviously, we are not graced with the presence of my noble friend Lord Maude, but I am sure that we all know what he looks like as well.

I will speak to two amendments. The first is Amendment 144, to which I have added my name. As usual, the noble Baroness, Lady Kidron, has said almost everything that can be said on this but I want to amplify two things. I have yet to meet a politician who does not get excited about the two-letter acronym that is AI. The favoured statement is that it is as big a change in the world as the discovery of electricity or the invention of the wheel. If it is that big—pretty much everyone in the world who has looked at it probably thinks it is—we need properly to think about the pluses and the minuses of the applications of AI for children.

The noble Baroness, Lady Kidron, set out really clearly why children are different. I do not want to repeat that, but children are different and need different protections; this has been established in the physical world for a very long time. With this new technology that is so much bigger than the advent of electricity and the creation of the first automated factories, it is self-evident that we need to set out how to protect children in that world. The question then is: do we need a separate code of practice on children and AI? Or, as the noble Baroness set out, is this an opportunity for my noble friend the Minister to confirm that we should write into this Bill, with clarity, an updated age-appropriate design code that recognises the existence of AI and all that it could bring? I am indifferent on those two options but I feel strongly that, as we have now said on multiple groups, we cannot just rely on the wording in a previous Act, which this Bill aims to update, without recognising that, at the same time, we need to update what an age-appropriate design code looks like in the age of AI.

The second amendment that I speak to is Amendment 252, on the open address file. I will not bore noble Lords with my endless stories about the use of the address file during Covid, but I lived through and experienced the challenges of this. I highlight an important phrase in the amendment. Proposed new subsection (1) says:

“The Secretary of State must regularly publish a list of UK addresses as open data to an approved data standard”.


One reason why it is a problem for this address data to be held by an independent private company is that the quality of the data is not good enough. That is a real problem if you are trying to deliver a national service, whether in the public sector or the private sector. If the data quality is not good enough, it leaves us substantially poorer as a country. This is a fundamental asset for the country and a fundamental building block of our geolocation data, as the noble Lord, Lord Clement-Jones, set out. Anybody who has tried to build a service that delivers things to human beings in the physical world knows that errors in the database can cause huge problems. It might not feel like a huge problem if it concerns your latest Amazon delivery but, if it concerns the urgent dispatch of an ambulance, it is life and death. Maintaining the accuracy of the data and holding it close as a national asset is therefore hugely important, which is why I lend my support to this amendment.

Data Protection and Digital Information Bill

Finally, our Amendment 195 picks up another recommendation of the Delegated Powers and Regulatory Reform Committee, which relates to setting fees for people seeking entry or modifying details on the register of providers of verification services. The powers are given to the Secretary of State to set these fees with no reference to Parliament. The Delegated Powers Committee recommends that there should be parliamentary scrutiny using the negative procedure. We agree with this point and this is reflected in our amendment. I therefore beg to move Amendment 177.
Lord Kamall (Con)

My Lords, I speak in favour of Amendment 195ZA in my name and that of the noble Lords, Lord Vaux of Harrowden and Lord Clement-Jones, and Amendments 289 and 300 on digital identity theft. I am also very sympathetic to many of the points made by the noble Baroness, Lady Jones of Whitchurch, particularly about the most disadvantaged people in our society.

As many noble Lords know, I am a member of the Communications and Digital Committee of this House. A few months ago, we did a report on digital exclusion. One of the issues we found was quite clear: even though some people may partly use digital services—for example, they may have an email address—that does not make them digitally proficient or literate. As more and more of our public and private services go online, it is obvious that companies and others will want to verify who is claiming to use these services. At the same time, a number of people will not be digitally literate or will not have a digital ID available. It is important that we offer them enough alternatives. It should be clear, and not beyond the wit of man or clever lawyers, that there are non-digital alternatives available for consumers and particularly, as was said by the noble Baroness, Lady Jones of Whitchurch, people from disadvantaged communities.

As we found in the report on our inquiry into digital exclusion, this does not concern only people from deprived areas. Sometimes people get by in life without much digital literacy. There are those who may be scared of it or who do not trust it, and they can come from all sorts of wealth brackets. This drives home the point that it is important to have an alternative. I cannot really say much more than the amendment itself; it does what it says on the tin. The amendment is quite clear and I am sure that the noble Lord, Lord Vaux, will speak to it as well.

I will briefly speak in favour of Amendments 289 and 300. Digital identity theft is clearly an issue and has been for a long time. Identity theft was an issue even before the digital age, and it is so much easier to hack someone’s ID these days. I have had bank accounts opened in my name. I received a letter claiming this but, fortunately, the bank was able to deal with it when I walked in and said, “This wasn’t me”. It is quite clear that this will happen more and more. Sometimes it will happen because data has been leaked or a system is not particularly secure; at other times, it will be because you have been careless. No matter why the crime is committed, it must be an offence in the terms suggested by the amendments of the noble Lord, Lord Clement-Jones. It is clear that we have to send a strong signal that digital identity theft is a crime and that people should be deterred from engaging in it.

Lord Vaux of Harrowden (CB)

My Lords, I have added my name to Amendment 195ZA—I will get to understand where these numbers come from, at some point—in the name of the noble Lord, Lord Kamall, who introduced it so eloquently. I will try to be brief in my support.

For many people, probably most, the use of online digital verification will be a real benefit. The Bill puts in place a framework to strengthen digital verification so, on the whole, I am supportive of what the Government are trying to do, although I think that the Minister should seriously consider the various amendments that the noble Baroness, Lady Jones of Whitchurch, has proposed to strengthen parliamentary scrutiny in this area.

However, not everyone will wish to use digital verification in all cases, perhaps because they are not sufficiently confident with technology or perhaps they simply do not trust it. We have already heard the debates around the advances of AI and computer-based decision-making. Digital identity verification could be seen to be another extension of this. There is a concern that Part 2 of the Bill appears to push people ever further towards decisions being taken by a computer.

I suspect that many of us will have done battle with some of the existing identity verification systems. In my own case, I can think of one bank where I gave up in deep frustration as it insisted on telling me that I was not the same person as my driving licence showed. I have also come up against systems used by estate agents when trying to provide a guarantee for my student son that was so intrusive that I, again, refused to use it.

Therefore, improving verification services is to be encouraged, but there must be some element of choice; if someone does not have the know-how, confidence or trust to use these systems, they should be able to verify their identity through some non-digital alternative. They should not be barred from using important services such as, in my examples, banking and renting a property because they cannot, or would prefer not to, use a digital verification service.

At the very least, even if the Minister is not minded to accept that amendment, I hope that he can make clear that the Government have no intention to make digital ID verification mandatory, as some have suggested that this Part 2 may be driving towards.

--- Later in debate ---
Lord Clement-Jones (LD)

Is the Minister saying that, as a result of the Equality Act, there is an absolute right to that analogue—if you like—form of identification if, for instance, someone does not have access to digital services?

Lord Kamall (Con)

I understand that some services are purely digital, but some of the people using them may well not have a digital ID. We do not know what future services there might be, so those people might want to show an analogue ID. Is my noble friend saying that that will not be possible because it would impose too much of a burden on those innovative digital companies? Could he clarify what he said?

Data Protection and Digital Information Bill

Department: Department for Work and Pensions
There are also companies that have regularly short-changed the public purse; for example, by overcharging the NHS for drugs. They, too, are clearly depleting the public purse. UK emergency services were overcharged by £200 million a year for a communications network. No companies faced any surveillance. Energy companies are overcharging, and defence companies have overcharged the Government by millions. Serco and G4S have overcharged the Government for monitoring services. What I am saying is that the bank accounts of any company, individual or organisation receiving public money ought to be given the same monitoring treatment as the Government are proposing for those receiving public money in the form of benefits and pensions.
Lord Kamall (Con)

My Lords, I will speak in favour of the amendment to which I have added my name, with other noble Lords here today, and also to some of the other amendments in the group. I find it interesting having to follow the noble Lord, Lord Sikka. Quite often we disagree on issues, and we are probably coming at this from different angles, but actually we have come to the same conclusion.

Noble Lords will know of my concerns raised at earlier stages about automated decision-making. We have to ensure that there is always human intervention but, even when there is human intervention, things can go seriously wrong. When I first saw this proposal for mass trawling of bank accounts, I have to say that the first thought that came into my mind was, “This is Big Brother”, so I was not surprised when I received an email and a briefing from Big Brother Watch. I thank Big Brother Watch for its briefing; I will quickly dip into some of the points it made, though there are many more.

People may find it interesting that the noble Lord, Lord Sikka, and I are speaking on this amendment from different angles. Let me be quite clear: I am a classical liberal. Some people call me a libertarian. I believe in a smaller state and government doing less, but there has to be a state to help those who cannot help themselves and people who have fallen on hard times. For some people, that is all they have. They have only the state benefit. There are no local community organisations or civil society organisations to help them, and therefore you have to accept that role for the state.

Those people are quite often the most vulnerable, the least represented and unable to speak up for themselves. I do not want to patronise them, but quite often you find that. When I saw this, I thought, “First of all, this is going to force third-party organisations to trawl”—I use the term advisedly—“customers’ accounts in search of matching accounts”. When we talk about those third-party organisations, we are talking about banks, landlords and a number of other organisations that have some financial relationship with those individuals. Some estimates suggest that approximately 40% of the population could be vulnerable to being trawled.

I am also worried about the precedent that this sets. I know that the noble Lord, Lord Sikka, talked about this in a different way. He would perhaps like this power to be extended. I do not want this power at all. I do not want it to be extended to others. I just do not want it at all.

I also worry about what this surveillance power does to the presumption of innocence. Are we just trawling everyone’s accounts in the hope that they will be found guilty? While I do not always agree with the Information Commissioner’s Office, we should note that it does not view these powers as proportionate.

One general concern that a number of noble Lords have is about AI and, in particular, the transparency of datasets and algorithms. We would want to know, even if we do not understand the algorithm itself, as the noble Lord, Lord Clement-Jones, and I discussed in a debate on earlier amendments, what these algorithms are supposed to be doing and what they are looking for in trawling people’s bank accounts.

There are precedents for this. We see from financial institutions’ suspicious activity reports that they have a very high false hit rate. I have a friend who is a magistrate, who told me that she heard a case about a family who wanted to get back access to their bank account. She felt that they were under suspicion because of their ethnicity or faith, and said to the bank, “You have not made a clear case for why we should freeze this account. This family has suffered because they are not able to access their bank account”. Think about that mistake being repeated over and over with false positives from this data. The noble Baroness, Lady Kidron, was right to remind us that this is all against the background of the Horizon scandal. Even when people intervene, do they speak up enough to make sure that the victims are heard or does it need an ITV drama to raise these issues?