Committee (3rd Day)
Scottish, Welsh and Northern Ireland Legislative Consent sought.
13:00
Clause 14: Automated decision-making
Amendment 53
Moved by
Lord Clement-Jones
53: Clause 14, page 27, line 21, leave out “is, or”
Member’s explanatory statement
This amendment, along with others in the name of Lord Clement-Jones, would retain the ability of the Secretary of State to introduce new safeguards but would prevent the removal or variation of safeguards under the new UK GDPR Article 22D and the new section 50D of the 2018 Act.
Lord Clement-Jones (LD)

My Lords, once more into the trenches we go before Easter. In moving Amendment 53, I will also speak to Amendments 54, 55, 57, 69, 70, 71 and 72 and the Clause 14 stand part notice.

The Bill contains a number of wide delegated powers, giving the Secretary of State the power to amend the UK GDPR via statutory instrument. The Government have said that the UK GDPR’s key elements remain sound and that they want to continue to offer a high level of protection for the public’s data, but that is no guarantee against significant reforms being brought in through a process that evades the full parliamentary scrutiny afforded to primary legislation. Proposed changes to the UK GDPR should be contained in the Bill, where they can be debated and scrutinised properly via the primary legislation process. As it stands, key provisions of the UK GDPR can subsequently be amended via statutory instrument, which, in this case, is an inappropriate legislative process that affords much less scrutiny and debate, if debates are held at all.

The UK GDPR treats a solely automated decision as one without “meaningful human involvement”. The public are protected from being subject to solely automated decision-making where the decision has a legal or “similarly significant effect”. Clause 14(1) inserts new Article 22D(1) into the UK GDPR, which allows the Secretary of State to make regulations that deem a decision to have involved “meaningful human involvement”, even if there was no active review by a human decision-maker. New Article 22D(2) similarly allows the Secretary of State to make regulations determining whether a decision has a “similarly significant effect” to a legal effect. For example, in summer 2020 there was the A-level algorithm grading scandal. If something like that were to reoccur, under this new power a Minister could lay regulations stating that the decision to use an algorithm in grading A-levels was not a decision with a “similarly significant effect”.

New Article 22D(4) also allows the Secretary of State to add or remove, via regulations, any of the listed safeguards for automated decision-making. If the Government wish to amend or remove safeguards on automated decision-making, that should also be specified in the Bill and not left to delegated legislation. Amendments 53 to 55 and 69 to 72 would limit the Secretary of State’s power, so that they may add safeguards but cannot vary or remove those in the new Article 22D, as they stand, when the legislation comes into force.

If the clause is to be retained, we support Amendment 59A in the name of the noble Lord, Lord Holmes, which requires the Information Commissioner’s Office to develop guidance on the interpretation of the safeguards in new Article 22C and on important terms such as “similarly significant effect” and “meaningful human involvement”. It is within the Information Commissioner’s Office’s duties to issue guidance and to harmonise the interpretation of the law. As the dedicated regulator, the ICO is best placed and equipped to publish guidance and ensure consistency of application.

As a way to increase protections and incorporate more participation from those affected, Amendment 59A would add a new paragraph (7) to new Article 22D, which specifies that the Secretary of State needs to consult the Information Commissioner’s Office when developing regulations. It also includes an obligation for the Secretary of State to consult data subjects or their representatives, such as trade unions or civil society organisations, at least every two years from the commencement of the Bill.

Our preference is for Clause 14 not to stand part of the Bill. The deployment of automated decision-making under Clause 14 risks automating harm, including discrimination, without adequate safeguards. Clause 14 creates a new starting point for all ADM using personal, but not special category, data: it is allowed, including for profiling, provided that certain safeguards are in place. The Minister said those safeguards are “appropriate” and “robust” and provide “certainty”, but I preferred what the noble Lord, Lord Bassam, said about the clause:

“We need more safeguards. We have moved from one clear position to another, which can be described as watering down or shifting the goalposts”.—[Official Report, 25/3/24; col. GC 150.]


That is very much my feeling about the clause as well.

I refer back to the impact assessment, which we touched on during our debate on Clause 9. It is very interesting that, in table 15 of the impact assessment, the savings on compliance costs are something like £7.3 million as regards AI and machine learning, which does not seem a very big number compared with the total savings on compliance costs, which the Government have put, rather optimistically, at £295 million.

In passing, I should say that, when I look at the savings regarding subject access requests, I see that the figure is £153 million, which is half of those so-called savings on compliance costs. I cannot square that at all with what the Minister said about the total savings on compliance costs for subject access requests being 1%. I do not know quite where those figures come from, but it is a far more significant percentage: it is 50% of what the Government believe the savings on compliance costs will be. I know that it is not part of this group, but I would be very grateful if the Minister could write to clarify that issue in due course.

Although the Minister has called these safeguards adequate, we believe that they are inadequate for three reasons. First, they shift the burden to the individual. Secondly, there is no obligation to provide any safeguards before the decision is made: neither the Bill nor any of the material associated with it indicates what the content of this information is expected to be, nor the timescales in which that information is to be given. There is nothing to say when representations may be made or a decision contested, when human intervention may be sought, or what the level of that intervention will be. Thirdly, the Secretary of State has delegated powers to vary the safeguards by regulations.

Article 22 is currently one of the strongest prohibitions in the GDPR. As we know, the current starting point is that using solely automated decision-making is prohibited unless certain exemptions apply. The exemptions are limited. Now, as a result of the Government’s changes, you can use solely automated decision-making in an employment context in the UK, which you cannot do in the EU. That is a clear watering down of the restriction. The Minister keeps returning to the safeguards, but I have referred to those. We know that they are not being applied in practice even now and that hiring and firing is taking place without any kind of human review.

There is therefore an entirely inadequate basis on which we can be satisfied that the Bill will safeguard individuals from harmful automated decision-making before it is too late. In fact, the effect of the Bill will be to do the opposite: to permit unfair and unsafe ADM to occur, including discriminatory profiling ADM, which causes harm to individuals. It then places the burden on the individual to complain, without providing for any adequate safeguards to guarantee their ability to do so before the harm is already incurred. While I beg to move Amendment 53, our preference would be that Clause 14 is deleted from the Bill entirely.

Baroness Kidron (CB)

My Lords, I will speak to Amendment 57 in my name, Amendment 59 in the name of the noble Baroness, Lady Jones, and the Clause 14 stand part notice from the noble Lord, Lord Clement-Jones. In doing so, I register my support for Amendment 59A in the name of the noble Lord, Lord Holmes.

The Government assert that there is no diminution of rights in the Bill, yet Clause 14 removes the right not to be subject to an automated decision and replaces that right with inadequate safeguards, as the noble Lord, Lord Clement-Jones, said. On the previous day in Committee, the Minister made the argument that:

“These reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles”,—[Official Report, 25/3/24; col. GC 146.]


but I hope he will at least accept that safeguards do not constitute a right. The fact that the Secretary of State has delegated powers to change the safeguards at will undermines his argument that UK citizens have lost nothing at all; they have lost the right not to be subject to an automated decision.

The fact that the Government have left some guard-rails for special category data is in itself an indication that they know they are downgrading UK data rights, because the safeguards in place are not adequate. If they were adequate, it would be unnecessary to separate out special category data in this way. I hammer the point home by asking the Minister to explain how the protections will work in practice in an era of AI when risks can come from inference and data analytics that do not use special category data but will still have a profound impact on the work lives, health, finances and opportunities of data subjects. If data about your neighbourhood, shopping habits, search results, steps or entertainment choices is used to infer an important decision, how would a data subject activate their rights in that case?

As an illustration of this point, the daughter of a colleague of mine, who, as it happens, has deep expertise in data law, this year undertook a video-based interview for a Russell Group university with no human contact. It was not yet an ADM system, but we are inching ever closer to it. Removing the right, as the Government propose, would place the onus on students to complain or intervene—in a non-vexatious manner, of course. Will the Minister set out how UK citizens will be protected from life-changing decisions after government changes to Article 22, particularly as, in conjunction with other changes such as those to subject access requests and data impact assessments, UK citizens are about to have fewer routes to justice and less transparency over what is happening to their data?

I would also be grateful if the Minister could speak to whether he believes that the granularity and precision of the profiling currently deployed by AI and machine learning are sufficiently guaranteed to justify taking this fundamental right away. Similarly, I hope that the known concerns about bias and fairness in ADM will be resolved over time, but we are not there yet, so why is it that the Government have a wait-and-see policy on regulation but are not offering the same “wait and see” in relation to data rights?

On Amendment 59 in the name of the noble Baroness, Lady Jones, the number of workers anticipated to be impacted by AI is simply eye-watering. In last Friday’s debate on AI, it was said to be 300 million worldwide, and one in four across Europe. But how workers work with AI is not simply a scary vision of the near future; it is here now.

I have a family member who last year left an otherwise well-paid and socially useful job when his employer introduced surveillance on to his computer while he was working from home. At the time, he said that the way in which it impacted on both his self-esteem and his autonomy was so devastating that he felt like

“a cog in a machine or an Amazon worker with no agency or creativity”.

He was an exemplary employee: top of the bonus list and in all measurable ways the right person in the right job. Efficiency in work has a vital role but it is not the whole picture. We know that, if able and skilled workers lose their will to work, it comes at a considerable cost to the well-being of the nation and the public purse. Most jobs in future will involve working with or even collaborating with technology; ensuring that work is dignified and fair to the human components of this arrangement is not a drag on productivity but a necessity if society is to benefit from changes to technology.

13:15
Amendment 57 in my name would prevent the Secretary of State making any amendments to new Articles 22A, 22B or 22C if such amendments reduce, minimise or undermine the existing standards and protections for children’s data. I hope I have made it clear that I am not setting myself against automated decision-making. Training a model using thousands of scans of people’s lungs can enhance a doctor’s ability to identify potential tumours accurately. Nor do I wish for children to miss out on the benefits of such technology, as the Minister appeared to suggest last week; my point is merely that it should be deployed only when it is in their best interests, as discussed in our debate on the previous group.
Moreover, if the noble Lord, Lord Clement-Jones, is successful in his desire that Clause 14 should not stand part of the Bill, this amendment and Amendment 46 will be unnecessary, but noble Lords will recognise a steady drum beat of resistance against the Government’s plans to change data rights to benefit the commercial interests of tech companies at the expense of children.
In his answer relating to legitimate interests, the Minister pointed out that, when amending or adding to Annexe 1, the Secretary of State already has a duty to have regard to
“the need to provide children with special protection with regard to their personal data”.
Unless the Minister can tell me otherwise, I believe that is the only instance where she is required to do so when exercising her powers. So there is a place for some of the broader amendments from the second group that speak to the status of children throughout the Bill. I remind the Minister of my suggestion that recital 38 be put on the face of the Bill, as the Government have done with so many other recitals to give “legal certainty” or “clarity”.
Irrespective of that wider point, I trust that the Minister will at least agree with me that having regard to something is quite different from ensuring something. There is a difference between a vague notion that all is changed but nothing diminished and the certainty demanded by the children’s amendments. I ask the Minister, when he replies, to address the question of whether “having regard” is the same bar as “ensuring no diminution of standards”.
Baroness Harding of Winscombe (Con)

My Lords, as is so often the case on these issues, it is daunting to follow the noble Baroness as she has addressed the issues so comprehensively. I speak in support of Amendment 57, to which I have added my name, and register my support for my noble friend Lord Holmes’s Amendment 59A, but I will begin by talking about the Clause 14 stand part notice.

Unfortunately, I was not able to stay for the end of our previous Committee session, so I missed the last group on automated decision-making; I apologise if I cover ground that the Committee has already covered. It is important to start by saying clearly that I am in favour of automated decision-making and the benefits that it will bring to society in the round. I see from all the nodding heads that we are all in the same place—interestingly, my Whip is shaking his head. We are trying to make sure that automated decision-making is a force for good, while recognising that anything involving human beings—and even automated decision-making involves human beings, because human beings create it—has the potential for harm as well. Creating the right guard-rails is really important.

Like the noble Baroness, Lady Kidron, until I understood the Bill a bit better, I mistakenly thought that the Government’s position was not to regulate AI. But that is exactly what we are doing in the Bill, in the sense that we are loosening the regulation of automated decision-making and expanding the ability to make use of it. While that may be the right answer, I do not think we have thought about it in enough depth or scrutinised it in enough detail. There are so few of us here; I do not think we quite realise the scale of the impact of this Bill and this clause.

I too feel that the clause should be removed from the Bill—not because it might not ultimately be the right answer but because this is something that society needs to debate fully and comprehensively, rather than it sneaking into a Bill that not enough people, either in this House or the other place, have really scrutinised.

I assume I am going to lose that argument, so I will briefly talk about Amendment 57. Even if the Government remain firm that there is “nothing to see here” in Clause 14, we know that automated decision-making can do irreparable harm to children. Those of us who have worked on child internet safety—most of us for at least a decade—regret that we failed to get greater protections in earlier. We know of the harm done to children because there have not been the right guard-rails in the digital world. We must have debated together for hours and hours why the harms in the algorithms of social media were not expressly set out in the Online Safety Act. This is the same debate.

It is really clear to me that it should not be possible to amend the use of automated decision-making to in any way reduce protections for children. Those protections have been hard fought and ensure a higher bar for children’s data. This is a classic example of where the Bill reduces that, unless we are absolutely explicit. If we are unable to persuade the Government to remove Clause 14, it is essential that the Bill is explicit that the Secretary of State does not have the power to reduce data protection for children.

Lord Kamall (Con)

My Lords, I speak in favour of the clause stand part notice in my name and that of the noble Lord, Lord Clement-Jones.

Lord Harlech (Con)

The noble Lord missed the start of the debate.

Lord Kamall (Con)

I apologise and thank the noble Lord for his collegiate approach.

Baroness Jones of Whitchurch (Lab)

My Lords, I thank all noble Lords who have contributed to this debate. We have had a major common theme, which is that any powers exercised by the Secretary of State in Clause 14 should be to enhance, rather than diminish, the protections for a data subject affected by automated decision-making. We have heard some stark and painful examples of the way in which this can go wrong if it is not properly regulated. As noble Lords have said, this seems to be regulation on automated decision-making by the backdoor, but with none of the protections and promises that have been made on this subject.

Our Amendment 59 goes back to our earlier debate about rights at work when automated decision-making is solely or partly in operation. It provides an essential underpinning of the Secretary of State’s powers. The Minister has argued that ADM is a new development and that it would be wrong to be too explicit about the rules that should apply as it becomes more commonplace, but our amendment cuts through those concerns by putting key principles in the Bill. They are timeless principles that should apply regardless of advances in the adoption of these new technologies. They address the many concerns raised by workers and their representatives, about how they might be disfranchised or exploited by machines, and put human contact at the heart of any new processes being developed. I hope that the Minister sees the sense of this amendment, which will provide considerable reassurance for the many people who fear the impact of ADM in their working lives.

I draw attention to my Amendments 58 and 73, which implement the recommendations of the Delegated Powers and Regulatory Reform Committee. In the Bill, the new Articles 22A to 22D enable the Secretary of State to make further provisions about safeguards when automated decision-making is in place. The current wording of new Article 22D makes it clear that regulations can be amended

“by adding or varying safeguards”.

The Delegated Powers Committee quotes the department saying that

“it does not include a power to remove safeguards provided in new Article 22C and therefore cannot be exercised to weaken the protections”

afforded to data subjects. The committee is not convinced that the department is right about this, and we agree with its analysis. Surely “vary” means that the safeguards can move in either direction—to improve or reduce protection.

The committee also flags up concerns about the Bill’s amendments to Sections 49 and 50 of the Data Protection Act, which make specific provision about the use of automated decision-making in the context of law enforcement processing. This new clause contains equivalent wording: the regulations may add or vary safeguards. Again, we agree with the committee’s concerns about the application of these powers to the Secretary of State. It is not enough to say that these powers are subject to the affirmative procedure because, as we know and have discussed, the limits on effective scrutiny of secondary legislation are manifest.

We have therefore tabled Amendments 58 and 73, which make it much clearer that the safeguards cannot be reduced by the Secretary of State. The noble Lord, Lord Clement-Jones, has a number of amendments with a similar intent, which is to ensure that the Secretary of State can add new safeguards but not remove them. I hope the Minister is able to commit to taking on board the recommendations of the Delegated Powers Committee in this respect.

The noble Baroness, Lady Kidron, once again made the powerful point that the Secretary of State’s powers to amend the Data Protection Act should not be used to reduce the hard-won standards and protections for children’s data. As she says, safeguards do not constitute a right, and having regard to the issues is a poor substitute for putting those rights back into the Bill. So I hope the Minister is able to provide some reassurance that these hard-won rights will be put back into the Bill, where they belong.

I am sorry that the noble Lord, Lord Holmes, is not here. His amendment raises an important point about the need to build in the views of the Information Commissioner, which is a running theme throughout the Bill. He makes the point that we need to ensure, in addition, that a proper consultation of a range of stakeholders goes into the Secretary of State’s deliberations on safeguards. We agree that full consultation should be the hallmark of the powers that the Secretary of State is seeking, and I hope the Minister can commit to taking those amendments on board.

I echo the specific concerns of the noble Lord, Lord Clement-Jones, about the impact assessment and the supposed savings from changing the rules on subject access requests. This is not specifically an issue for today’s debate but, since it has been raised, I would like to know whether he is right that the savings are estimated to be 50%, and not the 1% that the Minister suggested when we last debated this. I hope the Minister can clarify this discrepancy on the record, and I look forward to his response.

The Parliamentary Under-Secretary of State, Department for Science, Innovation and Technology (Viscount Camrose) (Con)

I thank the noble Lords, Lord Clement-Jones and Lord Knight, my noble friend Lord Holmes and the noble Baronesses, Lady Jones, Lady Kidron and Lady Bennett—

Noble Lords

Lady Harding.

Viscount Camrose (Con)

I apologise to my noble friend. I cannot be having a senior moment already—we have only just started. I look forward to reading that part in Hansard.

I can reassure noble Lords that data subjects still have the right to object to solely automated decision-making. It is not an absolute right in all circumstances, but I note that it never has been. The approach taken in the Bill complements the UK’s AI regulation framework, and the Government are committed to addressing the risks that AI poses to data protection and wider society. Following the publication of the AI regulation White Paper last year, the Government started taking steps to establish a central AI risk function that brings together policymakers and AI experts with the objective of identifying, assessing and preparing for AI risks. To track identified risks, we have established an initial AI risk register, which is owned by the central AI risk function. The AI risk register lists individual risks associated with AI that could impact the UK, spanning national security, defence, the economy and society, and outlines their likelihood and impact. We have also committed to engaging on and publishing the AI risk register in spring this year.

13:30
The ICO also monitors the effects of AI on people and society using sources including its own casework, stakeholder engagement and wider intelligence gathering. The ICO is currently looking at how it might update its guidance on AI to improve its usability and is committed to incorporating any changes needed as a result of the Bill.
Lord Clement-Jones (LD)

I am processing what the Minister has just said. He said it complements the AI regulation framework, and then he went on to talk about the central risk function, the AI risk register and what the ICO is up to in terms of guidance, but I did not hear that the loosening of safeguards or rights under Clause 14 and Article 22 of the GDPR was heralded in the White Paper or the consultation. Where does that fit with the Government’s AI regulation strategy? There is a disjunct somewhere.

Viscount Camrose (Con)

I reject the characterisation of Clause 14, or any part of the Bill, as loosening the safeguards. It focuses on outcomes and, by being less prescriptive and more adaptive, aims to heighten the levels of safety of AI, whether through privacy or anything else. That is the purpose.

On Secretary of State powers in relation to ADM, the reforms will enable the Government to further describe what is and is not to be taken as a significant effect on a data subject and what is and is not to be taken as meaningful human—

Baroness Harding of Winscombe (Con)

I may be tired or just not very smart, but I am not really sure that I understand how being less prescriptive and more adaptive can heighten safeguards. Can my noble friend the Minister elaborate a little more and perhaps give us an example of how that can be the case?

Viscount Camrose (Con)

Certainly. Being prescriptive and applying one-size-fits-all measures for all processes covered by the Bill encourages organisations to follow a process, but focusing on outcomes encourages organisations to take better ownership of the outcomes and pursue the optimal privacy and safety mechanisms for those organisations. That is guidance that came out very strongly in the Data: A New Direction consultation. Indeed, in the debate on a later group we will discuss the use of senior responsible individuals rather than data protection officers, which is a good example of removing prescriptiveness to enhance adherence to the overall framework and enhance safety.

Baroness Kidron (CB)

This seems like a very good moment to ask whether, if the variation is based on outcome and necessity, the Minister agrees that the higher bar of safety for children should be specifically required as an outcome.

Viscount Camrose (Con)

I absolutely agree about the outcome of higher safety for children. We will come to debate whether the mechanism for determining or specifying that outcome is writing that down specifically, as suggested.

Baroness Kidron (CB)

I am sure the Minister knew I was going to stand up to say that, if it is not part of the regulatory instruction, it will not be part of the outcome. The point of regulation is to determine a floor—never a ceiling—below which people cannot go. Therefore, if we wish to safeguard children, we must have that floor as part of the regulatory instruction.

Viscount Camrose (Con)

Indeed. That may well be the case, but how that regulatory instruction is expressed can be done in multiple ways. Let me continue; otherwise, I will run out of time.

Baroness Jones of Whitchurch (Lab)

I am having a senior moment as well. Where are the outcomes written? What are we measuring this against? I like the idea; it sounds great—management terminology—but I presume that it is written somewhere and that we could easily add children’s rights to the outcomes as the noble Baroness suggests. Where are they listed?

Lord Harlech (Con)

My Lords, I think we should try to let the Minister make a little progress and see whether some of these questions are answered.

Lord Clement-Jones (LD)

I am sorry, but I just do not accept that intervention. This is one of the most important clauses in the whole Bill and we have to spend quite a bit of time teasing it out. The Minister has just electrified us all in what he said about the nature of this clause, what the Government are trying to achieve and how it fits within their strategy, which is even more concerning than previously. I am very sorry, but I really do not believe that this is the right point for the Whip to intervene. I have been in this House for 25 years and have never seen an intervention of that kind.

Viscount Camrose (Con)

Let me make the broad point that there is no single list of outcomes for the whole Bill but, as we go through it clause by clause, I hope that the philosophy behind it—being less prescriptive about process and more prescriptive about the results of the process that we desire—will emerge, not just on Clause 14 but as the overall philosophy underlying the Bill. Regulation-making powers can also be used to vary the existing safeguards, add additional safeguards and remove additional safeguards added at a later date.

On the point about having regard, it is important that the law is drafted in a way that allows it to adapt as technology advances. Including prescriptive requirements in the legislation reduces this flexibility and undermines the purpose of this clause and these powers, which is to provide additional legal clarity when deemed necessary and appropriate in the light of the fast-moving advances in, and adoption of, technologies relevant to automated decision-making. I would like to reassure noble Lords that the powers can be used only to vary the existing safeguards, add additional ones and remove any that have been added later by regulation. They cannot be used to remove any of the safeguards written into the legislation.

Amendments 53 to 55 and 69 to 71 concern the Secretary of State powers relating to the terms “significant decisions” and “meaningful human involvement”. These powers enable the Secretary of State to provide a description of decisions that do or do not have a significant effect on data subjects, and describe cases that can be taken to have, or not to have, meaningful human involvement. As technology adoption grows and new technologies emerge, these powers will enable the Government to provide legal clarity, if and when deemed necessary, to ensure that people are protected and have access to safeguards when they matter most. In respect of Amendment 59A, Clause 50 already provides for an overarching requirement for the Secretary of State to consult the ICO and other persons the Secretary of State considers appropriate before making regulations under the UK GDPR, including for the measures within Article 22.

Also, as has been observed—I take the point about the limitations of this, but I would like to make the point anyway—any changes to the regulations are subject to the affirmative procedure and so must be approved by both Houses. As with other provisions of the Bill, the ICO will seek to provide organisations with timely guidance and support to assist them in interpreting and applying the legislation. As such, I would ask the noble Lord, Lord Clement-Jones, and my noble friend Lord Holmes—were he here—not to press their amendments.

Amendment 57 in the name of the noble Baroness, Lady Kidron, seeks to ensure that, when exercising regulation-making powers in relation to the safeguards in Article 22 of the UK GDPR, the Secretary of State should uphold the level of protection that children are entitled to in the Data Protection Act 2018. As I have said before, Clause 50 requires the Secretary of State to consult the ICO and other persons he or she considers appropriate. The digital landscape and its technologies evolve rapidly, presenting new challenges in safeguarding children. Regular consultations with the ICO and stakeholders ensure that regulations remain relevant and responsive to emerging risks associated with solely automated decision-making. The ICO has a robust position on the protection of children, as evidenced through its guidance and, in particular, the age-appropriate design code. As such, I ask the noble Baroness not to press her amendment.

Amendments 58, 72 and 73 seek to prevent the Secretary of State varying any of the safeguards mentioned in the reformed clauses. As I assured noble Lords earlier, the powers in this provision can be used only to vary the existing safeguards, add additional safeguards and remove additional safeguards added by regulation in future; there is not a power to remove any of the safeguards.

Baroness Kidron (CB)

I apologise for breaking the Minister’s flow, especially as he had moved on a little, but I have a number of questions. Given the time, perhaps he can write to me to answer them specifically. They are all designed to show the difference between what children now have and what they will have under the Bill.

I have to put on the record that I do not accept what the Minister just said—that, without instruction, the ICO can use its old instruction to uphold the current level of safety for children—if the Government are taking the instruction out of the Bill and leaving it with the old regulator. I ask the Minister to tell the Committee whether it is envisaged that the ICO will have to rewrite the age-appropriate design code to marry it with the new Bill, rather than its being the reason why it is upheld. I do not think the Government can have it both ways, where, on the one hand, the ICO is the keeper of the children and, on the other, they take out the things in this Bill that allow the ICO to be the keeper of the children.

Viscount Camrose (Con)

I absolutely recognise the seriousness and importance of the points made by the noble Baroness. Of course, I would be happy to write to her and meet her, as I would be for any Member in the Committee, to give—I hope—more satisfactory answers on these important points.

As an initial clarification before I write, it is perhaps worth me saying that the ICO has a responsibility to keep guidance up to date but, because it is an independent regulator, it is not for the Government to prescribe this, only to give it the flexibility to do so. As I say, I will write and set out that important point in more detail.

Amendment 59 relates to workplace rights. I reiterate that the existing data protection legislation and our proposed reforms—

Baroness Jones of Whitchurch (Lab)

Has the Minister moved on from our Amendments 58 and 59? He was talking about varying safeguards. I am not quite sure where he is.

Viscount Camrose (Con)

It is entirely my fault; when I sit down and stand up again, I lose my place.

We would always take the views of the DPRRC very seriously on that. Clearly, the Bill has not been designed with the idea of losing or diminishing any of those safeguards in mind; otherwise, it would simply have said in the Bill that we could do that. I understand the concern that, by varying them, there is a risk that they would be diminished. We will continue to find a way to take into account the concerns that the noble Baroness has set out, along with those of the DPRRC. In the interim, let me perhaps provide some reassurance that that is, of course, not the intention.

13:45
Amendment 59 relates to workplace rights. I reiterate that the existing data protection legislation and our proposed reforms provide sufficient safeguards for automated decision-making where personal data is being processed, including in workplaces. Workers are already protected by a wide range of regulatory frameworks related to AI, which include employment law, as well as data protection law, human rights law, equality law and potentially also legal frameworks relating to health and safety. Where AI might engage with workers’ rights in the workplace, the UK has a strong system of legislation, with these protections enforced through specialist employment tribunals. In addition, the existing transparency provisions in the wider data protection framework continue to apply. These require organisations to inform individuals about the existence of solely automated decision-making, including profiling, as well as to provide meaningful information about the logic involved and the significance and envisaged consequences of such automated processing for the individual. As such, we do not believe these amendments are necessary, and I ask the noble Baroness and the noble Lord not to press them.
The Government take the view that these reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles. In doing so, they will provide confidence to organisations looking to use these technologies in a responsible way, while driving economic growth and innovation. The Government want to provide the necessary future-proofing measures in an evolving technology landscape. This is why the Bill has been carefully designed to provide a future-proofed and flexible data protection regime for the UK. I therefore ask noble Lords not to oppose the question that Clause 14 stand part of the Bill.
Lord Clement-Jones (LD)

My Lords, I feel less reassured after this debate than I did even at the end of our two groups on Monday. I thank all those who spoke in this debate. There is quite a large number of amendments in this group, but a lot of them go in the same direction. I was very taken by what the noble Baroness, Lady Kidron, said: if the Government are offering safeguards and not rights, that is really extremely worrying. I also very much take on board what the noble Baroness, Lady Harding, had to say. Yes, of course we are in favour of automated decision-making, as it will make a big difference to our public services and quite a lot of private businesses, but we have to create the right ground rules around it. That is what we are talking about. We all very much share the concern that children should have a higher bar of protection. The noble Baroness, Lady Jones, outlined exactly why the Secretary of State’s powers either should not be there or should not be expressed in the way that they are. I very much hope that the Minister will write on that subject.

More broadly, there are huge issues here. I think that it was the noble Baroness, Lady Kidron, who first raised the fact that the Government seem to be regulating in a specific area relating to AI that is reducing rights. The Minister talks about now regulating outcomes, not process. As the noble Baroness, Lady Jones, said, we do not have any criteria—what KPIs are involved? The process is important—the ethics by which decisions are made and the transparency involved. I cannot see that it is simply about whether the outcome is such and such; it is about the way in which people make decisions. I know that people like talking about outcome-based regulation, but it is certainly not the only important aspect of regulation.

On the issue of removing prescriptiveness, I am in favour of ethical prescriptiveness, so I cannot see that the Minister has made a particularly good case for the changes made under Clause 14. He talked about having access to safeguards when they matter most. It would be far preferable to have rights that can be exercised in the face of automated decision-making, in particular workplace protection. At various points during the debates on the Bill we have touched on things such as algorithmic impact assessment in the workplace and no doubt we will touch on it further. That is of great and growing importance, but again there is no recognition of that.

I am afraid that the Minister has not made a fantastic case for keeping Clause 14 and I think that most of us will want to kick the tyres and carry on interrogating whether it should be part of the Bill. In the meantime, I beg leave to withdraw Amendment 53.

Amendment 53 withdrawn.
Amendments 54 to 60 not moved.
Amendment 61
Moved by
61: Clause 14, page 28, line 17, leave out “using sensitive personal data” and insert “based on sensitive processing”
Member’s explanatory statement
This amendment of a heading is consequential on the amendment in my name to clause 14, page 28, line 19.
Amendment 61 agreed.
Amendment 62 not moved.
Amendment 63
Moved by
63: Clause 14, page 28, line 19, leave out “sensitive personal data” and insert “sensitive processing (as defined in section 35(8))”
Member’s explanatory statement
This technical amendment adjusts the wording of new section 50B(1) of the Data Protection Act 2018 to refer to “sensitive processing”, rather than “sensitive personal data”, to reflect the terms of section 35(8) of that Act.
Amendment 63 agreed.
Amendments 64 to 73 not moved.
Clause 14, as amended, agreed.
Amendment 74
Moved by
74: After Clause 14, insert the following new Clause—
“Use of the Algorithmic Transparency Recording Standard
(1) The Secretary of State must by regulations make provision requiring Government departments, public authorities and all persons exercising a public function using algorithmic tools to process personal data to use the Algorithmic Transparency Recording Standard (“the Standard”).
(2) The Standard is that published by the Central Digital and Data Office and Centre for Data Ethics and Innovation as part of the Government’s National Data Strategy.
(3) Regulations under subsection (1) must require the submission and publication of algorithmic transparency reports as required by the Standard.
(4) Regulations under subsection (1) may provide for exemptions to the requirement for publication where necessary—
(a) to avoid obstructing an official or legal inquiry, investigation or procedure,
(b) to avoid prejudicing the prevention, detection, investigation or prosecution of criminal offences or the execution of criminal penalties,
(c) to protect public security, or
(d) to safeguard national security.
(5) Regulations under subsection (1) are subject to the affirmative resolution procedure.”
Member’s explanatory statement
This new Clause puts a legislative obligation on public bodies using algorithmic tools that have a significant influence on a decision-making process with direct or indirect public effect, or directly interact with the general public, to publish reports under the Algorithmic Transparency Recording Standard.
Lord Clement-Jones (LD)

My Lords, the Central Digital and Data Office, or CDDO, and the Centre for Data Ethics and Innovation, as it was then called—it now has a new name as a unit of DSIT—launched the Algorithmic Transparency Recording Standard in November 2021. The idea for the ATRS arose from a recommendation by the CDEI that the UK Government should place a mandatory transparency obligation on public sector organisations using algorithms to support “significant decisions affecting individuals”. It is intended to help public sector organisations to provide clear information about the algorithmic tools that they use, how they operate and why they are using them.

The ATRS is a promising initiative that could go some way to addressing the current transparency deficit around the use of algorithmic and AI tools by public authorities. Organisations are encouraged to submit reports about each algorithmic tool that they are using that falls within the scope of the standard.

We welcome the recent commitments made in the Government’s response to the AI regulation White Paper consultation to make the ATRS a requirement for all government departments. However, we believe that this is an opportunity to deliver on this commitment through the DPDI Bill, by placing it on a statutory footing rather than it being limited to a requirement in guidance. That is what Amendment 74 is designed to do.

We also propose another new clause that would reflect the Government’s commitment to algorithmic transparency. It would require the Secretary of State to introduce a compulsory transparency reporting requirement, but only when she or he considers it appropriate to do so. It is a slight watering-down of Amendment 74, but it is designed to tempt the Minister into further indiscretions. In support of transparency, the new clause would, for as long as the Secretary of State considers it inappropriate to make the ATRS compulsory, also require the Secretary of State regularly to explain why and to keep that decision under continual review.

Amendment 76, on safe and responsible automated decision systems, proposes a new clause that seeks to shift the burden back on to public sector actors. It puts the onus on them to ensure safety and prevent harm, rather than waiting for harm to occur and putting the burden on individuals to challenge it. It imposes a proactive statutory duty, similar to the public sector equality duty under Section 149 of the Equality Act 2010, to have “due regard” to ensuring that

“automated decision systems … are responsible and minimise harm to individuals and society at large”.

The duty incorporates the key principles in the Government’s AI White Paper and therefore is consistent with its substantive approach. It also includes duties to be proportionate, to give effect to individuals’ human rights and freedoms and to safeguard democracy and the rule of law. It applies to all “automated decision systems”. These are

“any tool, model, software, system, process, function, program, method and/or formula designed with or using computation to automate, analyse, aid, augment, and/or replace human decisions that impact the welfare, rights and freedoms of individuals”.

This therefore applies to partly automated decisions, as well as those that are entirely automated, and systems in which multiple automated decision processes take place.

It applies to traditional public sector actors: public authorities, or those exercising public functions, including private actors outsourced by the Government to do so; those that may exercise control over automated decision systems, including regulators; as well as those using data collected or held by a public authority, which may be public or private actors. It then provides one mandatory mechanism through which compliance with the duty must be achieved—impact assessments. We had a small debate about the ATRS and whether a compliance system was in place. It would be useful to see whether the Minister has any further comment on that, but I think that he disagreed with my characterisation that there is no compliance system currently.

This provision proposes impact assessments. The term used, “algorithmic impact assessment”, is adopted from Canada’s analogous directive on automated decision-making, which mandates the use of AIAs for all public sector automated decision systems. The obligation is on the Secretary of State, via regulations, to set out a framework for AIAs, which would help actors to uphold their duty to ensure that automated decision systems are responsible and safe; to understand and to reduce the risks in a proactive and ongoing way; to introduce the appropriate governance, oversight, reporting and auditing requirements; and to communicate in a transparent and accessible way to affected individuals and the wider public.

Amendment 252 would require a list of UK addresses to be made freely available for reuse. Addresses have been identified as a fundamental geospatial dataset by the UN and a high-value dataset by the EU. Address data is used by tens of thousands of UK businesses, including for delivery services and navigation software. Crucially, address data can join together different property-related data, such as energy performance certificates or Land Registry records, without using personal information. This increases the value of other high-value public data.

14:00
Addresses are no longer just about sending letters; they are crucial spatial data infrastructure used by tens of thousands of UK businesses. Reliable address data is important for navigation software, such as TomTom or Waze, and successful service provision such as Amazon deliveries. Control of the UK’s addresses was sold to Royal Mail when it was privatised in 2013. There is now a complex system for generating and managing UK addresses, where most of the work is done by local authorities but most of the benefits flow to Royal Mail.
If you are in the public sector, you can access address data for free because the Government pay Royal Mail and Ordnance Survey millions of pounds per year for public sector access. If you are a business, however, you have to pay for access, sign licensing agreements and potentially hire lawyers. This is especially burdensome for start-ups and SMEs. The previous Government created the Geospatial Commission to unlock the value of geospatial data, but it has not yet evaluated the arrangements with Royal Mail. The process of creating and managing addresses is slow and complex, with multiple bodies involved.
This all causes a number of problems: it holds back growth and innovation, it holds back emerging technology, it hinders public service delivery, it causes problems for citizens and, increasingly, it makes the UK an outlier among high-income countries. As a result, our address data is expensive, hard to access and unreliable. The Government need to act now to provide a dataset of all UK addresses that is accurate and can be freely used by anyone offering services to UK citizens. This could be implemented by establishing a new body to manage address data or by mandating that Royal Mail makes the data openly available.
Unlike opening up more sensitive datasets, such as personal location, releasing address data—a list of the physical places recognised by the Government—carries few new legal or ethical risks. Many other countries are doing this, including those with a strong privacy regime. The harms created by the lack of access to address data are more pressing. I offer the Netherlands as a good example of somewhere that has already been through this process.
I am grateful for the support of the noble Baroness, Lady Bennett, for this particular amendment, alongside another noble Lord who no doubt will reveal themselves when I finally find my way through this list of amendments. In the meantime, I beg to move.
Baroness Kidron (CB)

My Lords, I speak to Amendment 144 in my name, which is supported by the noble Baronesses, Lady Harding and Lady Jones, and the noble Lord, Lord Clement-Jones. The amendment would introduce a code of practice on children and AI. Before I speak to it, I declare an interest: I am working with academic NGO colleagues in the UK, EU and US on such a code, and I am part of the UN Secretary-General’s AI advisory body’s expert group, which is currently working on sections on both AI and children and AI and education.

AI drives the recommender systems that determine all aspects of a child’s digital experience, including the videos they watch, their learning opportunities, people they follow and products they buy. But it no longer concerns simply the elective parts of life where, arguably, a child—or a parent on their behalf—can choose to avoid certain products and services. AI is invisibly and ubiquitously present in all areas of their lives, and its advances and impact are particularly evident in the education and health sectors—the first of which is compulsory and the second of which is necessary.

The proposed code has three parts. The first requires the ICO to create the code and sets out expectations of its scope. The second considers who and what should be consulted and considered, including experts, children and the frameworks that codify children’s existing rights. The third defines elements of the process, including risk assessment, defines language and puts the principles to which the code must adhere in the Bill.

I am going to get my defence in early. I anticipate that the Minister will say that the ICO has published guidance, that we do not want to exclude children from the benefits of AI and that we are in a time of “wait and see”. He might even ask why children need something different or why the AADC, which I mention so frequently, is not sufficient. Let me take each of those in turn.

On the sufficiency of the current guidance, the ICO’s non-binding Guidance on AI and Data Protection, which was last updated on 15 March 2023, has a single mention of a child in its 140 pages, in a case study about child benefits. The accompanying AI and data protection toolkit makes no mention of children, nor does the ICO’s advice to developers on generative AI, issued on 3 April 2023. There are hundreds of pages of guidance, but it fails entirely to consider the specific needs of children, their rights, their developmental vulnerabilities or the fact that their lives will be entirely dominated by AI systems in a way that is still unimaginable to those in this Room. Similarly, there is little mention of children in the Government’s own White Paper on AI. The only such references are limited to AI-generated child sexual abuse material; we will come to that later when we discuss Amendment 291. Even the AI summit had no main-stage event relating to children.

Of course we do not want to exclude children from the benefits of AI. A code on the use of children’s data in the development and deployment of AI technology increases their prospects of enjoying the benefits of AI while ensuring that they are protected from the pitfalls. Last week’s debate in the name of the noble Lord, Lord Holmes, showed the broad welcome of the benefits while urgently speaking to the need for certain principles and fundamental protections to be mandatory.

As for saying, “We are in a time of ‘wait and see’”, that is not good enough. In the course of this Committee, we will explore edtech that has only advertising and no learning content, children being left out of classrooms because their parents will not accept the data leaks of Google Classroom, social media being scraped to create AI-generated CSAM and how rapid advances in generative AI capabilities mark a new stage in its evolution. Some of the consequences of that include ready access to models that create illegal and abusive material at scale and chatbots that offer illegal or dangerous advice. Long before we get on to the existential threat, we have “here and now” issues. Childhood is a very short period of life. The impacts of AI are here and now in our homes, our classrooms, our universities and our hospitals. We cannot afford to wait and see.

Children are different for three reasons. First, as has been established over decades, there are ages and stages at which children are developmentally able to do certain things, such as walk, talk, understand risk and irony, and learn different social skills. This means that, equally, there are ages and stages at which they cannot do that. The long-established consensus is that family, social groups and society more broadly—including government—step in to support that journey.

Secondly, children have less voice and less choice about how and where they spend their time, so the places and spaces that they inhabit have to be fit for childhood.

Thirdly, we have a responsibility towards children that extends even beyond our responsibilities to each other; this means that it is not okay for us to legitimise profit at their expense, whether it is allowing an unregulated edtech market that exploits their data and teaches them nothing or the untrammelled use of their pictures to create child sexual abuse material.

Finally, what about the AADC? I hope that, in the course of our deliberations, we will put that on a more secure footing. The AADC addresses recommender systems in standard 12. However, the code published in August 2020 does not address generative AI which, as we have repeatedly heard, is a game-changer. Moreover, the AADC is currently restricted to information society services, which leaves a gaping hole. This amendment would address this gap.

There is an argument that the proposed code could be combined with the AADC as an update to its provisions. However, unless and until we sort out the status of the AADC in relation to the Bill, an AI kids code would be better formed as a stand-alone code. A UK code of practice on children and AI would ensure that data processors consider the fundamental rights and freedoms of children, including their safety, as they develop their products and perhaps even give innovators the appetite to innovate with children in mind.

As I pointed out at the beginning, there are many people globally working on this agenda. I hope that, as we are the birthplace of the AADC and the Online Safety Act, the Government will adopt this suggestion and again be a forerunner in child privacy and safety. If, however, the Minister once again says that protections for children are not necessary, let me assure him that they will be put in place by others, and we will be a rule taker, not a rule maker.

Baroness Bennett of Manor Castle Portrait Baroness Bennett of Manor Castle (GP)
- Hansard - - - Excerpts

My Lords, I rise with the advantage over the noble Lord, Lord Clement-Jones, in that I will speak to only one amendment in this group; I therefore have the right page in front of me and can note that I will speak to Amendment 252, tabled by the noble Lord, Lord Clement-Jones, and signed by me and the noble Lords, Lord Watson of Wyre Forest and Lord Maude of Horsham.

I apologise that I was not with the Committee earlier today, but I was chairing a meeting about the microbiome, which was curiously related to this Committee. One issue that came up in that meeting was data and data management and the great uncertainties that remain. For example, if a part of your microbiome is sampled and the data is put into a database, who owns that data about your microbiome? In fact, there is no legal framework at the moment to cover this. There is a legal framework about your genome, but not your microbiome. That is a useful illustration of how fast this whole area is moving and how fast technology, science and society are changing. I will say that I do not blame the Government for this gaping hole, as it is an international one. It is a demonstration of how we need to race to catch up as legislators and regulators to deal with the problem.

This relates to Amendment 252 in the sense that perhaps this is an issue that has arisen over time, kind of accidentally. However, I want to credit a number of campaigners, among them James O’Malley, who was the man who drew my attention to this issue, as well as Peter Wells, Anna Powell-Smith and Hadley Beeman. They are people who have seen a really simple and basic problem in the way that regulation is working and are reaching out, including, I am sure, to many noble Lords in this Committee. This is a great demonstration of how campaigning has at least gone part of the way to working. I very much hope that, if not today, then some time soon, we can see this working.

What we are talking about here, as the noble Lord, Lord Clement-Jones, said, is the postal address file. It is held as a piece of private property by Royal Mail. It is important to stress that this is not people’s private information or who lives at what address; it is about where the address is. As the noble Lord, Lord Clement-Jones, set out, all kinds of companies have to pay Royal Mail to have access to this basic information about society, basic information that is assembled by society, for society.

The noble Lord mentioned Amazon having to pay for the file. I must admit that I feel absolutely no sympathy there. I am no fan of the great parasite. It is an interesting contrast to think of Amazon paying, but also to think of an innovative new start-up company, which wants to be able to access and reach people to deliver things to their homes. For this company, the cost of acquiring this file could be prohibitive. It could stop it getting started and competing against Amazon.

14:15
I am very interested in citizen science. We are seeing huge developments in citizen science in areas in which I often engage, such as birds, ecosystems et cetera. There is a chance for citizens who have access to data to use it in all kinds of interesting ways to help us all collectively as a society understand the social world in which we live and what is happening with it. It is crucial for people to be able to access this data, and they should not have to pay to do so. It is a chance for—this is their self-description so I will use the term—civic-minded nerds to do all sorts of interesting things with the data.
It is worth stressing that this amendment does not say how the Government should do this; it simply sets out the principle that the Government should do it. It is not open to the argument that the amendment should be drafted differently or approached in another way; it simply says that we should free the postal address file. I very much hope to hear positive words and a positive direction from the Minister on Amendment 252.
Lord Kamall Portrait Lord Kamall (Con)
- Hansard - - - Excerpts

My Lords, I apologise for not being here on Monday, when I wanted to speak about automated decision-making. I was not sure which group to speak on today; I am thankful that my noble friend Lord Harlech intervened to ensure that I spoke on this group and made my choice much easier.

I want to speak on Amendments 74 to 77 because transparency is essential. However, one of the challenges about transparency is to ensure that you understand what you are reading. I will give noble Lords a quick example: when I was in the Department of Health and Social Care, we had a scheme called the voluntary pricing mechanism for medicines. Companies would ask whether that could be changed and whether there could be a different relationship, because they felt that they were not getting enough value from it. I said to the responsible person in the department, “I did engineering and maths, so can you send me a copy of the algorithm?” He sent it to me, and it was 100 pages long. I said, “Does anyone understand this algorithm?”, and he said, “Oh yes, the analysts do”. I was about to get a meeting, but then I was moved to another department. That shows that, even if we ask for transparency, we have to make sure that we understand what we are being given. As the noble Lord, Lord Clement-Jones, has worded this, we have to make sure that we understand the functionality and what it does at a high enough level.

My noble friend Lady Harding often illustrates her points well with short stories. I am going to do that briefly with two very short stories. I promise to keep well within the time limit.

A few years ago, I was on my way to fly to Strasbourg because I was a Member of the European Parliament. My train got stuck, and I missed my flight. My staff booked me a new ticket and sent me the boarding pass. I got to the airport, which was fantastic, got through the gate and was waiting for my flight in the waiting area. They called to start boarding and, when I went to board, they scanned my pass again and I was denied boarding. I asked why I was denied, having been let into the gate area in the first place, but no one could explain why. To cut a long story short, over two hours, four or five people from that company gaslighted me. Eventually, when I got back to the check-in desk—which the technology was supposed to avoid in the first place—it was explained that they had sent me an email the day before. In fact, they had not sent me an email the day before, which they admitted the day after, but no one ever explained why I was not allowed on that flight.

Imagine that in the public sector. I can accept it, although it was awful behaviour by that company, but imagine that happening for a critical operation that had been automated to cut down on paperwork. Imagine turning up for your operation when you are supposed to scan your barcode to be let into the operating theatre. What happens if there is no accountability or transparency in that case? This is why the amendments tabled by the noble Lord, Lord Clement-Jones, are essential.

Here is another quick story. A few years ago, someone asked me whether I was going to apply for one of these new fintech banks. I submitted the application and the bank said that it would get back to me within 48 hours. It did not. Two weeks later, I got a message on the app saying that I had been rejected, that I would not be given an account and that “by law, we do not have to explain why”.

Can you imagine that same technology being used in the public sector, with a WYSIWYG on the fantastic NHS app that we have now? Imagine booking an appointment then suddenly getting a message back saying, “Your appointment has been denied but we do not have to explain why”. These Amendments 74 to 78 must be given due consideration by the Government because it is absolutely essential that citizens have full transparency on decisions made through automated decision-making. We should not allow the sort of technology that was used by easyJet and Monzo in this case to permeate the public sector. We need more transparency—it is absolutely essential—which is why I support the amendments in the name of the noble Lord, Lord Clement-Jones.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

My Lords, I associate myself with the comments that my noble friend Lord Kamall just made. I have nothing to add on those amendments, as he eloquently set out why they are so important.

In the spirit of transparency, my intervention enables me to point out, were there any doubt, who I am as opposed to the noble Baroness, Lady Bennett, who was not here earlier but who I was mistaken for. Obviously, we are not graced with the presence of my noble friend Lord Maude, but I am sure that we all know what he looks like as well.

I will speak to two amendments. The first is Amendment 144, to which I have added my name. As usual, the noble Baroness, Lady Kidron, has said almost everything that can be said on this but I want to amplify two things. I have yet to meet a politician who does not get excited about the two-letter acronym that is AI. The favoured statement is that it is as big a change in the world as the discovery of electricity or the invention of the wheel. If it is that big—pretty much everyone in the world who has looked at it probably thinks it is—we need properly to think about the pluses and the minuses of the applications of AI for children.

The noble Baroness, Lady Kidron, set out really clearly why children are different. I do not want to repeat that, but children are different and need different protections; this has been established in the physical world for a very long time. With this new technology that is so much bigger than the advent of electricity and the creation of the first automated factories, it is self-evident that we need to set out how to protect children in that world. The question then is: do we need a separate code of practice on children and AI? Or, as the noble Baroness set out, is this an opportunity for my noble friend the Minister to confirm that we should write into this Bill, with clarity, an updated age-appropriate design code that recognises the existence of AI and all that it could bring? I am indifferent on those two options but I feel strongly that, as we have now said on multiple groups, we cannot just rely on the wording in a previous Act, which this Bill aims to update, without recognising that, at the same time, we need to update what an age-appropriate design code looks like in the age of AI.

The second amendment that I speak to is Amendment 252, on the open address file. I will not bore noble Lords with my endless stories about the use of the address file during Covid, but I lived through and experienced the challenges of this. I highlight an important phrase in the amendment. Proposed new subsection (1) says:

“The Secretary of State must regularly publish a list of UK addresses as open data to an approved data standard”.

One reason why it is a problem for this address data to be held by an independent private company is that the quality of the data is not good enough. That is a real problem if you are trying to deliver a national service, whether in the public sector or the private sector. If the data quality is not good enough, it leaves us substantially poorer as a country. This is a fundamental asset for the country and a fundamental building block of our geolocation data, as the noble Lord, Lord Clement-Jones, set out. Anybody who has tried to build a service that delivers things to human beings in the physical world knows that errors in the database can cause huge problems. It might not feel like a huge problem if it concerns your latest Amazon delivery but, if it concerns the urgent dispatch of an ambulance, it is life and death. Maintaining the accuracy of the data and holding it close as a national asset is therefore hugely important, which is why I lend my support to this amendment.

Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

My Lords, the noble Lord, Lord Clement-Jones, has, as ever, ably introduced his Amendments 74, 75, 76, 77 and 78, to the first of which the Labour Benches have added our name. We broadly support all the amendments, but in particular Amendment 74. We also support Amendment 144, which was tabled by the noble Baroness, Lady Kidron, and co-signed by the noble Baroness, Lady Harding, the noble Lord, Lord Clement-Jones, and my noble friend Lady Jones.

Amendments 74 to 78 cover the use of the Government’s Algorithmic Transparency Recording Standard—ATRS. We heard a fair bit about this in Committee on Monday, when the Minister prayed it in aid during debates on Clause 14 and Article 22A. The noble Lord, Lord Clement-Jones, outlined its valuable work, which I think everyone in the Committee wants to encourage and see writ large. These amendments seek to aid the transparency that the Minister referred to by publishing reports by public bodies using algorithmic tools where they have a significant influence on the decision-making process. The amendments also seek to oblige the Secretary of State to ensure that public bodies, government departments and contractors using public data have a compulsory transparency reporting scheme in place. The amendments legislate to create impact assessments and root ADM processes in public service that minimise harm and are fair and non-discriminatory in their effect.

The noble Lord, Lord Kamall, made some valuable points about the importance of transparency. His two stories were very telling. It is only right that we have that transparency for the public service and in privately provided services. I think the Minister would be well advised to listen to him.

The noble Lord, Lord Clement-Jones, also alighted on the need for government departments to publish reports under the ATRS in line with their position as set out in the AI regulation White Paper consultation process and response. This would put it on a legislative basis, and I think that is fairly argued. The amendments would in effect create a statutory framework for transparency in the public service use of algorithmic tools.

We see these amendments as forming part of the architecture needed to begin building a place of trust around the increased use of ADM and the introduction of AI into public services. Like the Government and everyone in this Committee, we see all the advantages, but take the view that we need to take the public with us on this journey. If we do not do that, we act at our peril. Transparency, openness and accountability are key to securing trust in what will be something of a revolution in how public services are delivered and procured in the future.

We also support Amendment 144 in the name of the noble Baroness, Lady Kidron, for the very simple reason that, in the development of AI technology, we should hardwire higher standards into practice and procedure wherever the technology affects the interests of children, and those higher standards should apply. This has been a constant theme in our Committee deliberations and our approach to child protection. In her earlier speech, the noble Baroness, Lady Harding, passionately argued for the need to get this right. We have been wanting over the past decade in that regard, and now is the moment to put that right and begin to move on this policy area.

The noble Baroness, Lady Kidron, has made the argument for higher standards of protection for children persuasively during all our deliberations, and a code of practice makes good sense. As the noble Baroness, Lady Harding, said, it can either be stand-alone or integrated. In the end, it matters little, but having it there setting the standard is critical to getting this policy area in the right place. The amendment sets out with admirable clarity the detail that the commissioner must cover, so that data processors always prioritise children’s interests and fundamental rights in their thinking. I am sure that is something that is broadly supported by the whole Committee.

14:30
We have an open mind on Amendment 252 because there is a balance to be struck between privacy issues and the need to ensure that service delivery and commercial activity operate on a level playing field. I listened to the passionate argument made by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Harding, also made some powerful points on this, but I would like to hear what the Minister has to say because we need to get this right as well. We cannot have a situation where one part of the public service is holding up public service delivery, or getting it wrong, along with the operation of physical delivery services to our homes and households. With that, I am content to let the Minister have his say. I hope he gets all our names right.
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I feel under amazing pressure to get the names right, especially given the number of hours we spend together.

I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for tabling Amendments 74 to 78, 144 and 252 in this group. I also extend my thanks to noble Lords who have signed the amendments and spoken so eloquently in this debate.

Amendments 74 to 78 would place a legislative obligation on public authorities and all persons in the exercise of a public function to publish reports under the Algorithmic Transparency Recording Standard—ATRS—or to publish algorithmic impact assessments. These would provide information on algorithmic tools and algorithm-assisted decisions that process personal data in the exercise of a public function or those that have a direct or indirect public effect or directly interact with the general public. I remind noble Lords that the UK’s data protection laws will continue to apply throughout the processing of personal data.

The Government are already taking action to establish the necessary guard-rails for AI, including to promote transparency. In the AI regulation White Paper response, we announced that the use of the ATRS will now become a requirement for all government departments and the broader public sector. The Government are phasing this in as we speak and will check compliance accordingly, as DSIT has been in contact with every department on this issue.

In making this policy, the Government are taking an approach that provides increasing degrees of mandation of the ATRS, with appropriate exemptions, allowing them to monitor compliance and effectiveness. The announcement in the White Paper response has already led to more engagement from across government, and more records are under way. The existing process focuses on the importance of continuous improvement and development. Enshrining the standard into law prematurely, amid exponential technological change, could hinder its adaptability.

More broadly, our AI White Paper outlined a proportionate and adaptable framework for regulating AI. As part of that, we expect AI development and use to be fair, transparent and secure. We set out five key principles for UK regulators to interpret and apply within their remits. This approach reflects the fact that AI systems are not unregulated and need to be compliant with existing regulatory frameworks, including employment, human rights, health and safety and data protection law.

For instance, the UK’s data protection legislation imposes obligations on data controllers, including providers and users of AI systems, to process personal data fairly, lawfully and transparently. Our reforms in this Bill will ensure that, where solely automated decision-making is undertaken—that is, ADM without any meaningful human involvement that has significant effects on data subjects—data subjects will have a right to the relevant safeguards. These safeguards include being provided with information on the ADM that has been carried out and the right to contest those decisions and seek human review, enabling controllers to take suitable measures to correct those that have produced wrongful outcomes.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I wonder whether the Minister can comment on this; he can write if he needs to. Is he saying that, in effect, the ATRS is giving the citizen greater rights than are ordinarily available under Article 22? Is that the actual outcome? If, for instance, every government department adopted ATRS, would that, in practice, give citizens a greater degree of what he might put as safeguards but, in this context, he is describing as rights?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I am very happy to write to the noble Lord, but I do not believe that the existence of an ATRS-generated report in and of itself confers more rights on anybody. Rather, it makes it easier for citizens to understand how their rights are being used, what rights they have, or what data about them is being used by the department concerned. The existence of data does not in and of itself confer new rights on anybody.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I understand that, but if he rewinds the reel he will find that he was talking about the citizen’s right of access, or something of that sort, at that point. Once you know what data is being used, the citizen has certain rights. I do not know whether that follows from the ATRS or he was just describing that at large.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

As I said, I will write. I do not believe that follows axiomatically from the ATRS’s existence.

On Amendment 144, the Government are sympathetic to the idea that the ICO should respond to new and emerging technologies, including the use of children’s data in the development of AI. I assure noble Lords that this area will continue to be a focus of the ICO’s work and that it already has extensive powers to provide additional guidance or make updates to the age-appropriate design code, to ensure that it reflects new developments, and a responsibility to keep it up to date. The ICO has a public task under Article 57(1)(b) of the UK GDPR to

“promote public awareness and understanding of the risks, rules, safeguards and rights in relation to processing”.

It is already explicit that:

“Activities addressed specifically to children shall receive specific attention”.

That code already includes a chapter on profiling and provides guidance on fairness and transparency requirements around automated decision-making.

Taking the specific point made by the noble Baroness, Lady Kidron, on the contents of the ICO’s guidance, while I cannot speak to the ICO’s decisions about the drafting of its guidance, I am content to undertake to speak to it about this issue. I note that it is important to be careful to avoid a requirement for the ICO to duplicate work. The creation of an additional children’s code focused on AI could risk fragmenting approaches to children’s protections in the existing AADC—a point made by the noble Baroness and by my noble friend Lady Harding.

Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

I have a question on this. If the Minister is arguing that this should be by way of amendment of the age-related code, would there not be an argument for giving that code some statutory effect?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I believe that the AADC already has statutory standing.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

On that point, I think that the Minister said—forgive me if I am misquoting him—risk, rules and rights, or some list to that effect. While the intention of what he said was that we have to be careful where children are using it, and the ICO has to make them aware of the risks, the purpose of a code—whether it is part of the AADC or stand-alone—is to put those responsibilities on the designers of services, products and so on by default. It is upstream where we need the action, not downstream, where the children are.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Yes, I entirely agree with that, but I add that we need it upstream and downstream.

For the reasons I have set out, the Government do not believe that it would be appropriate to add these provisions to the Bill at this time without further detailed consultation with the ICO and the other organisations involved in regulating AI in the United Kingdom. Clause 33—

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

Can we agree that there will be some discussions with the ICO between now and Report? If those take place, I will not bring this point back on Report unnecessarily.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Yes, I am happy to commit to that. As I said, we look forward to talking with the noble Baroness and others who take an interest in this important area.

Clause 33 already includes a measure that would allow the Secretary of State to request the ICO to publish a code on any matter that she sees fit, so this is an issue that we could return to in the future, if the evidence supports it, but, as I said, we consider the amendments unnecessary at this time.

Finally, Amendment 252 would place a legislative obligation on the Secretary of State regularly to publish address data maintained by local authorities under open terms—that is, accessible by anyone for any purpose and for free. High-quality, authoritative address data for the UK is currently used by more than 50,000 public and private sector organisations, which demonstrates that current licensing arrangements are not prohibitive. This data is already accessible for a reasonable fee from local authorities and Royal Mail, with prices starting at 1.68p per address or £95 for national coverage.

Baroness Bennett of Manor Castle Portrait Baroness Bennett of Manor Castle (GP)
- Hansard - - - Excerpts

Some 50,000 organisations access that information, but do the Government have any data on it? I am not asking for it now, but maybe the Minister could go away and have a look at this. We have heard that other countries have opened up this data. Are they seeing an increase? That figure is just a number; it does not tell us how many people are denied access to the data.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

We have some numbers that I will come to, but I am very happy to share deeper analysis of that with all noble Lords.

There is also free access to this data for developers to innovate in the market. The Government also make this data available for free at the point of use to more than 6,000 public sector organisations, as well as making postcode, unique identifier and location data available under open terms. The Government explored opening address data in 2016. At that time, it became clear that the Government would have to pay to make this data available openly or to recreate it. That was previously attempted, and the resulting dataset had, I am afraid, critical quality issues. As such, it was determined at that time that the changes would result in significant additional cost to taxpayers and represent low value for money, given the current widespread accessibility of the data. For the reasons I have set out, I hope that noble Lords will withdraw their amendments.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I thank the Minister for his response. There are a number of different elements to this group.

The one bright spot in the White Paper consultation is the ATRS. That was what the initial amendments in this group were designed to give a fair wind to. As the noble Lord, Lord Bassam, said, this is designed to assist in the adoption of the ATRS, and I am grateful for his support on that.

14:45
I was intrigued by the way the Minister described the adoption of the ATRS in government. He said that it will be phased in with increasing degrees of mandation, which is interesting because it sounds like gradually turning the screw in some way. I do not know who will turn the screw. It would be awfully interesting to know who will be responsible for that screw, because my contention throughout has been that there is no clear compliance mechanism—if indeed there is one at all—for the ATRS, however desirable it may be to see it put into effect. That is why this kind of legislation is so important. Like the noble Lord, Lord Bassam, I agree entirely with what the noble Lord, Lord Kamall, said. It is not good enough just to say it; we need to have the standard and make sure that it is understandable to the ordinary citizen. If you read GitHub or something for pleasure, you will not find it very transparent. Something much simpler by way of explanation is needed, and the private sector needs this as well as the public sector.
That brings me to the difference in philosophy that we have here. We keep butting up against the philosophy set out in the White Paper and its consultation: “We do not need this, not now”. It is the St Augustine approach to regulation: “We’ll have regulation but not now. It’s too early. We don’t know what the risks are”. We know what the risks are in many ways, and nowhere is that illustrated more clearly than in the field of children.
I very much hope that the Minister will come to my book launch. I have spent some time—there are 200 pages there—saying why we need regulation. Perhaps, after he has read it—over the Easter break, possibly—the Minister will change his views on this issue. There is hope yet.
I was very taken by what the noble Baroness, Lady Harding, said. AI is as big as the wheel or electricity. It will have the same impact; if anything, it will have a bigger impact. This means that we cannot rely on the AADC. We are in new territory here. The noble Baronesses, Lady Kidron and Lady Harding, absolutely made the case for revising. Again, I took some comfort from what the Minister had to say about being sympathetic to the idea of an upgrade to the AADC to take account of the circumstances of generative AI more than anything else. That could be a real game-changer for child protection if it were done. I very much hope that, in the way mentioned by the noble Baroness, Lady Kidron, the Minister will come back later on in the course of this Bill and give us some greater comfort because we have a number of amendments coming down the track that will raise these issues again and again.
I come back to the open address issue. I am grateful for what the noble Baronesses, Lady Bennett and Lady Harding, had to say on this subject. The noble Baroness, Lady Bennett, was absolutely right to credit the PAF campaigners for their work on this matter. There are many different ways in which the Government can do this—as the noble Baroness said, we are not prescribing exactly how they should go about it—so it was rather disappointing to hear what the Minister had to say, given that the Government extol the virtues of data and growth being linked. This is a clear example of where growth and business are being held back. It sounds as though the Government do not want to take on the cost of making sure that this data is freely available; that could be the bottom line here. I am sure that they will keep up the pressure to find a solution in this area; I very much hope that they will do that. In the meantime, I beg leave to withdraw Amendment 74.
Amendment 74 withdrawn.
Amendments 75 to 78 not moved.
Schedule 3 agreed.
Clause 15: General obligations
Amendment 79
Moved by
79: Clause 15, page 30, line 37, at end insert—
“(ba) in paragraph 3(c) for “Article 32” substitute “Articles 25 and 32””
Member’s explanatory statement
This amendment would add data protection by design as an additional measure for processors, to ensure that they are accountable for the design of their systems and services, noting the challenge that controllers often face when engaging processors for services such as AI and cloud computing and what influence they can have on the design.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I will speak to a number of amendments in this group—Amendments 79, 83, 85, 86, 96, 97, 105 and 107.

Amendment 79 proposes an addition to the amendments to Article 28 of the UK GDPR in Clause 15(4). Article 28 sets out the obligations on processors when processing personal data on behalf of controllers. Currently, paragraph 3(c) requires processors to comply with Article 32 of the UK GDPR, which relates to data security. Amendment 79 adds the requirement for processors also to comply with the privacy-by-design provision in Article 25. Article 25 requires controllers to

“at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects”.

I am not proposing an abdication of responsibility by the controller when it instructs a processor to act on its behalf but, in practice, it is hard for a controller to meet this responsibility at the time of processing if it has delegated the processing to a third party that is not bound by the same requirement. I am not normally associated with the edtech sector, but the amendment is of particular importance to it, as schools are the controllers while it is children’s data that is being processed.

The amendment ensures that processors would be contractually committed to complying with Article 25. It is particularly relevant to situations where controllers procure AI systems, including facial recognition technology and edtech products. It would be helpful in both the public and private sectors and would address the power asymmetry between controller and processor when the processor is a multinational and solutions are often presented on a take-it-or-leave-it basis.

I hope noble Lords will forgive me if I take Amendment 97 out of turn, as all the others in my name relate to children’s data, whereas Amendment 97, like Amendment 79, applies to all data subjects. Amendment 97 would require public bodies to publish risk assessments to create transparency and accountability. This would also place in statute a provision that is already contained in the ICO’s freedom of information publication scheme guidance. The amendment would also require the Cabinet Office to create and maintain an accessible register of public sector risk assessments to improve accountability.

In the last group, we heard that the way in which public bodies collect and process personal data has far-reaching consequences for all of us. I was moved to lay this amendment after witnessing some egregious examples from the education system. The public have a right to know how bodies such as health authorities, schools, universities, police forces, local authorities and government departments comply with their obligations under UK data law. This amendment is simply about creating trust.

The child-related amendments in this group are in my name and those of the noble Lord, Lord Clement-Jones, and the noble Baronesses, Lady Harding and Lady Jones. Clause 17 sets out the obligations for the newly created role of “senior responsible individual”, which replaces the GDPR requirement to appoint a data protection officer. The two roles are not equivalent: a DPO is an independent adviser to senior management, while a senior responsible individual would be a member of senior management. Amendment 83 would ensure that those appointed senior responsible individuals have an understanding of the heightened risks and the protections to which children are entitled.

Over the years, I have had many conversations with senior executives at major tech companies and, beyond the lines prepared by their public affairs teams, their understanding of children’s protection is often superficial and their grasp of key issues very limited. In fact, if I had a dollar for every time a tech leader, government affairs person or engineer has said, “I never thought of it that way before”, I would be sitting on quite a fortune.

Amendment 83 would simply ensure that a senior leader who is tasked with overseeing compliance with UK data law knows what he or she is talking about when it comes to children’s privacy, and that it informs the decisions they make. It is a modest proposal, and I hope the Minister will find a way to accept it.

Amendments 85 and 86 would require a controller to consider children’s right to higher standards of privacy than adults for their personal data when carrying out its record-keeping duties. Specifically, Amendment 85 sets out what is appropriate when maintaining records of high-risk processing and Amendment 86 relates to processing that is not high risk. Creating an express requirement to include consideration of these rights in a data controller’s processing record-keeping obligation is a simple but effective way of ensuring that systems and processes are designed with the needs and rights of children front of mind.

Clause 20 is one of the many fault lines where the gap between the assurances given—that children will be just as safe—and the words on the page is clear. I make clear that the amendments to Clause 18 that I put forward are, as the noble Lord, Lord Clement-Jones, said on Monday, belt and braces. They do not reach the standard of protection that children currently enjoy under the risk-assessment provisions in Article 35 of the UK GDPR and the age-appropriate design code.

A comparison of what controllers must include in a data protection impact assessment under Article 35(7) and what they would need to cover in an assessment of high-risk processing under Clause 20(3)(d) shows the inadequacies of the latter. Instead of a controller having to include

“a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller”,

under the Bill, the controller needs to include only

“a summary of the purposes of the processing”.

They need to include no systematic description—just a summary. There is no obligation to include information about the processing operations or to explain when and how the controller has determined that it is entitled to rely on the legitimate interests purpose. Instead of

“an assessment of the necessity and proportionality of the processing operations in relation to the purposes”,

under the Bill, a controller needs to assess only necessity, not proportionality. Instead of

“an assessment of the risks to the rights and freedoms of data subjects”,

under the Bill, a controller does not need to consider rights and freedoms.

As an aside, I note that this conflicts with the proposed amendments to Section 64 of the Data Protection Act 2018 in Clause 20(7)(d), which retains the “rights and freedoms” wording but otherwise mirrors the new downgraded requirements in Clause 20(3)(d). I would be grateful for clarification from the Minister on this point.

Instead of requiring the controller to include information about

“the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with this Regulation taking into account the rights and legitimate interests of data subjects and other persons concerned”,

as currently prescribed in Article 35, under the Bill, the controller needs to provide only

“a description of how the controller proposes to mitigate those risks”.

The granularity of what is currently required is replaced by a generalised reference to “a description”. These are not the same bar. My argument throughout Committee is that we need to maintain the bar for processing children’s data.

15:00
Amendment 96 would retain the current requirement to undertake a comprehensive data protection impact assessment for services likely to be accessed by children. In that amendment, proposed new paragraphs (12) and (13) of Article 35 would retain the current, more detailed requirements under that article, and would require controllers to follow the guidance of the AADC. Proposed new paragraph (14) would require controllers, when preparing a children’s data protection impact assessment, to give due consideration to their interests and rights, the principles under the 2018 Act and
“the views of children or their representatives”.
I hope that the Minister finds the direct comparison of the old and new Article 35(7) compelling, and that he agrees that the standards of protection are different—and worse—under the Government’s proposals. I would be grateful if he would specifically address that point in his reply.
I had so much to say on the detail to try to convince the Minister but, sadly, the new rules on speaking mean that I have not put it all in my speech. In a reversal of Committee norms, I will write to the Minister with my detailed examples, so that the department is fully aware of the level of the downgrade.
Finally, Amendments 105 and 107 would reinstate and reinforce the reporting requirement on controllers in the event that the children’s data protection impact assessment, as proposed in Amendment 96, requires the controller to consult with the commissioner because the processing is high risk. Amendment 105 is consequential, while Amendment 107 is substantive. I beg to move.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I will speak to almost all the amendments in this group, other than those proposed by the noble Baroness, Lady Kidron. I am afraid that this is a huge group; we probably should have split it to have a better debate, but that is history.

I very much support what the noble Baroness said about her amendments, particularly Amendment 79. The mandation of ethics by design is absolutely crucial. There are standards from organisations such as the IEEE for that kind of ethics by design in AI systems. I believe that it is possible to do exactly what she suggested, and we should incorporate that into the Bill. It illustrates that process is as important as outcomes. We are getting to a kind of philosophical approach here, which illustrates the differences between how some of us and the Government are approaching these things. How you do something, the way you design it and the fact that it needs to be ethical is absolutely cardinal in any discussion—particularly about artificial intelligence. I do not think that it is good enough simply to talk about the results of what AI does without examining how it does it.

Having said that, I turn to Amendment 80 and the Clause 16 stand part notice. Under Clause 16, the Government are proposing to remove Article 27 of the UK GDPR without any replacement. By removing the legal requirement on non-UK companies to retain a UK representative, the Government would deprive individuals of a local, accessible point of contact through which people can make data protection rights requests. That decision threatens people’s capacity to exercise their rights, reducing their ability to remain in control of their personal information.

The Government say that removing Article 27 will boost trade with the UK by reducing the compliance burden on non-UK businesses. But they have produced little evidence to support the notion that this will be the case and have overlooked the benefits in operational efficiency and cost savings that the representative can bring to non-UK companies. Even more worryingly, the Government appear to have made no assessment of the impact of the change on UK individuals, in particular vulnerable groups such as children. It is an ill-considered policy decision that would see the UK take a backward step in regulation at a time when numerous other jurisdictions, such as Switzerland, Turkey, South Korea, China and Thailand, are choosing to safeguard the extraterritorial application of their data protection regimes through the implementation of the legal requirement to appoint a representative.

The UK representative ensures that anyone in the UK wishing to make a privacy-related request has a local, accessible point of contact through which to do so. The representative plays a critical role in helping people to access non-UK companies and hold them accountable for the processing of their data. The representative further provides a direct link between the ICO and non-UK companies to enable the ICO to enforce the UK data protection regime against organisations outside the UK.

On the trade issue, the Government argue that by eliminating the cost of retaining a UK representative, non-UK companies will be more inclined to offer goods and services to individuals in the UK. Although there is undeniably a cost to non-UK companies of retaining a representative, the costs are significantly lower than the rather disproportionately inflated figures that were cited in the original impact assessment, which in some cases were up to 10 times the average market rate for representative services. The Government have put forward very little evidence to support the notion that removing Article 27 will boost trade with the UK.

There is an alternative approach. Currently, the Article 27 requirement to appoint a UK representative applies to data controllers and processors. An alternative approach to the removal of Article 27 in its entirety would be to retain the requirement but limit its scope so that it applies only to controllers. Along with the existing exemption at Article 27(2), this would reduce the number of non-UK companies required to appoint a representative, while arguably still preserving a local point of contact through which individuals in the UK can exercise their rights, as it is data controllers that are obliged under Articles 15 to 22 of the UK GDPR to respond to data subject access requests. That is a middle way that the Government could adopt.

Moving to Amendment 82, at present, the roles of senior responsible individual in the Bill and data protection officer under the EU GDPR appear to be incompatible. That is because the SRI is part of the organisation’s senior management, whereas a DPO must be independent of an organisation’s senior management. This puts organisations caught by both the EU GDPR and the UK GDPR in an impossible situation. At the very least, the Government must explain how they consider that these organisations can comply with both regimes in respect of the SRI and DPO provisions.

The idea of getting rid of the DPO runs completely contrary to the way in which we need to think about accountability for AI systems. We need senior management who understand the corporate significance of the AI systems they are adopting within the business. The ideal way forward would be for the DPO to be responsible for that when AI regulation comes in, but the Government seem to be completely oblivious to that. Again, it is highly frustrating for those of us who thought we had a pretty decent data protection regime to find this kind of watering down taking place in the face of the risks from artificial intelligence that are becoming more and more apparent as the days go by. I firmly believe that it will inhibit the application and adoption of AI within businesses if we do not have public trust and business certainty.

I now come to oppose the question that Clause 18, on the duty to keep records, stand part of the Bill. This clause seems to masquerade as an attempt to get rid of red tape. In reality, it makes organisations less likely to be compliant with the main obligations in the UK GDPR, as it will be amended by the Bill, and therefore heightens the risk both to the data subjects whose data they hold and to the organisations in terms of non-compliance. This is, of course, the duty to keep records. It is particularly unfair on small businesses that do not have the resources to take advice on these matters. Records of processing activities are one of the main ways in which organisations can meet the requirements of Article 5(2) of the UK GDPR to demonstrate their compliance. The obligation to demonstrate compliance remains unaltered under the Bill. Therefore, dispensing with the main way of achieving compliance with Article 5(2) is impractical and unhelpful.

At this point, I should say that we support Amendment 81 in the name of the noble Baroness, Lady Jones, which concerns the assessment of high-risk processing.

Our amendments on data protection impact assessments are Amendments 87, 88 and 89. Such assessments are currently required under Article 35 of the UK GDPR and are essential to ensuring that organisations do not deploy, and individuals are not subjected to, systems that may lead to unlawful, rights-violating or discriminatory outcomes. The Government’s data consultation response noted:

“The majority of respondents agreed that data protection impact assessments requirements are helpful in identifying and mitigating risk, and disagreed with the proposal to remove the requirement to undertake data protection impact assessments”.

However, under Clause 20, the requirement to perform an impact assessment would be seriously diluted. That is all I need to say. The Government frequently pray in aid the consultation—they say, “Well, we did that because of the consultation”—so why are they flying in the face of it? That seems an extraordinary thing to do in circumstances where impact assessments are regarded as a useful tool and business has clearly adjusted to them over the years since the Data Protection Act 2018.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

My Lords, I rise to speak in support of Amendments 79, 83, 85, 86, 93, 96, 97, 105 and 107, to which I have added my name. An awful lot has already been said. Given the hour of the day, I will try to be brief, but I want to speak to the child amendments to which I have put my name, and to the non-child ones, and to raise things up a level.

The noble Lord, Lord Clement-Jones, talked about trust. I have spent the best part of the past 15 years running consumer and citizen digitally enabled services. The benefit that technology brings to life is clear to me but—this is a really important “but”—our customers and citizens need to trust what we do with their data, so establishing trust is really important.

One bedrock of that trust is forcing—as a non-technologist, I use that word advisedly—technologists to set out what they are trying to do, what the technology they propose to build will do and what the risks and opportunities of that technology are. My experience as a non-engineer is that, when you put engineers under pressure, they can speak English, but it is not their preferred language. They do not find it easy to articulate the risks and opportunities of the technology they are building, which is why forcing businesses that build these services to set out in advance the data protection impacts of the services they are building is so important. It is also why you have to design with safety in mind up front, because technology is so hard to retrofit. If you do not design it up front with ethics and safety at its core, it is gone by the time you see the impact in the real world.

15:15
The architecture that GDPR has created for us does that with data protection officers, independent individuals with legal responsibilities to speak truth to power in companies, and DPIAs—data protection impact assessments, tedious though they are to pull together, and I have done that—which are a really important exercise in forcing technologists to speak in English and set out what the real risks are.
As I speak, I have this sinking feeling that the Minister is going to say that the Bill as set out does not change any of that really, but I feel that the noble Baroness, Lady Kidron, set out in quite a lot of detail—I am confident that there will be even more when she writes to him—where the regime being proposed in the Bill weakens, rather than strengthens, the transparency and trust-building that I think are so important. Rather than just repeat the same worry, I ask my noble friend whether between Committee and Report we could get engaged in that detail.
I would like to believe that we all want the same thing, which is that we do not want to diminish the trust that citizens hold in public and private services that use their data. We do not want to stop that process that forces engineers to set out what they are trying to do and what the risks are. I am willing to be open-minded that there may be some improvements in the Bill—I cannot see them at the moment—but I think we have to go through that detail. As currently set out, it really looks to me that, as the noble Baroness, Lady Kidron, said, we are replacing granularity with general requirements, when it is granularity that we need. All my experience of building and running digital services is that the huge temptation as a non-technologist is to not get involved in the detail. I give everyone who is not a technologist and starts running a tech business this advice: do not be afraid of the detail, force your engineers to speak in English and get to the detail of what will change for your customers. I feel that we need to do the same with the Bill. I urge the Minister to engage with us between Committee and Report so that we are not voting on very high-end principles but are establishing the detail of what the Bill will do.
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I thank all noble Lords who have contributed to this very wide-ranging debate. Our amendments cover a lot of common ground, and we are in broad agreement on most issues, so I hope noble Lords will bear with me if I primarily focus on the amendments that I have tabled, although I will come back to other points.

We have given notice of our intention to oppose Clause 16 standing part of the Bill; this is similar to Amendment 80, tabled by the noble Lord, Lord Clement-Jones, which probes why the Government have found it necessary to remove the requirement that companies outside the UK should appoint a representative within the UK. The current GDPR rules apply to all those active in the UK market, regardless of whether their organisation is based or located in the UK. The intention is that the representative will ensure UK compliance and act as a primary point of contact for data subjects. Without this clause, data subjects will be forced to deal with overseas data handlers, with all the cultural and language barriers that might ensue. There is no doubt that this will limit their ability to exercise their rights under UK data standards.

In addition, as my colleagues in the Commons identified, the removal of the provisions in Clause 16 was not included in the Government’s consultation, so stakeholders have not had the chance to register some of the many practical concerns that they feel will arise from this change. There is also little evidence that compliance with Article 27 is an unnecessary barrier to responsible data use by reputable overseas companies. Again, this was a point made by the noble Lord, Lord Clement-Jones. In fact, the international trend is for more countries to add a representative obligation to their data protection laws, so we are becoming outliers on the global stage.

Not only is this an unnecessary change but, compared with other countries, it will send a signal that data protection rights are being eroded in the UK. Of course, this raises the spectre of the EU revisiting whether our UK adequacy status should be retained. It also has implications for the different rules that might apply north and south of the border in Ireland. So, again, if we are moving away from the standard rules applied by other countries, this has wider implications that we need to consider.

For many reasons, I challenge the Government to explain why this change was felt to be necessary. The noble Lord, Lord Clement-Jones, talked about whether the cost was really a factor. It did not seem that there were huge costs, compared to the benefits of maintaining the current system, and I would like to know in more detail why the Government are doing this.

Our Amendments 81 and 90 seek to ensure that there is a definition of “high-risk processing” in the Bill. The current changes in Clauses 17 and 20 have the effect of watering down data controllers’ responsibilities, from carrying out data protection impact assessments to assessing high-risk processing on the basis of whether it was necessary and what risks are posed. But nowhere does it say what constitutes high-risk processing—it is left to individual organisations to make that judgment—and nowhere does it explain what “necessary” means in this context. Is it also expected to be proportionate, as in the existing standards? This lack of clarity has caused some consternation among stakeholders.

The Equality and Human Rights Commission argues that the proposed wording means that

“data controllers are unlikely to go beyond minimum requirements”,

so the wording needs to be more explicit. It also recommends that

“the ICO be required to provide detailed guidance on how ‘the rights and freedoms of individuals’ are to be considered in an Assessment of High Risk Processing”.

More crucially, the ICO has written to Peers, saying that the Bill should contain a list of

“activities that government and Parliament view as high-risk processing, similar to the current list set out at Article 35(3) of the UK GDPR”.

This is what our Amendments 81 and 90 aim to achieve. I hope the Minister can agree to take these points on board and come back with amendments to achieve this.

The ICO also makes the case for future-proofing the way in which high-risk processing is regulated by making a provision in the Bill for the ICO to further designate high-risk processing activities with parliamentary approval. This would go further than the current drafting of Clause 20, which contains powers for the ICO to give examples of high-risk profiling, but only for guidance. Again, I hope that the Minister can agree to take these points on board and come back with suitable amendments.

Our Amendments 99, 100 and 102 specify the need for wider factors in the proposed risk assessment list to ensure that it underpins our equality laws. Again, this was an issue about which stakeholders have raised concerns. The TUC and the Institute for the Future of Work make the point that data protection impact assessments are a crucial basis for consultation with workers and trade unions about the use of technology at work, and this is even more important as the complexities of AI come on stream. The Public Law Project argues that, without rigorous risk and impact analysis, disproportionate and discriminatory processes could be carried out before the harm comes to light.

The Equality and Human Rights Commission argues that data protection impact assessments

“provide a key mechanism for ensuring equality impacts are assessed when public and private sector organisations embed AI systems in their operations”.

It specifically recommends that express references in Article 35(7) of GDPR to “legitimate interests” and

“the rights and freedoms of data subjects”,

as well as the consultation obligations in Article 35(2), should be retained. I hope that the Minister can agree to take these recommendations on board and come back with suitable amendments to ensure that our equalities legislation is protected.

Our Amendments 106 and 108 focus on the particular responsibilities of data controllers to handle health data with specific obligations. This is an issue that we know, from previous debates, is a major cause for concern among the general public, who would be alarmed if they thought that the protections were being weakened.

The BMA has raised concerns that Clauses 20 and 21 will water down our high standards of data governance, which are necessary when organisations are handling health data. As it says,

“Removing the requirement to conduct a thorough assessment of risks posed to health data is likely to lead to a less diligent approach to data protection for individuals”.


It also argues that removing the requirement for organisations to consult the ICO on high-risk processing is,

“a backward step from good governance … when organisations are processing large quantities of sensitive health data”.

Our amendments aim to address these concerns by specifying that, with regard to specific cases, such as the handling of health data, prior consultation with the ICO should remain mandatory. I hope that the Minister will see the sense in these amendments and recognise that further action is needed in this Bill to maintain public trust in how health data is managed for individual care and systemwide scientific development.

I realise that we have covered a vast range of issues, but I want to touch briefly on those raised by the noble Baroness, Lady Kidron. She is right, in particular, that the application of risk assessments by public bodies should be maintained, and we agree with her that Article 35’s privacy-by-design requirements should be retained. She once again highlighted the downgrading of children’s rights in this Bill, whether by accident or intent, and we look forward to seeing the exchange of letters with the Minister on this. I hope that we will all be copied in and that the Minister will take on board the widespread view that we should have more engagement on this before Report, because there are so many outstanding issues to be resolved. I look forward to the Minister’s response.

Viscount Camrose (Con)

I thank the noble Baronesses, Lady Kidron and Lady Jones, and the noble Lord, Lord Clement-Jones, for their amendments, and I look forward to receiving the letter from the noble Baroness, Lady Kidron, which I will respond to as quickly as I can. As everybody observed, this is a huge group, and it has been very difficult for everybody to do justice to all the points. I shall do my best, but these are points that go to the heart of the changes we are making. I am very happy to continue engaging on that basis, because we need plenty of time to review them—but, that said, off we go.

The changes the Government are making to the accountability obligations are intended to make the law clearer and less prescriptive. They will enable organisations to focus on areas that pose high risks to people, resulting, the Government believe, in improved outcomes. The new provisions on assessments of high-risk processing are less prescriptive about the precise circumstances in which a risk assessment would be required, as we think organisations are best placed to judge whether a particular activity poses a high risk to individuals in the context of the situation.

However, the Government are still committed to high standards of data protection, and there are many similarities between our new risk assessment measures and the previous provisions. When an organisation is carrying out processing activities that are likely to pose a high risk to individuals, it will still be expected to document that processing, assess risks and identify mitigations. As before, no such document would be required where organisations are carrying out low-risk processing activities.

One of the main aims of the Bill is to remove some of the UK GDPR’s unnecessary compliance burdens. That is why organisations will be required to designate senior responsible individuals, keep records of processing and carry out the risk assessments above only when their activities pose high risks to individuals.

Lord Clement-Jones (LD)

The noble Viscount is very interestingly unpacking a risk-based approach to data protection under the Bill. Why are the Government not taking a risk-based approach to their AI regulation? After all, the AI Act approaches it in exactly that way.

15:30
Viscount Camrose (Con)

That is a very interesting question, but I am not sure that there is a read-across between the AI Act and our approach here. The fundamental starting point was that, although the provisions of the original GDPR are extremely important, the burdens of compliance were not proportionate to the results. The overall foundation of the DPDI is, while at least maintaining existing levels of protection, to reduce the burdens of demonstrating or complying with that regulation. That is the thrust of it—that is what we are trying to achieve—but noble Lords will have different views about how successful we are being at either of those. It is an attempt to make it easier to be safe and to comply with the regulations of the DPDI and the other Acts that govern data protection. That is where we are coming from and the thrust of what we are trying to achieve.

I note that, as we have previously discussed, children need particular protection when organisations are collecting and processing their personal data.

Baroness Jones of Whitchurch (Lab)

I did not interrupt before because I thought that the Minister would say more about the difference between high-risk and low-risk processing, but he is going on to talk about children. One of my points was about the request from the Information Commissioner—it is very unusual for him to intervene. He said that a list of high-risk processing activities should be set out in the Bill. I do not know whether the Minister was going to address that important point.

Viscount Camrose (Con)

I will briefly address it now. Based on that letter, the Government’s view is to avoid prescription, and I believe that the ICO’s view—I cannot speak for it—is generally the same, except for a few examples where prescription needs to be specified in the Bill. I will continue to engage with the ICO on where exactly to draw that line.

Lord Clement-Jones (LD)

My Lords, I can see that there is a difference of opinion, but it is unusual for a regulator to go into print with it. Not only that, but he has set it all out in an annexe. What discussion is taking place directly between the Minister and his team and the ICO? There seems to be quite a gulf between them. This is number 1 among his “areas of ongoing concern”.

Viscount Camrose (Con)

I do not know whether it is usual or unusual for the regulator to engage in this way, but the Bill team engages with the Information Commissioner frequently and regularly, and, needless to say, it will continue to do so on this and other matters.

Children need particular protection when organisations are collecting and processing their personal data, because they may be less aware of the risks involved. If organisations process children’s personal data, they should think about the need to protect them from the outset and design their systems and processes with this in mind.

Before I turn to the substance of what the Bill does with the provisions on high-risk processing, I will deal with the first amendment in this group: Amendment 79. It would require data processors to consider data protection-by-design requirements in the same way that data controllers do, because there is a concern that controllers may not always be able to foresee what processors do with people’s data for services such as AI and cloud computing.

However, under the current legislation, it should not be for the processor to determine the nature or purposes of the processing activity, as it will enter a binding controller-processor agreement or contract to deliver a specific task. Processors also have specific duties under the UK GDPR to keep personal data safe and secure, which should mean that this amendment is not necessary.

I turn to the Clause 16 stand part notice, which seeks to remove Clause 16 from the Bill and reinstate Article 27, and Amendment 80, which seeks to do the same but just in respect of overseas data controllers, not processors. I assure the noble Lord, Lord Clement-Jones, that, even without the Article 27 representative requirement, controllers and processors will still have to maintain contact and co-operation with UK data subjects and the ICO to comply with the UK GDPR provisions. These include Articles 12 to 14, which, taken together, require controllers to provide their contact details in a concise, transparent, intelligible and easily accessible form, using clear and plain language, particularly for any information addressed specifically to a child.

By offering firms a choice on whether to appoint a representative in the UK to help them with UK GDPR compliance and no longer mandating organisations to appoint a representative, we are allowing organisations to decide for themselves the best way to comply with the existing requirements for effective communication and co-operation. Removing the representative requirement will also reduce unnecessary burdens on non-UK controllers and processors while maintaining data subjects’ safeguards and rights. Any costs associated with appointing a representative are a burden on and a barrier to trade. Although the packages made available by representative provider organisations vary, our assessments show that the cost of appointing representatives increases with the size of a firm. Furthermore, there are several jurisdictions that do not have a mandatory or equivalent representative requirement in their data protection law, including other countries in receipt of EU data adequacy decisions.

Lord Clement-Jones (LD)

Nevertheless, does the Minister accept that quite a lot of countries have now begun the process of requiring representatives to be appointed? How does he account for that? Does he accept that what the Government are doing is placing the interests of business over those of data subjects in this context?

Viscount Camrose (Con)

No, I do not accept that at all. I would suggest that we are saying to businesses, “You must provide access to the ICO and data subjects in a way that is usable by all parties, but you must do so in the manner that makes the most sense to you”. That is a good example of going after outcomes but not insisting on any particular process or methodology in a one-size-fits-all way.

Baroness Jones of Whitchurch (Lab)

The Minister mentioned the freedom to choose the best solution. Would it be possible for someone to be told that their contact was someone who spoke a different language to them? Do they have to be able to communicate properly with the data subjects in this country?

Viscount Camrose (Con)

Yes—if the person they were supposed to communicate with did not speak English or was not available during reasonable hours, that would be in violation of the requirement.

I apologise if we briefly revisit some of our earlier discussion here, but Amendment 81 would reintroduce a list of high-risk processing activities drawn from Article 35 of the UK GDPR, with a view to helping data controllers comply with the new requirements around designating a senior responsible individual.

The Government have consulted closely with the ICO throughout the development of all the provisions in the Bill, and we welcome its feedback as it upholds data subjects’ rights. We recognise and respect that the ICO’s view on this issue is different to the Government’s, but the Government feel that adding a prescriptive list to the legislation would not be appropriate for the reasons we have discussed. However, as I say, we will continue to engage with it over the course of the passage of the Bill.

Some of the language in Article 35 of the UK GDPR is unclear and confusing, which is partly why we removed it in the first place. We believe organisations should have the ability to make a judgment of risk based on the specific nature, scale and context of their own processing activities. We do not need to provide prescriptive examples of high-risk processing on the face of legislation because any list could quickly become out of date. Instead, to help data controllers, Clause 20 requires the ICO to produce a document with examples of what the commissioner considers to be high-risk processing activities.

I turn to Clause 17 and Amendment 82. The changes we are making in the Bill will reduce prescription by removing the requirement to appoint a data protection officer in certain circumstances. Instead, public bodies and other organisations carrying out high-risk processing activities will have to designate a senior responsible individual to ensure that data protection risks are managed effectively within their organisations. That person will have flexibility about how they manage data protection risks. They might decide to delegate tasks to independent data protection experts or upskill existing staff members, but they will not be forced to appoint data protection officers if suitable alternatives are available.

The primary rationale for moving to a senior responsible individual model is to embed data protection at the heart of an organisation by ensuring that someone in senior management takes responsibility and accountability for it if the organisation is a public body or is carrying out high-risk processing. If organisations have already appointed data protection officers and want to keep an independent expert to advise them, they will be free to do so, providing that they also designate a senior manager to take overall accountability and provide sufficient support, including resources.

Amendment 83, tabled by the noble Baroness, Lady Kidron, would require the senior responsible individual to specifically consider the risks to children when advising the controller on its responsibilities. As drafted, Clause 17 of the Bill requires the senior responsible individual to perform a number of tasks or, if they cannot do so themselves, to make sure that they are performed by another person. They include monitoring the controller’s compliance with the legislation, advising the controller of its obligations and organising relevant training for employees who carry out the processing of personal data. Where the organisation is processing children’s data, all these requirements will be relevant. The senior responsible individual will need to make sure that any guidance and training reflects the type of data being processed and any specific obligations the controller has in respect of that data. I hope that this goes some way to convincing the noble Baroness not to press her amendment.

Lord Clement-Jones (LD)

The Minister has not really explained the reason for the switch from the DPO to the new system. Is it another one of his “We don’t want a one-size-fits-all approach” arguments? What is the underlying rationale for it? Looking at compliance costs, which the Government seem to be very keen on, we will potentially have a whole new cadre of people who will need to be trained in compliance requirements.

Viscount Camrose (Con)

The data protection officer—I speak as a recovering data protection officer—is tasked with certain specific outcomes but does not necessarily have to be a senior person within the organisation. Indeed, in many cases, they can be an external adviser to the organisation. On the other hand, the senior responsible individual is a senior or board-level representative within the organisation and can take overall accountability for data privacy and data protection for that organisation. Once that accountable person is appointed, he or she can of course appoint a DPO or equivalent role or separate the role among other people as they see fit. That gives everybody the flexibility to meet the needs of privacy as they see fit, but not necessarily in a one-size-fits-all way. That is the philosophical approach.

Lord Clement-Jones (LD)

Does the Minister accept that the SRI will have to cope with having at least a glimmering of an understanding of what will be a rather large Act?

Viscount Camrose (Con)

Yes, the SRI will absolutely have to understand all the organisation’s obligations under this Act and indeed other Acts. As with any senior person in any organisation responsible for compliance, they will need to understand the laws that they are complying with.

Amendment 84, tabled by the noble Lord, Lord Clement-Jones, is about the advice given to senior responsible individuals by the ICO. We believe that the commissioner should have full discretion to enforce data protection in an independent, flexible, risk-based and proportionate manner. The amendment would tie the hands of the regulator and force them to give binding advice and proactive assurance without full knowledge of the facts, undermining their regulatory enforcement role.

15:45
Clause 18 deals with the new record-keeping requirements in the Bill. The Clause 18 stand part debate in the name of the noble Lord, Lord Clement-Jones, would remove the clause in favour of retaining the existing requirements in Article 30 of the UK GDPR. However, those provisions require most organisations to keep records of their processing activities and include a list of requirements that should be included in the record. That can lead to unnecessary paperwork, form-filling and cost, and to less focus on higher-risk processing. Although there is an exemption from these requirements in Article 30 of the UK GDPR for small businesses, it has a limited impact because it does not apply to processing that is not “occasional”, that poses risks to people or that involves special categories of data.
Clause 18 will replace the record-keeping requirements under Article 30. It will make it easier for data controllers to understand exactly what needs to be included in the record. Most importantly, organisations of any size will no longer have to keep records of processing, unless their activities are
“likely to result in a high risk to … individuals”.
That should help small businesses in particular, which have found the current small business exemption difficult to understand and apply in practice. Organisations will need to continue to comply with the data protection principles, even if they are no longer required to keep records of processing.
Amendments 85 and 86, put forward by the noble Baroness, Lady Kidron, would require any records kept by controllers to take account of the fact that a higher standard of protection is needed for children than adults. However, the clause already requires organisations to consider the context and nature of the processing and the likely risks arising to people of all ages when determining whether the record-keeping provisions apply. The ICO will be required to publish a document with a list of examples that it considers to be high-risk processing activities and we—
Baroness Jones of Whitchurch (Lab)

The Minister has reached his 20 minutes. We nudged him at 15 minutes.

Viscount Camrose (Con)

Do I have to shut up?

Lord Harlech (Con)

My Lords, just for clarification, because a number of questions were raised, if the Committee feels that it would like to hear more from the Minister, it can. It is for the mood of the Committee to decide.

Baroness Kidron (CB)

I would like to hear from the Minister.

Lord Clement-Jones (LD)

Yes. We will not stand on ceremony.

Baroness Jones of Whitchurch (Lab)

As long as that applies to us on occasion as well.

Viscount Camrose (Con)

I apologise for going over. I will try to be as quick as possible.

I turn now to the amendments on the new provisions on assessments of high-risk processing in Clause 20. Amendments 87, 88, 89, 91, 92, 93, 94, 95, 97, 98 and 101 seek to reinstate requirements from Article 35 of the UK GDPR on data protection impact assessments and, in some areas, make them even more onerous for public authorities. Amendment 90 seeks to reintroduce a list of high-risk processing activities drawn from Article 35, with a view to helping data controllers comply with the new requirements on carrying out assessments of high-risk processing.

Amendment 96, tabled by the noble Baroness, Lady Kidron, seeks to amend Clause 20, so that, where an internet service is likely to be accessed by children, the processing is automatically classed as high risk and the controller must do a children’s data protection impact assessment. Of course, I fully understand why the noble Baroness would like those measures to apply automatically to organisations processing children’s data, and particularly to internet services likely to be accessed by children. It is highly likely that many of the internet services that she is most concerned about will be undertaking high-risk activities, and they would therefore need to undertake a risk assessment.

Under the current provisions in Clause 20, organisations will still have to undertake risk assessments where their processing activities are likely to pose high risks to individuals, but they should have the ability to assess the level of risk based on the specific nature, scale and context of their own processing activities. Data controllers do not need to be directed by government or Parliament about every processing activity that will likely require a risk assessment, but the amendments would reintroduce a level of prescriptiveness that we were seeking to remove.

Clause 20 requires the ICO to publish a list of examples of the types of processing activities that it considers would pose high risks for the purposes of these provisions, which will help controllers to determine whether a risk assessment is needed. This will provide organisations with more contemporary and practical help than a fixed list of examples in primary legislation could. The ICO will be required to publish a document with a list of examples that it considers to be high-risk processing activities, and we fully expect the vulnerability and age of data subjects to be a feature of that. The commissioner’s current guidance on data protection impact assessments already describes the use of the personal data of children or other vulnerable individuals for marketing purposes, profiling or offering internet services directly to children as examples of high-risk processing, although the Government cannot of course tell the ICO what to include in its new guidance.

Similarly, in relation to Amendments 99, 100 and 102 from the noble Baroness, Lady Jones, it should not be necessary for this clause to specifically require organisations to consider risks associated with automated decision-making or obligations under equalities legislation. That is because the existing clause already requires controllers to consider any risks to individuals and to describe

“how the controller proposes to mitigate those risks”.

I am being asked to wrap up and so, in the interests of time, I shall write with my remaining comments. I have no doubt that noble Lords are sick of the sound of my voice by now.

Baroness Kidron (CB)

My Lords, I hope that no noble Lord expects me to pull all that together. However, I will mention a couple of things.

With this group, the Minister has finally given all the reasons why everything will be different and less. Those responsible for writing the Minister’s speeches should be more transparent about the Government’s intention, because “organisations are best placed to determine what is high-risk”—not the ICO, not Parliament, not existing data law. Organisations also act in their own interests. They are “best placed to decide on their representation”, whether it is here or there and whether it speaks English or not, and they “get to decide whether they have a DPO or a senior responsible individual”. Those are three quotes from the Minister’s speech. If organisations set the bar for data protection and the definition of data protection, I do believe that this is a weakening of the data protection regime. He also said that organisations are responsible for the quality of their own risk assessments. That makes four such instances in this group alone.

At the beginning, the noble Baroness, Lady Harding, talked about the trust of consumers and citizens. I do not think that this engenders trust. The architecture is so keen to get rid of ways of accessing rights that some organisations may have to have a DPO and a DPIA—a doubling rather than a reduction of burden. Very early on—it feels a long time ago—a number of noble Lords talked about the granular detail. I tried in my own contribution to show how very different it is in detail. So I ask the Minister to reflect on the assertion that you can take out the detail and have the same outcome. All the burden being removed is on one side of the equation, just as we enter into a world in which AI, which is built on people’s data, is coming in the other direction.

I will of course withdraw my amendment, but I believe that Clauses 20, 18 and the other clauses we just discussed are deregulation measures. That should be made clear from the Dispatch Box, and that is a choice that the House will have to make.

Before I sit down, I do want to recognise one thing, which is that the Minister said that he would work alongside us between now and Report; I thank him for that, and I accept that. I also noted that he said that it was a responsibility to take care of children by default. I agree with him; I would like to see that in the Bill. I beg leave to withdraw my amendment.

Amendment 79 withdrawn.
Clause 15 agreed.
Clause 16: Removal of requirement for representatives for controllers etc outside the UK
Amendment 80 not moved.
Clause 16 agreed.
Clause 17: Senior responsible individual
Amendments 81 to 84 not moved.
Clause 17 agreed.
Clause 18: Duty to keep records
Amendments 85 and 86 not moved.
Clause 18 agreed.
Clause 19: Logging of law enforcement processing
Debate on whether Clause 19 should stand part of the Bill.
Lord Clement-Jones (LD)

My Lords, just in passing, I will say that I am beginning to feel that the decision made by the Privileges Committee, and now the House, is starting to creak in the face of the very first Grand Committee to encounter it. In terms of time limits, I think flexibility in Grand Committee in particular is absolutely crucial. I am afraid that the current procedures will not necessarily stand the test of time—but we shall see.

This is a relatively short debate on whether Clause 19 should stand part, but it is a really significant clause, and it is another non-trust-engendering provision. This basically takes away the duty of the police to provide justification for why they are consulting or sharing personal data. Prompted by the National AIDS Trust, we believe that the Bill must retain the duty on police forces to justify why they have accessed an individual’s personal data.

This clause removes an important check on police processing of an individual’s personal data. The NAT has been involved in cases of people living with HIV whose HIV status was shared without their consent by police officers, both internally within their police station and within the wider communities that they serve. The requirement that police officers justify why they have accessed an individual’s personal data therefore provides vital evidence in cases of police misconduct, such as when a person’s HIV status is shared inappropriately by the police or when it is not relevant to an investigation of criminal activity.

The noble Baroness, Lady Kidron, was extremely eloquent in her winding up of the last group. The Minister really needs to come back and tell us what on earth the motivation is behind this particular Clause 19. I beg to move that this clause should not stand part of the Bill.

16:00
Lord Bassam of Brighton (Lab)

As the noble Lord, Lord Clement-Jones, explained, his intention to oppose the question that Clause 19 stand part seeks to retain the status quo. As I read it, Section 62 of the Data Protection Act 2018 obliges competent authorities to keep logs of their processing activities, whether for the collection, alteration, consultation, disclosure, combination or erasure of personal data. The primary purpose is self-monitoring, largely linked to disciplinary proceedings, as the noble Lord said, where an officer has become a suspect by virtue of inappropriately accessing PNC-held data.

Clause 19 removes the requirement for a competent authority to record a justification in its logs of the consultation or disclosure of personal data. The Explanatory Notes to the Bill explain this change as follows:

“It is … technologically challenging for systems to automatically record the justification without manual input”.


That is not a sufficiently strong reason for removing the requirement, not least because the remaining requirements of Section 62 of the Data Protection Act 2018 relating to the logs of consultation and disclosure activity will be retained, and they include the need to record the date and time of access and, as far as possible, the identity of the person accessing the data. Presumably these can still be manually input, so why remove the one piece of data that might, in an investigation of abuse or misuse of the system, be useful as evidence, including self-incriminating evidence? I do not understand the logic behind that at all.

I rather think the noble Lord, Lord Clement-Jones, has an important point. He has linked it to people living with HIV, and I am sure that there are other victims on whose behalf cases could be brought forward. I am not convinced that the clause should stand part, and we support the noble Lord in seeking its deletion.

Viscount Camrose (Con)

This is a mercifully short group on this occasion. I thank the noble Lord, Lord Clement-Jones, for the amendment, which seeks to remove Clause 19 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record when personal data has been accessed and why. Clause 19 does not remove the need for police to justify their processing; it simply removes the ineffective administrative requirement to record that justification in a log.

The justification entry was intended to help to monitor and detect unlawful access. However, the reality is that anyone accessing data unlawfully is very unlikely to record an honest justification, making this in practice an unreliable means of monitoring misconduct or unlawful processing. Records of when data was accessed and by whom can be automatically captured and will remain, thereby continuing to ensure accountability.

In addition, the National Police Chiefs’ Council’s view is that this change will not hamper any investigations to identify the unlawful processing of data. That is because it is unlikely that an individual accessing data unlawfully would enter an honest justification, so capturing this information is unlikely to be useful in any investigation into misconduct. The requirements to record the time, date and, as far as possible, the identity of the person accessing the data will remain, as will the obligation that there is a lawful reason for the access, ensuring that accountability and protection for data subjects are maintained.
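To make that concrete, here is a minimal sketch in Python of a Section 62-style log entry; the class and field names are illustrative only and are not drawn from the Act or the Bill.

from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class AccessLogEntry:
    timestamp: datetime            # when the data was consulted or disclosed (retained)
    officer_id: str                # identity of the person accessing the data, so far as possible (retained)
    action: str                    # "consultation" or "disclosure" (retained)
    justification: Optional[str]   # required under the current law; the one field Clause 19 would drop

# Under the current law the justification must be completed; under Clause 19 it may simply be absent.
entry = AccessLogEntry(datetime.now(), "officer-1234", "consultation", None)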

Police officers inform us that the current requirement places an unnecessary burden on them as they have to update the log manually. The Government estimate that the clause could save approximately 1.5 million policing hours, representing a saving in the region of £46.5 million per year.
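As a rough check, and assuming both figures quoted are annual, those numbers imply a valuation of about £31 per policing hour:

\[ \pounds 46{,}500{,}000 \div 1{,}500{,}000\ \text{hours} = \pounds 31\ \text{per hour} \]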

I understand that the amendment relates to representations made by the National AIDS Trust concerning the level of protection for people’s HIV status. As I believe I said on Monday, the Government agree that the protection of people’s HIV status is vital. We have met the National AIDS Trust to discuss the best solutions to the problems it has raised. For these reasons, I hope the noble Lord will not oppose Clause 19 standing part.

Lord Clement-Jones (LD)

I thank the Minister for his response, but he has left us tantalised about the outcome of his meeting. What is the solution that he has suggested? We are none the wiser as a result of his response.

This pudding has been well over-egged by the National Police Chiefs’ Council. Already, only certain senior officers and the data protection leads in police forces have access to this functionality. There will continue to be a legal requirement to record the time and date of access. They are required to follow a College of Policing code of practice. Is the Minister really saying that recording a justification for accessing personal data is such an onerous requirement that £46.5 million in police time will be saved as a result of this? Over what period? That sounds completely disproportionate.

The fact is that the recording of the justification, even if it is false and cannot itself be relied upon as evidence, is rather useful, because it is evidence of police misconduct in relation to inappropriately accessing personal data: the officer is actually saying, “We did it for this purpose”, when it clearly was not for that purpose. I am not at all surprised that the National AIDS Trust is worried about this. The College of Policing code of practice does not mention logging requirements in detail. It references them just once, in relation to automated systems that process data.

I am extremely grateful to the noble Lord, Lord Bassam, for what he had to say. It seems to me that we do not have any confidence on this side of the House that removing this requirement provides enough security that officers will be held to account if they share an individual’s special category data inappropriately. I do not think the Minister has really answered the concerns, but I beg leave to withdraw my objection to the clause standing part.

Clause 19 agreed.
Clause 20: Assessment of high risk processing
Amendments 87 to 102 not moved.
Amendment 103
Moved by
103: Clause 20, page 41, line 34, at end insert—
“(e) a description of how the controller will enforce purpose limitation, and
(f) evidence of how individual information rights are enabled at the point of collection and after processing (if subsection (3A) is not routinely applied)”
Member’s explanatory statement
Large language models are accessing data that includes personal data. There are existing web protocols that can prevent this, but they are little known, difficult to navigate and require an opt-out. This amendment and another in my name to Clause 20 would require either proof of legitimate interest or prior permission from data subjects, unless the company routinely gives an easily accessible, machine-readable opt-in.
Baroness Kidron (CB)

My Lords, I am somewhat disappointed to be talking to these amendments in the dying hours of our Committee before we take a break because many noble Lords—indeed, many people outside the House—have contacted me about them. I particularly want to record the regret of the noble Lord, Lord Black, who is a signatory to these amendments, that he is unable to be with us today.

The battle between rights-holders and the tech sector is nothing new. Many noble Lords will remember the arrival and demise of the file-sharing platform Napster and the subsequent settlement between the sector and the giants of the creative industries. Napster argued that it was merely providing a platform for users to share files and was not responsible for the actions of its users; the courts sided with the music industry, and Napster was ordered to shut down its operations in 2001. The “mere conduit” argument was debunked two decades ago. To the frustration of many of us, the lawsuits led to a perverse outcome: violent bullying or sexually explicit content would be left up for days, weeks or forever, while a birthday video with the temerity to have music in the background would be deleted almost immediately.

The emergence of the large language models—LLMs—and the desire on the part of LLM developers to scrape the open web to capture as much text, data and images as possible raise some of the same issues. The scale of scraping is, by their own admission, unprecedented, and their hunger for data at any cost in an arms race for AI dominance is publicly acknowledged, setting up a tension between the companies that want the data on one side and data subjects and creative rights holders on the other. A data controller who publishes personal data as part of a news story, for example, may do so on the basis of an exemption under data protection law for journalism, only for that data to be scraped and commingled with other data scraped from the open web to train an LLM.

This raises issues of copyright infringement; more importantly—whether for individuals, creative communities or businesses that depend on the value of what they produce—these scraping activities happen invisibly. Anonymous bots acting on behalf of AI developers, or conducting a scrape as a potential supplier to AI developers, are scraping websites without notifying data controllers or data subjects. In doing so, they are also silent on whether processes are in place to minimise risks or balance competing interests, as required by current data law.

Amendment 103 would address those risks by requiring documentation and transparency. Proposed new paragraph (e) would require an AI developer to document how the data controller will enforce purpose limitation. This is essential, given that invisible data processing enabled through web scraping can pick up material that is published for a legitimate purpose, such as journalism, but the combination of such information with other data accessed through invisible data processing could change the purpose and application of that data in ways that the individual may wish to object to, using their existing data rights. Proposed new paragraph (f) would require a data processor seeking to use legitimate interest as the basis for web scraping and invisible processing to build LLMs to document evidence of how they have ensured that individual information rights have been enabled at the point of collection and after processing.

Together, those proposed new paragraphs would mean that anyone who scrapes web data must be able to show that data subjects have meaningful control and can access their information rights ahead of processing. These requirements would be mandatory unless the scraper has incorporated an easily accessible, machine-readable protocol on an opt-in basis, which is the subject of Amendment 104.

Amendment 104 would require web scrapers to establish an easily accessible, machine-readable protocol that works on an opt-in basis rather than the current opt-out. Undoubtedly, the words “easily”, “accessible”, “machine readable” and “web protocols” would all benefit from guidance from the ICO but, for the avoidance of doubt, the intention of the amendment is that a web scraper would proactively notify individuals and website owners that scraping of their data will take place, including stating the identity of the data processor and the purpose for which that data is to be scraped. In addition, the data processor will provide information on how data subjects and data controllers can exercise their information rights to opt out of their data being scraped before any such scraping takes place, with an option to object after the event if taken without permission.
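By way of illustration only, here is a minimal Python sketch of how a scraper might check such an opt-in signal. The header name X-AI-Training-Consent and the default-deny rule are hypothetical; neither the amendment nor any existing standard prescribes them.

from urllib.request import urlopen

def may_scrape_for_training(url: str) -> bool:
    # Default-deny: absent an explicit opt-in signal from the site, scraping is not permitted.
    with urlopen(url) as response:
        consent = response.headers.get("X-AI-Training-Consent", "deny")
    return consent.lower() == "allow"

A compliant scraper would, in addition, identify itself and state its purpose before processing, so that data subjects and controllers could exercise their information rights in advance.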

We are in a situation in which not only is IP being taken at scale—potentially impoverishing our very valuable creative industries, journalism and academic work, which is then regurgitated inaccurately—but a mockery is being made of individual data rights. In its recent consultation into the lawful basis for web scraping, the ICO determined that use of web-scraped data

“can be feasible if generative AI developers take their legal obligations seriously and can evidence and demonstrate this in practice”.

These amendments would operationalise that demonstration. As it stands, there is routine failure, particularly regarding new models. For example, the ICO’s preliminary enforcement notice against Snap states that its risk assessment for its AI tool was inadequate.

Noble Lords will appreciate the significance of the connection that the ICO draws between innovative technology and children’s personal data, given the heightened data rights and protections that children are afforded under the age-appropriate design code. While I welcome the ICO’s action, holders of intellectual copyright have been left to fend for themselves, since government talks have failed and individual data subjects are left exposed. Whether it is the scraping of social media or of work and school websites, such cases will not be pursued by the ICO, because regulating action in such small increments is disproportionate; yet this lack of compliance is happening at scale.

16:15
The ICO suggests that developers using web-scraped data collected on either a first-party or third-party basis to train generative AI models need to be able to:
“Evidence and identify a valid and clear interest”.
They also need to:
“Consider the balancing test particularly carefully”,
weighing the developer’s interest against individual interests when individuals are unlikely to know that their personal data is being used in this way and the developer does
“not or cannot exercise meaningful control over the use of the model”,
and to
“Demonstrate how the interest they have identified will be realised, and how the risks to individuals will be meaningfully mitigated, including their access to their information rights”.
None of these requirements is currently being met, and all this should be seen in the light of the debates on previous groups. The Minister has already told noble Lords that negotiations between rights holders and the tech sector have failed—or, as I believe he said, “Sadly, no consensus was reached”.
Across the world, IP holders are going to the courts. The New York Times is suing Microsoft and OpenAI for what it claims is the large-scale commercial exploitation of its content to train OpenAI’s ChatGPT, Microsoft Bing Chat and Microsoft 365 Copilot. As we will discuss later in Committee, the casual scraping of images of children from public places is happening at scale, and these images are turning up as AI-generated children, some of which then find their way into AI-generated CSAM and other violent and predatory material.
The new LLMs promise vast changes to society, some of which are tantalisingly close, such as leaps in medical science, and others that we should all hope are further away, such as widespread unemployment or lethal robots. The two amendments in my name will not solve all the issues that we really should be discussing rather than messing around at the edges of the GDPR, but, while modest in nature, they would be transformative for data subjects and rights holders. They would allow a point of negotiation about the value of what is being shared by giving an option not to share. They also give the regulator a more robust avenue to consider the risks to individuals, including vulnerable users. Surely, we should not tolerate a situation in which an entire school website, or social media content including family photographs, is taken silently, without permission.
Finally, while the requirement to opt in that I am proposing is new, the technology is not. Unknown to almost all users of the digital world, there have long been protocols such as robots.txt that work on the basis that you can signal to a web scraper that you do not wish them to scrape your data. These protocols are currently the equivalent of a polite parish notice, with no consequences if they are ignored by web scrapers, whether a large corporation, an innovative start-up or someone acting on behalf of a foreign power. Given the arms race currently taking place to build LLMs and new forms of generative AI to service everything from creative activities to public services and military applications, these protocols are long overdue an upgrade, which my amendments seek to provide.
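For readers unfamiliar with robots.txt, here is a minimal sketch using only Python’s standard library; the bot name and URLs are placeholders. Note that nothing enforces the answer: a scraper that ignores it faces no technical consequence, which is the “polite parish notice” problem just described.

from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # fetches and parses the site's robots.txt over the network

# A well-behaved scraper checks before fetching; compliance is entirely voluntary.
if parser.can_fetch("ExampleScraperBot", "https://example.com/family-photos/"):
    print("robots.txt permits fetching this path")
else:
    print("robots.txt asks scrapers not to fetch this path")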
The reality is that, without an obligation on these scrapers to provide transparency about who they are, the identity of their scraper and the purpose for which they are scraping before the activity takes place, it is currently impossible for almost anyone, whether a data controller or a data subject, to exercise their information rights. When the Minister responds, I hope that he will acknowledge that it is neither proportionate nor practical to ask the general public or small businesses to undertake a computer science degree or equivalent in order to access their data rights, and that the widespread abuse of UK data rights by web scraping without permission undermines the very purpose of the legislation. I beg to move.
Lord Clement-Jones (LD)

My Lords, given the hour, I will be brief. That was an absolute tour de force by the noble Baroness and, as with the Minister’s speeches, I will read her speech over Easter.

I was very interested to be reminded of the history of Napster, because that was when many of us realised that we were, in many ways, entering the digital age in the creative industries and beyond. The amendments that the noble Baroness has put forward are examples of where the Bill could make a positive impact, unlike so much of the rest of it, which waters down rights. She described cogently how large language models are ingesting or scraping data from the internet, social media and journalism, how close this whole agenda comes to the ingestion of copyright material and how it is being done by anonymous bots in particular. It fits very well with the debate in which the Minister was involved last Friday on the Private Member’s Bill of the noble Lord, Lord Holmes, which includes a clause requiring transparency about the ingestion or scraping of data and copyright material by large language models. It is very interesting.

The opportunity in the data area is currently much greater than it is in the intellectual property area. At least we have the ICO, which is a regulator, unlike the IPO, which is not really a regulator with teeth. I am very interested in the fact that the ICO is conducting a consultation on generative AI and data protection, which it launched in January. Conterminously with this Bill, perhaps the ICO might come to some conclusions that we can use. That would of course include the whole area of biometrics which, in the light of things such as deepfakes, is increasingly an issue of great concern. The watchword is “transparency”: we must impose a duty on the generative AI models to be transparent about the material they use to train their models and then use in operation. I fully support Amendments 103 and 104 in the name of the noble Baroness, even though, as she describes them, they are a small step.

Baroness Jones of Whitchurch (Lab)

My Lords, I, too, will be relatively brief. I thank the noble Baroness, Lady Kidron, for her amendments, to which I was very pleased to add my name. She raised an important point about the practice of web scrapers, who take data from a variety of sources to construct large language models without the knowledge or permission of web owners and data subjects. This is a huge issue that should have been a much more central focus of the Bill. Like the noble Baroness, I am sorry that the Government did not see fit to use the Bill to bring in some controls on this increasingly prevalent practice, because that would have been a more constructive use of our time than debating the many unnecessary changes that we have been debating so far.

As the noble Baroness said, large language models are built on capturing text, data and images from infinite sources without the permission of the original creator of the material. As she also said, it is making a mockery of our existing data rights. It raises issues around copyright and intellectual property, and around personal information that is provided for one purpose and commandeered by web scrapers for another. That process often happens in the shadows, whereby the owner of the information finds out only much later that their content has been repurposed.

What is worse is that the application of AI means that material provided in good faith can be distorted or corrupted by the bots scraping the internet. The current generation of LLMs are notorious for hallucinations in which good quality research or journalistic copy is misrepresented or misquoted in its new incarnation. There are also numerous examples of bias creeping into the LLM output, which includes personal data. As the noble Baroness rightly said, the casual scraping of children’s images and data is undermining the very essence of our existing data protection legislation.

It is welcome that the Information Commissioner has intervened on this. He argued that LLMs should be compliant with the Data Protection Act and should evidence how they are complying with their legal obligations. This includes individuals being able to exercise their information rights. Currently, we are a long way from that being a reality in practice. This is about enforcement as much as giving guidance.

I am pleased that the noble Baroness tabled these amendments. They raise important issues about individuals giving prior permission for their data to be used unless there is an easily accessible opt-out mechanism. I would like to know what the Minister thinks about all this. Does he think that the current legislation is sufficient to regulate the rise of LLMs? If it is not, what are the Government doing to address the increasingly widespread concerns about the legitimacy of web scraping? Have the Government considered using the Bill to introduce additional powers to protect against the misuse of personal and creative output?

In the meantime, does the Minister accept the amendments in the name of the noble Baroness, Lady Kidron? As we have said, they are only a small part of a much bigger problem, but they are a helpful initiative to build in some basic protections in the use of personal data. This is a real challenge to the Government to step up to the mark and be seen to address these important issues. I hope the Minister will say that he is happy to work with the noble Baroness and others to take these issues forward. We would be doing a good service to data citizens around the country if we did so.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank the noble Baroness, Lady Kidron, for tabling these amendments. I absolutely recognise their intent. I understand that they are motivated by a concern about invisible types of processing or repurposing of data when it may not be clear to people how their data is being used or how they can exercise their rights in respect of the data.

On the specific points raised by noble Lords about intellectual property rather than personal data, I note that, in their response to the AI White Paper consultation, the Government committed to provide a public update soon on their approach to AI and intellectual property, noting the importance of greater transparency in the use of copyrighted material to train models, as well as labelling and attribution of outputs.

Amendment 103 would amend the risk-assessment provisions in Clause 20 so that any assessment of high-risk processing would always include an assessment of how the data controller would comply with the purpose limitation principle and how any new processing activity would be designed so that people could exercise their rights in respect of the data at the time it was collected and on any subsequent occasion.

I respectfully submit that this amendment is not necessary. The existing provisions in Clause 20, on risk assessments, already require controllers to assess the potential risks their processing activities pose to individuals and to describe how those risks would be mitigated. This would clearly include any risk that the proposed processing activities would not comply with the data protection principles—for example, because they lacked transparency—and would make it impossible for people to exercise their rights.

Similarly, any assessment of risk would need to take account of any risks related to difficulties in complying with the purpose limitation principle—for example, if the organisation had no way of limiting who the data would be shared with as a result of the proposed processing activity.

According to draft ICO guidance on generative AI, the legitimate interests lawful ground under Article 6(1)(f) of the UK GDPR can be a valid lawful ground for training generative AI models on web-scraped data, but only when the model’s developer can ensure that they pass the three-part test—that is, they identify a legitimate interest, demonstrate that the processing is necessary for that purpose and demonstrate that the individual’s interests do not override the interest being pursued by the controller.

Controllers must consider the balancing test particularly carefully when they do not or cannot exercise meaningful control over the use of the model. The draft guidance further notes that it would be very difficult for data controllers to carry out their processing activities in reliance on the legitimate interests lawful ground if those considerations were not taken into account.

16:30
Amendment 104 aims to make sure that any requirements to consult data subjects on potential risks identified by an assessment could be deemed to have been achieved if individuals are given an easy and effective way of opting out of the planned processing. The Government’s concern is that this seems to conflate the purpose and role of risk assessments with separate requirements under the UK GDPR to process data fairly and transparently, and in a way which allows people to exercise their rights in respect of their data.

The Bill does not alter existing rights people have in respect of access to their data, objection to its processing, or requests for it to be rectified or deleted. These rights will continue to apply to processing activities undertaken by the developers of innovative technologies. It is up to the developers of those technologies to make sure that they can comply with existing requirements. Noble Lords may be aware—indeed, it was mentioned—that the ICO has been consulting innovators and other relevant organisations on draft guidance on how aspects of data protection law apply to the development and use of generative AI models, to help make sure they are developed and deployed responsibly and with the trust of the people whose data they are built on.

For these reasons, I am not able to accept these amendments. I am of course willing to continue to engage with all Members of the Committee, but I hope that the noble Baroness will withdraw her amendment.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones, for their support. I will read the Minister’s speech, because this is a somewhat technical matter. I am not entirely sure that I agree with what he said, but I am also not sure that I could disagree with it adequately in the moment.

I will make two general points, however. First, I hear the Minister loud and clear on the question of the Government’s announcement on AI and IP but, at the beginning of my speech, I referenced Napster and how we ended up with personal data. The big guys won the battle for copyright, so we will see the likes of the New York Times, EMI and so on winning this battle, but small creatives and individuals will not be protected. I hope that, when that announcement comes, it includes the personal data issue as well.

Secondly, I say to the Minister that, if it is working now in the way he outlined from the ICO, then I do not think anybody thinks it is working very well. Either the ICO needs to do something, or we need to do something in this Bill. If not, we are letting all our data be taken for free to build the new world with no permission.

I know that the noble Viscount is interested in this area. It is one in which we could be creative. I suggest that we try to solve the conundrum about whether the ICO is not doing its work or we are not doing ours. I beg leave to withdraw my amendment.

Amendment 103 withdrawn.
Amendments 104 and 104A not moved.
Clause 20 agreed.
Clause 21: Consulting the Commissioner prior to processing
Amendments 105 to 108 not moved.
Clause 21 agreed.
Clauses 22 to 24 agreed.
Schedule 4: Obligations of controllers and processors: consequential amendments
Amendment 109 not moved.
Schedule 4 agreed.
Clause 25: Transfers of personal data to third countries and international organisations
Amendment 110
Moved by
110: Clause 25, page 44, line 18, leave out subsection (3) and insert—
“(3) In Schedule 7—
(a) Part 1 contains minor and consequential amendments, and
(b) Part 2 contains transitional provision.”
Member’s explanatory statement
This amendment is consequential on the amendment in my name inserting amendments of section 119A of the Data Protection Act 2018 into Schedule 7 to the Bill.
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, UK law enforcement authorities processing personal data for law enforcement purposes currently use internationally based companies for data processing services, including cloud storage. The use of international processors is critical for modern organisations and law enforcement is no exception. The use of these international processors enhances law enforcement capabilities and underpins day-to-day functions.

Transfers from a UK law enforcement authority to an international processor are currently permissible under the Data Protection Act 2018. However, there is currently no bespoke mechanism for these transfers in Part 3, which has led to confusion and ambiguity as to how law enforcement authorities should approach the use of such processors. The aim of this amendment is to provide legal certainty to law enforcement authorities in the UK, as well as transparency to the public, so that they can use internationally based processors with confidence.

I have therefore tabled Amendments 110, 117 to 120, 122 to 129 and 131 to provide a clear, bespoke mechanism in Part 3 of the Data Protection Act 2018 for UK law enforcement authorities to use when transferring data to their contracted processors based outside the UK. This will bring Part 3 into line with the UK GDPR while clarifying the current law, and give UK law enforcement authorities greater confidence when making such transfers to their contracted processors for law enforcement purposes.

We have amended Section 73—the general principles for transfer—to include a specific reference to processors, ensuring that international processors can be a recipient of data transfers. In doing so, we have ensured that the safeguards within Chapter 5 that UK law enforcement authorities routinely apply to transfers of data to their international operational equivalents are equally applicable to transfers to processors. We are keeping open all the transfer mechanisms so that data can be transferred on the basis of an applicable adequacy regulation, the appropriate safeguards or potentially the special circumstances.

We have further amended Section 75—the appropriate safeguards provision—to include a power for the ICO to create, specifically for Part 3, an international data transfer agreement, or IDTA, to complement the IDTA which it has already produced to facilitate transfers using Article 46(2)(d) of the UK GDPR.

In respect of transfers to processors, we have disapplied the duty to inform the Information Commissioner about international transfers made subject to appropriate safeguards. Such a requirement would be out of line with equivalent provisions in the UK GDPR. There is no strong rationale for retaining the provision, given that processors are limited in what they can do with data by the nature of their contracts, and that it would be unlikely to contribute to the effective functioning of the ICO.

Likewise, we have also disapplied the duty to document such transfers and to provide the documentation to the commissioner on request. This is because extending these provisions would duplicate requirements that already exist elsewhere in legislation, including in Section 61, which has extensive recording requirements that enable full accountability to the ICO.

We have also disapplied the majority of Section 78. While it provides a useful function in the context of UK law enforcement authorities transferring to their international operational equivalents, in the law enforcement to international processor context it is not appropriate, because processors cannot decide to transfer data onwards of their own volition. They can do so only under instruction from the UK law enforcement authority controller.

Instead, we have retained the general prohibition on any further transfers to processors based in a separate third country by requiring UK law enforcement authority controllers to make it a condition of a transfer to their processor that data is only to be further transferred in line with the terms of the contract with, or authorisation given by, the controller, and where the further transfer is permitted under Section 73. We have also taken the opportunity to tidy up Section 77, which governs transfers to non-relevant authorities, relevant international organisations or international processors.

In respect of Amendment 121, tabled by the noble Lord, Lord Clement-Jones, on consultation with the Information Commissioner, I reassure the noble Lord that there is a memorandum of understanding between the Home Office and the Information Commissioner regarding international transfers approved by regulations, which sets out the role and responsibilities of the ICO. As part of this, the Home Office consults the Information Commissioner at various stages in the process. The commissioner, in turn, provides independent assurance and advice on the process followed and on the factors taken into consideration.

I understand that this amendment also relates to representations made by the National AIDS Trust. Perhaps the simplest thing is merely to reference my earlier remarks and my commitment to engage with the National AIDS Trust on an ongoing basis. I beg to move that the government amendments which lead this group stand part of the Bill.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, very briefly, I thank the Minister for unpacking his amendments with some care, and for giving me the answer to my amendment before I spoke to it—that saves time.

Obviously, we all understand the importance of transfers of personal data between law enforcement authorities, but perhaps the crux of this, and the one question in our minds, is: what is the process—perhaps the Minister could remind us—for making sure that the country we are sending data to is data adequate? Amendment 121 was tabled as a way of probing that. It would be extremely useful if the Minister could answer that. This should apply to transfers between law enforcement authorities just as much as it does to other, more general transfers under Schedule 5. If the Minister can give me the answer, that would be useful, but if he does not have the answer to hand, I am very happy to suspend my curiosity until after Easter.

Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

My Lords, I too can be brief, having heard the Minister’s response. I thought he half-shot the Clement-Jones fox, with very good aim on the Minister’s part.

I was simply going to say that it is one in a sea of amendments from the Government, but the noble Lord, Lord Clement-Jones, made an important point about making sure that the countries and organisations that the commissioner looks at should meet the test of data adequacy—I also had that in my speaking note. The noble Lord was making a good point about ensuring that appropriate data protections are in place internationally for us to be able to work with them.

The Minister explained the government amendments with some care, but I wonder if he could explain how data transfers are made to an overseas processor using the powers relied on by reference to new Section 73(4)(aa) of the 2018 Act. The power is used as a condition and justification for several of the noble Viscount’s amendments, and I wonder whether he has had to table these amendments because of the original drafting. That would seem to me to be the most likely reason.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones, for his amendment and his response, and I thank the noble Lord, Lord Bassam. The mechanism for monitoring international transfers was intended to be the subject of the next group in any case, and I would have hoped to give a full answer there. I know we are all deeply disappointed that it looks as if we may not get to that group, but, if the noble Lord is not willing to wait until we have that debate, I am very happy to write.

16:45
I am afraid that, in response to the question from the noble Lord, Lord Bassam, I am going to run up a white flag and offer to write to him. I do not know whether it was an oversight in the original drafting.

That said, at this last moment of speaking, maybe I can wish everybody a very happy Easter. I very sincerely thank all participants in this Bill, which I think we can all agree is slightly gruelling. The Committee’s contributions are hugely appreciated by me personally.

Amendment 110 agreed.
Clause 25, as amended, agreed.
Baroness Garden of Frognal Portrait The Deputy Chairman of Committees (Baroness Garden of Frognal) (LD)
- Hansard - - - Excerpts

My Lords, this may be a convenient moment for the Committee to adjourn. Happy Easter, everyone.

Committee adjourned at 4.46 pm.