Committee (2nd Day)
Scottish, Welsh and Northern Ireland Legislative Consent sought.
15:45
Clause 5: Lawfulness of processing
Amendment 11
Moved by
11: Clause 5, page 6, line 15, at end insert—
“(za) After point (a) insert—“(aa) the data subject has given consent for his or her personal data to enter the public domain via a public body;(ab) processing is carried out by a public body pursuant to a legal or statutory obligation or right, and the public body is entitled to make such data available to the public;””Member’s explanatory statement
This amendment would add to the list in GDPR Article 6(1) on the lawfulness of processing.
Lord Clement-Jones (LD)

My Lords, I rise to speak to my Amendment 11 and to Amendments 14, 16, 17, 18, Clause 5 stand part and Clause 7 stand part. I will attempt to be as brief as I can, but Clause 5 involves rather a large number of issues.

Processing personal data is currently lawful only if it is performed for at least one lawful purpose, one of which is that the processing is for legitimate interests pursued by the controller or a third party, except where those interests are overridden by the interests or fundamental rights of the data subject. As such, if a data controller relies on their legitimate interest as a legal basis for processing data, they must conduct a balancing test of their interests against those of the data subject.

Clause 5 amends the UK GDPR’s legitimate interest provisions by introducing the concept of recognised legitimate interest, which allows data to be processed without a legitimate interest balancing test. This provides businesses and other organisations with a broader scope of justification for data processing. Clause 5 would amend Article 6 of the UK GDPR to equip the Secretary of State with a power to determine these new recognised legitimate interests. Under the proposed amendment, the Secretary of State must have regard to,

“among other things … the interests and fundamental rights and freedoms of data subjects”.

The usual legitimate interest test is much stronger: the data subject's interests are not merely something to have regard to; a legitimate interest basis cannot lawfully apply if the data subject's interests override those of the data controller.

Annexe 1, as inserted by the Bill, now provides a list of exemptions but is overly broad and vague. It includes national security, public security and defence, and emergencies and crime as legitimate interests for data processing without an assessment. The Conservative MP Marcus Fysh said on Third Reading:

“Before companies share data or use data, they should have to think about what the balance is between a legitimate interest and the data rights, privacy rights and all the other rights that people may have in relation to their data. We do not want to give them a loophole or a way out of having to think about that.” —[Official Report, Commons, 29/11/23; col. 896.]


I entirely agree with that.

The amendment in Clause 5 also provides examples of processing that may be considered legitimate interests under the existing legitimate interest purpose, under Article 6(1)(f), rather than under the new recognised legitimate interest purpose. These include direct marketing, intra-group transmission of personal data for internal administrative purposes, and processing necessary to ensure the security of a network.

The Bill also creates a much more litigious data environment. Currently, an organisation's assessment of its lawful purposes for processing data can be challenged through correspondence or an ICO complaint, whereas, under the proposed system, an individual may be forced to legally challenge a statutory instrument in order to contest the basis on which their data is processed.

As I will explain later, our preference is that the clause not stand part, but I accept that there are some areas that need clarification and Amendment 11 is designed to do this. The UK GDPR sets out conditions in which processing of data is lawful. The Bill inserts in Article 6(1) a provision specifying that processing shall be lawful for the purposes of a recognised legitimate interest, as I referred to earlier, an example of which may be for the purposes of direct marketing.

Many companies obtain data from the open electoral register. The register is maintained by local authorities, which have the right to sell this data to businesses. Amendment 11 would insert new Article 6(1)(aa) and (ab), which provide that data processing shall be lawful where individuals have consented for their data

“to enter the public domain via a public body”,

or where processing is carried out by public bodies pursuant to their duties and rights, which may include making such data available to the public. Individuals are free to opt out of the open electoral register if they so wish and it would be disproportionate—in fact, irritating—to consumers to notify those who have consented to their data being processed that their data is being processed.

On Amendment 14, as mentioned, the Bill would give the Secretary of State the power to determine recognised legitimate interests through secondary legislation, which is subject to minimal levels of parliamentary scrutiny. Although the affirmative procedure is required, this does not entail much scrutiny or much of a debate. The last time MPs did not approve a statutory instrument under the affirmative procedure was in 1978. In practice, interests could be added to this list at any time and for any reason, facilitating the flow and use of personal data for limitless potential purposes. Businesses could be obligated to share the public’s personal data with government or law enforcement agencies beyond what they are currently required to do, all based on the Secretary of State’s inclination at the time.

We are concerned that this Henry VIII power is unjustified and undermines the very purpose of data protection legislation, which is to protect the privacy of individuals in a democratic data environment, as it vests undue power over personal data rights in the Executive. This amendment is designed to prevent the Secretary of State from having the ability to pre-authorise data processing outside the usual legally defined route. It is important to avoid a two-tier data protection framework in which the Secretary of State can decide that certain processing is effectively above the law.

On Amendment 17, some of the most common settings in which data protection law is broken involve the sharing of the HIV status of an individual living with HIV, whether in their personal life, in relation to employment, in healthcare services or by the police. The sharing of an individual's HIV status can lead to further discrimination being experienced by people living with HIV and can increase their risk of harassment or even violence. The National AIDS Trust is concerned that the Bill as drafted does not go far enough to prevent individuals' HIV status from being shared with others without their consent. They and we believe that the Bill must clarify what an "administrative purpose" is for organisations processing employees' personal data. Amendment 17 would add wording to clarify that, in paragraph 9(b) of Article 6,

“intra-group transmission of personal data”

in the workplace, within an organisation or in a group of organisations should be permitted only for individuals who need to access an employee’s personal data as part of their work.

As far as Amendment 18 is concerned, as it stands Clause 5 gives an advantage to large undertakings with numerous companies that can transmit data intra-group purely because they are affiliated to one central body. However, this contradicts the repeated position of both the ICO and the CMA that the distinction between first party and third party is not a meaningful way of capturing privacy risk. What matters is what data is processed, rather than the corporate ownership of the systems doing the processing. The amendment reflects the organisational measures that undertakings should have in place as safeguards: groups of undertakings transmitting data should have organisational measures, established by contract, in order to take advantage of this transmission of data.

Then we come to the question of Clause 5 standing part of the Bill. This clause is unnecessary and creates risks. It is unnecessary because the legitimate interest balancing test is, in fact, flexible and practical; it already allows processing for emergencies, safeguarding and so on. It is risky because creating lists of specified legitimate interests inevitably narrows this concept and may make controllers less certain about whether a legitimate interest that is not a recognised legitimate interest can be characterised as such. In the age of AI, where change is exponential, we need principles-based and outcomes-based legislation that is flexible and can be supplemented with guidance from an independent regulator, rather than setting up a system that requires the Government to legislate more and faster in order to catch up.

There is also a risk that the drafting of this provision does not dispense with the need to conduct a legitimate interest balancing test, because all the recognised legitimate interests contain a test of necessity. Established case law interprets the concept of necessity under data protection law as requiring a human rights balancing test to be carried out. This rather points to the smoke-and-mirrors effect of this drafting, which does nothing to improve legal certainty for organisations or protections for individuals.

I now come to Clause 7 standing part. This clause creates a presumption that processing will always be in the public interest or substantial public interest if done in reliance on a condition listed in proposed new Schedule A1 to the Data Protection Act 2018. The schedule will list international treaties that have been ratified by the UK. At present, the Bill lists only the UK-US data-sharing agreement as constituting relevant international law. Clause 7 seeks to remove the requirement for a controller to consider whether the legal basis on which they rely is in the public interest or substantial public interest, has appropriate safeguards and respects data subjects’ fundamental rights and freedoms. But the conditions in proposed new Schedule A1 in respect of the UK-US agreement also state that the processing must be necessary, as assessed by the controller, to respond to a request made under the agreement.

It is likely that a court would interpret “necessity” in the light of the ECHR. The court may therefore consider that the inclusion of a necessity test means that a controller would have to consider whether the UK-US agreement, or any other treaty added to the schedule, is proportionate to a legitimate aim pursued. Not only is it unreasonable to expect a controller to do such an assessment; it is also highly unusual. International treaties are drafted on a state-to-state basis and not in a way that necessarily corresponds clearly with domestic law. Further, domestic courts would normally consider the rights under the domestic law implementing a treaty, rather than having to interpret an international instrument without reference to a domestic implementing scheme. Being required to do so may make it more difficult for courts to enforce data subjects’ rights.

The Government have not really explained why it is necessary to amend the law in this way rather than simply implementing the UK-US agreement domestically. That would be the normal approach; it would remove the need to add this new legal basis and enable controllers to use the existing framework to identify a legal basis to process data in domestic law. Instead, this amendment makes it more difficult to understand how the law operates, which could in turn deter data sharing in important situations. Perhaps the Minister could explain why Clause 7 is there.

I beg to move.

Baroness Kidron (CB)

My Lords, I rise to speak to Amendments 13 and 15. Before I do, let me say that I strongly support the comments of the noble Lord, Lord Clement-Jones, about HIV and the related vulnerability, and his assertion—almost—that Clause 5 is a solution in search of a problem. “Legitimate interest” is a flexible concept and I am somewhat bewildered as to why the Government are seeking to create change where none is needed. In this context, it follows that, were the noble Lord successful in his argument that Clause 5 should not stand part, Amendments 13 and 15 would be unnecessary.

On the first day in Committee, we debated a smaller group of amendments that sought to establish the principle that nothing in the Bill should lessen the privacy protections of children. In his response, the Minister said:

“if over the course of our deliberations the Committee identifies areas of the Bill where that is not the case, we will absolutely be open to listening on that, but let me state this clearly: the intent is to at least maintain, if not enhance, the safety and privacy of children and their data”.—[Official Report, 20/3/24; col. GC 75.]

I am glad the Minister is open to listening and that the Government's intention is to protect children, but, as discussed previously, widening the definition of "research" in Clause 3 and watering down purpose limitation protections in Clause 6 negatively impact children's data rights. Again, in Clause 5, lowering the protections for all data subjects has consequences for children.

16:00
In Clause 5, proposed new paragraph (6) of Article 6 gives the Secretary of State power to amend the circumstances under which data processing is deemed legitimate in the public interest. Amendment 13 simply requires the Secretary of State to ensure that the Bill does not “reduce, minimise or undermine” existing standards and protections for children’s data when exercising these powers. Similarly, proposed new paragraph (9) gives examples of the types of processing that may be necessary for the purpose of a generalised—as opposed to a public interest—legitimate interest, including, in new paragraph (9)(a),
“processing that is necessary for the purposes of direct marketing”.
Amendment 15 limits direct marketing in paragraph (9)(a) to adults.
I struggle to understand why the Government believe it is appropriate to enable companies to market directly to anyone without their express consent. The requirement to opt in to marketing has served consumers well and, arguably, online users need more protection, rather than less, from intrusive marketing practices. But it seems a retrograde step that, if an individual, irrespective of age, expressly states they do not wish to receive direct marketing, a company could rely on paragraph (9)(a) to override those wishes. For children, not only is this intrusive and aggressive but it conflicts with their rights and protections, as set out in Article 6(1)(f) of the UK GDPR and codified in the age-appropriate design code.
Once again, I am finding it hard to marry the Government’s assurance—given privately, from the Dispatch Box in the other place, and by the noble Viscount the Minister—that the Government remain fully committed to the high standards of protection they set out for children with the proposal routinely to expose them to direct marketing. The changes in UK data law proposed by Clause 5, and numerous others scattered throughout the Bill, expose the reality that the Bill is intended to reduce privacy for UK citizens and, as a knock-on, the privacy and safety protection of children. The Government have a choice: to let the House decide whether children deserve a lesser standard of protection, or to amend the Bill to maintain the current standards.
Baroness Harding of Winscombe (Con)

My Lords, I support the noble Baroness, Lady Kidron, in Amendments 13 and 15, to which I have added my name. Rather than repeat her arguments—as we are now all trying not to do—I want to build on them and point to the debate we had on the first group in Committee, when my noble friend the Minister insisted that the Government had no desire to water down the protections for children in the Bill. In Clause 5, in proposed new paragraph (7) of Article 6, the Government have felt it necessary to be explicit, in that paragraph only, that children might need extra protection. This, on its own, makes me worried that the whole Bill is reducing the protection children have, because the Government felt it necessary to insert new paragraph (7)(b). Interestingly, it refers to,

“where relevant, the need to provide children”

with additional support. But where is that not relevant?

Amendment 13 simply looks to strengthen this—to accept the premise on which the Bill is currently drafted that we need to be explicit where children deserve the right to a higher level of protection, and to get the wording right. Will my noble friend the Minister reconsider? There are two choices here: to state right at the beginning of the Bill that there is a principle that there will be no reduction in children’s right to a higher level of protection, or to do as the Bill currently does and make sure that we get the wording right at every stage as we work through.

Baroness Jones of Whitchurch (Lab)

My Lords, I thank noble Lords who have spoken to this group. As ever, I am grateful to the Delegated Powers and Regulatory Reform Committee for the care it has taken in scrutinising the Bill. In its 10th report it made a number of recommendations addressing the Henry VIII powers in the Bill, which are reflected in a number of amendments that we have tabled.

In this group, we have Amendment 12 to Clause 5, which addresses the committee’s concerns about the new powers for the Secretary of State to amend new Annexe 1 of Article 6. This sets out the grounds for treating data processing as a recognised legitimate interest. This issue was raised by the noble Lord, Lord Clement-Jones, in his introduction. The Government argue that they are starting with a limited number of grounds and that the list might need to be changed swiftly, hence the need for the Secretary of State’s power to make changes by affirmative regulations.

However, the Delegated Powers and Regulatory Reform Committee argues:

“The grounds for lawful processing of personal data go to the heart of the data protection legislation, and therefore in our view should not be capable of being changed by subordinate legislation”.


It also argues that the Government have not provided strong reasons for needing this power. It recommends that the delegated power in Clause 5(4) should be removed from the Bill, which is what our Amendment 12 seeks to do.

These concerns were echoed by the Constitution Committee, which went one stage further by arguing:

“Data protection is a matter of great importance in maintaining a relationship of trust between the state and the individual”.


It is important to maintain these fundamental individual rights. On that basis, the Constitution Committee asks us to consider whether the breadth of the Secretary of State’s powers in Clauses 5 and 6 is such that those powers should be subject to primary rather than secondary legislation.

I make this point about the seriousness of these issues as they underline the points made by other noble Lords in their amendments in this group. In particular, the noble Lord, Lord Clement-Jones, asked whether any regulations made by the Secretary of State should be the subject of the super-affirmative procedure. We will be interested to hear the Minister’s response, given the concerns raised by the Constitution Committee.

Will the Minister also explain why it was necessary to remove the balancing test, which would require organisations to show why their interest in processing data outweighs the rights of data subjects? Again, this point was made by the noble Lord, Lord Clement-Jones. It would also be helpful if the Minister could clarify whether the new powers for the Secretary of State to amend the recognised legitimate interest could have consequences for data adequacy and whether this has been checked and tested with the EU.

Finally, we also welcome a number of other amendments tabled by the noble Lord, Lord Clement-Jones, in particular those to ensure that direct marketing should be considered a legitimate interest only if there is proper consent. This was one of the themes of the noble Baroness, Lady Kidron, who made, as ever, a very powerful case for ensuring that children specifically should not be subject to direct marketing as a matter of routine and that there should be clear consent.

The noble Baronesses, Lady Kidron and Lady Harding, have once again, quite rightly, brought us back to the Bill needing to state explicitly that children’s rights are not being watered down by it, otherwise we will come back to this again and again in all the clauses. The noble Baroness, Lady Kidron, said that this will be decided on the Floor of the House, or the Minister could give in now and come back with some government amendments. I heartily recommend to the Minister that he considers doing that because it might save us some time. I look forward to the Minister’s response on that and on the Delegated Powers and Regulatory Reform Committee’s recommendations about removing the Secretary of State’s right to amend the legitimate interest test.

The Parliamentary Under-Secretary of State, Department for Science, Innovation and Technology (Viscount Camrose) (Con)

My Lords, I rise to speak to Amendments 11, 12, 13, 14, 15, 16, 17 and 18 and to whether Clauses 5 and 7 should stand part of the Bill. In doing so, I thank the noble Lord, Lord Clement-Jones, and the noble Baronesses, Lady Jones and Lady Kidron, for their amendments. The amendments in the group, as we have heard, relate to Clauses 5 and 7, which make some important changes to Article 6 of the UK GDPR on the lawfulness of processing.

The first amendment in the group, Amendment 11, would create a new lawful ground, under Article 6(1) of UK GDPR, to enable the use of personal data published by public bodies with a person’s consent and to enable processing by public bodies for the benefit of the wider public. The Government do not believe it would be necessary to create additional lawful grounds for processing in these circumstances. The collection and publication of information on public databases, such as the list of company directors published by Companies House, should already be permitted by existing lawful grounds under either Article 6(1)(c), in the case of a legal requirement to publish information, or Article 6(1)(e) in the case of a power.

Personal data published by public bodies can already be processed by other non-public body controllers where their legitimate interests outweigh the rights and interests of data subjects. However, they must comply with the requirements of the data protection legislation in relation to that personal data, including requirements to process personal data fairly and transparently. I am grateful to the noble Lord, Lord Clement-Jones, for setting out where he thinks the gaps are, but I hope he will accept my reassurance that this should already be possible under the existing legislation and will agree to withdraw the amendment.

On Clause 5, the main objective is to introduce a new lawful ground under Article 6(1) of the UK GDPR, known as "recognised legitimate interests". It also introduces a new annexe to the UK GDPR, in Schedule 1 to the Bill, that sets out an exhaustive list of processing activities that may be undertaken by data controllers under this new lawful ground. If an activity appears on the list, processing may take place without a person's consent and without balancing the controller's interests against the rights and interests of the individual: the so-called legitimate interests balancing test.

The activities in the annexe are all of a public interest nature: for example, processing data where necessary to prevent crime, safeguard national security, protect children, respond to emergencies or promote democratic engagement. They also include situations where a public body requests a non-public body to share personal data with it to help deliver a public task sanctioned by law.

The clause was introduced as a result of stakeholders’ concerns raised in response to the public consultation Data: A New Direction in 2021. Some informed us that they were worried about the legal consequences of getting the balancing test in Article 6(1)(f) wrong. Others said that undertaking the balancing test can lead to delays in some important processing activities taking place.

As noble Lords will be aware, many data controllers have important roles in supporting activities that have a public interest nature. It is vital that data is shared without delay where necessary in areas such as safeguarding, prevention of crime and responding to emergencies. Of course, controllers who share data while relying on this new lawful ground would still have to comply with wider requirements of data protection legislation where relevant, such as data protection principles which ensure that the data is used fairly, lawfully and transparently, and is collected and used for specific purposes.

In addition to creating a new lawful ground of recognised legitimate interests, Clause 5 also clarifies the types of processing activities that may be permitted under the existing legitimate interests lawful ground under Article 6(1)(f) of the UK GDPR. Even if a processing activity does not appear on the new list of recognised legitimate interests, data controllers may still have grounds for processing people's data without consent if their interests in processing the data are not outweighed by the rights and freedoms that people have in relation to privacy. Clause 5(9) and (10) make it clear that this might be the case in relation to many common commercial activities, such as intra-group transfers.

16:15
The first stand part notice in this group would remove Clause 5 from the Bill in its entirety. As I have explained, these provisions are important because they will encourage responsible and necessary data sharing under a new lawful ground of recognised legitimate interest and clarify the types of processing activities that may take place under the existing lawful ground of legitimate interest under Article 6(1)(f). Therefore, I hope the noble Lord will not press his opposition to this clause.
Amendments 12 to 14 concern the Secretary of State’s regulation-making power to add new processing activities to the list. Amendment 12 would remove this delegated power, with the intention, as I understand it, to implement a recommendation of the Delegated Powers and Regulatory Reform Committee. Amendment 13 would make sure that the Secretary of State has greater regard to the rights of children before making use of the regulations, and Amendment 14 would increase parliamentary scrutiny over any additions to the list by making the regulation-making power subject to the super-affirmative procedure.
The Bill already provides for additions to Schedule 1 to be subject to the affirmative resolution procedure, and we believe that this provides the right level of scrutiny, given the other safeguards the Secretary of State must consider before bringing regulations to Parliament. These include requirements for the Secretary of State to consider the impact of any changes to the rights and freedoms of individuals, to have regard to the specific need to provide for the special protection of children, and to consult the Information Commissioner and any other persons the Secretary of State considers appropriate on future changes to the list.
Introducing a higher degree of parliamentary scrutiny than that included in the Bill, or removing the power to add to the list of activities, could be detrimental in instances where there is a need for Ministers to add other urgent public interest activities to the list of recognised legitimate interests and could lead to unnecessary delays in the sharing of vital information.
On the point made about EU data adequacy, across all reforms in the Bill, the Government maintain an ongoing dialogue with the EU and have a positive, constructive relationship with it. We continue to engage regularly with the EU to ensure that our reforms are understood, and we believe that they are compatible with maintaining our data adequacy decisions. For all these reasons, I hope that noble Lords will agree not to press their amendments.
Baroness Jones of Whitchurch (Lab)

My Lords, may I just revisit that with the Minister? I fear that he is going to move on to another subject. The Delegated Powers Committee said that it thought that the Government had not provided strong enough reasons for needing this power. The public interest list being proposed, which the Minister outlined, is quite broad, so it is hard to imagine the Government wanting something not already listed. I therefore return to what the committee said. Normally, noble Lords like to listen to recommendations from such committees. There is no strong reason for needing that extra power, so, to push back a little on the Minister, why, specifically, is it felt necessary? If it were a public safety interest, or one of the other examples he gave, it seems to me that that would come under the existing list of public interests.

Viscount Camrose (Con)

Indeed. Needless to say, we take the recommendations of the DPRRC very seriously, as they deserve. However, because this is an exhaustive list, and because the technologies and practices around data are likely to evolve very rapidly in ways we are unable currently to predict, it is important to retain as a safety measure the ability to update that list. That is the position the Government are coming from. We will obviously continue to consider the DPRRC’s recommendations, but that has to come with a certain amount of adaptiveness as we go. Any addition to the list would of course be subject to parliamentary debate, via the affirmative resolution procedure, as well as the safeguards listed in the provision itself.

Clause 50 ensures that the ICO and any other interested persons must be consulted before regulations are made.

Amendments 15, 16, 17 and 18 would amend the part of Clause 5 that is concerned with the types of activities that might be carried out under the current legitimate interest lawful ground, under Article 6(1)(f). Amendment 15 would prevent direct marketing organisations relying on the legitimate interest lawful ground under Article 6(1)(f) if the personal data being processed related to children. However, the age and general vulnerability of data subjects are already important factors for direct marketing organisations when considering whether the processing is justified. The ICO already provides specific guidance for controllers carrying out this balancing test in relation to children's data. The fact that a data subject is a child, and the age of the child in question, will still be relevant factors to take into account in this process. For these reasons, the Government consider this amendment unnecessary.

Baroness Kidron (CB)

My Lords, am I to take it from that that none of the changes currently in the Bill will expose children on a routine basis to direct marketing?

Viscount Camrose (Con)

As is the case today and will be going forward, direct marketing organisations will be required to perform the balancing test; and as in the ICO guidance today and, no doubt, going forward—

Baroness Kidron (CB)

I am sorry if I am a little confused—I may well be—but the balancing test that is no longer going to be there allows a certain level of processing, which was the subject of the first amendment. The suggestion now is that children will be protected by a balancing test. I would love to know where that balancing test exists.

Viscount Camrose (Con)

The balancing test remains there for legitimate interests, under Article 6(1)(f).

Amendment 16 seeks to prevent organisations that undertake third-party marketing relying on the legitimate interest lawful ground under Article 6(1)(f) of the UK GDPR. As I have set out, organisations can rely on that ground for processing personal data without consent when they are satisfied that they have a legitimate interest to do so and that their commercial interests are not outweighed by the rights and interests of data subjects.

Clause 5(4) inserts in Article 6 new paragraph (9), which provides some illustrative examples of activities that may constitute legitimate interests, including direct marketing activities, but it does not mean that they will necessarily be able to process personal data for that purpose. Organisations will need to assess on a case-by-case basis where the balance of interest lies. If the impact on the individual’s privacy is too great, they will not be able to rely on the legitimate interest lawful ground. I should emphasise that this is not a new concept created by this Bill. Indeed, the provisions inserted by Clause 5(4) are drawn directly from the recitals to the UK GDPR, as incorporated from the EU GDPR.

I recognise that direct marketing can be a sensitive—indeed, disagreeable—issue for some, but direct marketing information can be very important for businesses as well as individuals and can be dealt with in a way that respects people’s privacy. The provisions in this Bill do not change the fact that direct marketing activities must be compliant with the data protection and privacy legislation and continue to respect the data subject’s absolute right to opt out of receiving direct marketing communications.

Amendment 17 would make sure that the processing of employee data for “internal administrative purposes” is subject to heightened safeguards, particularly when it relates to health. I understand that this amendment relates to representations made by the National AIDS Trust concerning the level of protection afforded to employees’ health data. We agree that the protection of people’s HIV status is vital and that it is right that it is subject to extra protection, as is the case for all health data and special category data. We have committed to further engagement and to working with the National AIDS Trust to explore solutions in order to prevent data breaches of people’s HIV status, which we feel is best achieved through non-legislative means given the continued high data protection standards afforded by our existing legislation. As such, I hope that the noble Lord, Lord Clement-Jones, will agree not to press this amendment.

Amendment 18 seeks to allow businesses more confidently to rely on the existing legitimate interest lawful ground for the transmission of personal data within a group of businesses affiliated by contract for internal administrative purposes. In Clause 5, the list of activities in proposed new paragraphs (9) and (10) is intended to be illustrative of the types of activities that may be legitimate interests for the purposes of Article 6(1)(f). They are focused on processing activities that are currently listed in the recitals to the EU GDPR but are simply examples. Many other processing activities may be legitimate interests for the purposes of Article 6(1)(f) of the UK GDPR. It is possible that the transmission of personal data for internal administrative purposes within a group affiliated by contract may constitute a legitimate interest, as may many other commercial activities. It would be for the controller to determine this on a case-by-case basis after carrying out a balancing test to assess the impact on the individual.

Finally, I turn to the clause stand part debate that seeks to remove Clause 7 from the Bill. I am grateful to the noble Lord, Lord Clement-Jones, for this amendment because it allows me to explain why this clause is important to the success of the UK-US data access agreement. As noble Lords will know, that agreement helps the law enforcement agencies in both countries tackle crime. Under the UK GDPR, data controllers can process personal data without consent on public interest grounds if the basis for the processing is set out in domestic law. Clause 7 makes it clear that the processing of personal data can also be carried out on public interest grounds if the basis for the processing is set out in a relevant international treaty such as the UK-US data access agreement.

The agreement permits telecommunications operators in the UK to share data about serious crimes with law enforcement agencies in the US, and vice versa. The DAA has been operational since October 2022 and disclosures made by UK organisations under it are already lawful under the UK GDPR. Recent ICO guidance confirms this, but the Government want to remove any doubt in the minds of UK data controllers that disclosures under the DAA are permitted by the UK GDPR. Clause 7 makes it absolutely clear to telecoms operators in the UK that disclosures under the DAA can be made in reliance on the UK GDPR's public tasks processing grounds; the clause therefore contributes to the continued, effective functioning of the agreement and to keeping the public in both the UK and the US safe.

For these reasons, I hope that the noble Lord, Lord Clement-Jones, will agree to withdraw his amendment.

Lord Clement-Jones (LD)

My first reaction is “Phew”, my Lords. We are all having to keep to time limits now. The Minister did an admirable job within his limit.

I wholeheartedly support what the noble Baronesses, Lady Kidron and Lady Harding, said about Amendments 13 and 15 and what the noble Baroness, Lady Jones, said about her Amendment 12. I do not believe that we have yet got to the bottom of children’s data protection; there is still quite some way to go. It would be really helpful if the Minister could bring together the elements of children’s data about which he is trying to reassure us and write to us saying exactly what needs to be done, particularly in terms of direct marketing directed towards children. That is a real concern.

16:30
The Minister said, “Yes, the balancing test absolutely will have to be carried out. It won’t be a recognised legitimate interest; it’ll have to be a balancing test, as ever”. There is a mantra taking place here: “We have no desire to water down child protection”. I take that at face value, but something that brings all of this together for us would be extremely helpful.
The noble Baroness, Lady Jones, has done a fantastic job in pulling together all the DPRRC’s recommendations and saying, “Right, we need to understand throughout why the Secretary of State has these powers”. As she says, the powers in Clause 5 go to the heart of the Bill. In my view, the legitimate interest balancing test should not have been disturbed, but, if it is to be disturbed and we are to have this new category of recognised legitimate interests, we will need to be extremely careful. That is why I put down a super-affirmative, but I much prefer the noble Baroness’s amendment.
Annexe 1 of the Bill, which can be changed by the Secretary of State, now provides a list of exemptions that includes national security, public security and defence, emergencies and crime. They are as broad as that—barely with qualification—in an already extremely broad category. So, when the Minister says that this is needed for changes in technology, it sounds extremely expedient on his part that this is being put in place. He prayed in aid the qualification to the new Article 6(1), but this is taking away fundamental rights. This is probably the most important Secretary of State power in the whole Bill—it is even more important than Clause 14, which we will come on to.
On that basis, too, the noble Baroness was absolutely right to raise the issue of data adequacy. Certainly, the vibes I am getting are that individual Members of the European Parliament will be kicking the tyres on this Bill pretty hard if it ever goes through in its current form. The Minister says, “Oh, well, the National AIDS Trust can rest assured that everything’s fine with these administrative transfers because we have such high standards currently”, but what we are trying to do is make sure that we retain those high standards and, if anything, increase them.
There is not a great deal of plausibility of trust here. If we do not trust what is happening out there, how on earth are the public going to? This whole thing seems to be built on an edifice whereby the Government want things to be done without due care and attention. There is a feeling right through this Bill—particularly in Clause 5—that rights are being watered down. Of course, the legal advice I have on Clause 5 is that, at the end of the day, the necessity test may be necessary throughout, and so it may not be effective.
Then we get to Clause 7 standing part. That clause is also pretty baffling, but I will need to read the Minister’s response. Again, it is more wet towels. In the meantime, I beg leave to withdraw Amendment 11.
Amendment 11 withdrawn.
Amendments 12 to 18 not moved.
Clause 5 agreed.
Schedule 1: Lawfulness of processing: recognised legitimate interests
Amendment 19
Moved by
19: Schedule 1, page 192, line 21, leave out from beginning to end of line 6 on page 197
Member’s explanatory statement
This amendment is consequential on an amendment in the name of Baroness Jones of Whitchurch to leave out Clause 114. These Schedule 1 provisions would become redundant if Clause 114 is removed from the Bill.
Baroness Jones of Whitchurch (Lab)

My Lords, Amendment 19 is consequential on my more substantive Clauses 114 and 115 stand part notices, which are also in this group. I am grateful to the noble Lord, Lord Clement-Jones, for his support.

These amendments all relate to the 150 or so pages of late amendments tabled in the Commons on Report and therefore not given adequate scrutiny before now. No real explanation has been given for why the Government felt it necessary to table the amendments in this way, and this group of amendments comes under the heading of so-called “democratic engagement”. Clause 113 extends a soft opt-in for direct mail marketing for furthering charitable or political objectives, while Clause 114 goes further and allows the Secretary of State to change the direct marketing rules through secondary legislation for the purpose of democratic engagement. This would allow the Government, in the run-up to an election, to switch off the direct mailing rules that apply to political parties.

Like many others, we are highly suspicious of the Government’s motives in introducing these amendments in the run-up to this election. Although we do not have a problem with a softer opt-in for direct mailing for charities, the application of Clause 114 to political parties gives politicians carte blanche to mine voters’ data given in good faith for completely different purposes. It would allow voters to be bombarded with calls, texts and personalised social media without their explicit consent.

When you consider these proposals in the context of other recent moves by the Government to make it harder for some people to vote and to vastly increase the amount of money that can be spent on campaigning in the run-up to an election, you have to wonder what the Government are up to, because these measures have certainly not been requested by Labour. In fact, these measures were not supported by the majority of respondents to the Government’s initial consultation, who wanted the existing rules upheld.

The Advertising Association has told us that it is concerned that switching off the rules could result in an increase in poor practice, such as political lobbying under the guise of research. This is apparently a practice known as “plugging”. It referred us to a report from the previous Information Commissioner on how political parties manage data protection, which provided key recommendations for how political parties could improve. These included providing clearer information about how data will be used and being more transparent about how voters are profiled and targeted via social media platforms. This is the direction our democratic engagement should be going in, with stronger and more honest rules that treat the electorate with respect, not watering down the rules that already exist.

When these proposals were challenged in the Commons on Report, the Minister, John Whittingdale, said:

“We have no immediate plans to use the regulation powers”.—[Official Report, Commons, 29/11/23; col. 912.]


If that is the case, why do the Government not take the proposals off the table, go back to the drawing board by conducting a proper consultation and test whether there is any appetite for these changes? They should also involve the Information Commissioner at an early stage, as he has already gone on record to say that this is

“an area in which there are significant potential risks to people if any future policy is not implemented very carefully”.

Finally, if there are to be any changes, they should be subject to full parliamentary scrutiny and approval.

We believe that Clauses 114 and 115 are taking us in fundamentally the wrong direction, against the interests of the electorate. I look forward to the Minister’s response, but I give notice now that, unless the Government adopt a very different strategy on this issue, we will return to this on Report. I beg to move.

Baroness Bennett of Manor Castle (GP)

My Lords, I follow the noble Baroness, Lady Jones of Whitchurch, with pleasure, as I agree with everything that she just said. I apologise for having failed to notice this in time to attach my name; I certainly would have done, if I had had the chance.

As the noble Baroness said, we are in an area of great concern for the level of democracy that we already have in our country. Downgrading it further is the last thing that we should be looking at doing. Last week, I was in the Chamber looking at the statutory instrument that saw a massive increase in the spending limits for the London mayoral and assembly elections and other mayoral elections—six weeks before they are held. This is a chance to spend an enormous amount of money; in reality, it is the chance for one party that has the money from donations from interesting and dubious sources, such as the £10 million, to bombard voters in deeply dubious and concerning ways.

We see a great deal of concern about issues such as deepfakes, what might happen in the next general election, malicious actors and foreign actors potentially interfering in our elections. We have to make sure, however, that the main actors conduct elections fairly on the ground. As the noble Baroness, Lady Jones, just set out, this potentially drives a coach and horses through that. As she said, these clauses did not get proper scrutiny in the Commons—as much as that ever happens. As I understand it, there is the potential for us to remove them entirely later, but I should like to ask the Minister some direct questions, to understand what the Government's intentions are and how they understand the meaning of the clauses.

Perhaps no one would have any problems with these clauses if they were for campaigns to encourage people to register to vote, given that we do not have automatic voter registration, as so many other countries do. Would that be covered by these clauses? If someone were conducting a “get out the vote” campaign in a non-partisan way, simply saying, “Please go out and vote. The election is on this day. You will need to bring along your voter ID”, would it be covered by these clauses? What about an NGO campaigning to stop a proposed new nuclear power station, or a group campaigning for stronger regulations on pesticides or for the Government to take stronger action against ultra-processed food? How do those kinds of politics fit with Clauses 114 and 115? As they are currently written, I am not sure that it is clear what is covered.

There is cause for deep concern, because no justification has been made for these two clauses. I look forward to hearing the Minister’s responses.

Baroness Harding of Winscombe (Con)

My Lords, this weekend, as I was preparing for the amendments to which I have put my name, I made the huge mistake of looking at the other amendments being discussed. As a result, I had a look at this group. I probably should declare an interest as the wife of a Conservative MP; therefore, our household is directly affected by this amendment and these clause stand part notices. I wholeheartedly agree with everything said by the noble Baronesses, Lady Jones and Lady Bennett of Manor Castle.

I have two additional points to make, because I am horrified by these clauses. First, did I miss something, in that we are now defining an adult as being 14-plus? At what point did that happen? I thought that you had the right to vote at 18, so I do not understand why electoral direct marketing should be free to bombard our 14 year-olds. That was my first additional point.

Secondly, I come back to what I said on the first day of Committee: this is all about trust. I really worry that Clauses 114 and 115 risk undermining two important areas where trust really matters. The first is our electoral system and the second is the data that we give our elected representatives, when we go to them not as party representatives but as our representatives elected to help us.

16:45
I have seen this as the wife of a politician. Many people go to their MP who are not supporters of that MP’s party or even of them as an individual, but they need their help. In doing so, they give their data, and I do not want to create any more barriers that reduce the trust that some of the most vulnerable in society have in our elected representatives. We live at a time when social media does enough of that for us, and we do not need to make it even easier for our electoral campaigning to diminish the trust the electorate has in the political system.
This is a fundamental group of amendments. It takes quite a lot for me to stand up on something so party political—I think my husband will be completely horrified that I did this homework over the weekend—but I ask the Minister to reconsider and to listen hard to the considered views, probably more considered than mine, on the Opposition Benches calling for more consultation before something such as this is introduced.
Lord Clement-Jones (LD)

My Lords, it is a pleasure to follow the noble Baronesses, Lady Harding and Lady Bennett, after the excellent introduction to the amendments in this group by the noble Baroness, Lady Jones. The noble Baroness, Lady Harding, used the word "trust", and this is another example of a potential hidden agenda in the Bill. Again, it is destructive of any public trust in the way their data is curated. This is a particularly egregious example, without, fundamentally, any explanation. Sir John Whittingdale said that a future Government

“may want to encourage democratic engagement in the run up to an election by temporarily ‘switching off’ some of the direct marketing rules”.—[Official Report, Commons, 29/11/2023; col. 885.]

Nothing to see here—all very innocuous; but, as we know, in the past the ICO has been concerned about even the current rules on the use of data by political parties. It seems to me that, without being too Pollyannaish about this, we should be setting an example in the way we use the public’s data for campaigning. The ICO, understandably, is quoted as saying during the public consultation on the Bill that this is

“an area in which there are significant potential risks to people if any future policy is not implemented very carefully”.

That seems an understatement, but that is how regulators talk. It is entirely right to be concerned about these provisions.

Of course, they are hugely problematic, but they are particularly problematic given that it is envisaged that young people aged 14 and older should be able to be targeted by political parties when they cannot even vote, as we have heard. This would appear to contravene one of the basic principles of data protection law: that you should not process more personal data than you need for your purposes. If an individual cannot vote, it is hard to see how targeting them with material relating to an election is a proportionate interference with their privacy rights, particularly when they are a child. The question is, should we be soliciting support from 14 to 17 year-olds during elections when they do not have votes? Why do the rules need changing so that people can be targeted online without having consented? One of the consequences of these changes would be to allow a Government to switch off—the words used by Sir John Whittingdale—direct marketing rules in the run-up to an election, allowing candidates and parties to rely on “soft” opt-in to process data and make other changes without scrutiny.

Exactly as the noble Baroness, Lady Jones, said, respondents to the original consultation on the Bill wanted political communications to be covered by existing rules on direct marketing. Responses were very mixed on the soft opt-in, and there were worries that people might be encouraged to part with more of their personal data. More broadly, why are the Government changing the rules on democratic engagement if they say they will not use these powers? What assessment have they made of the impact of the use of the powers? Why are the powers not being overseen by the Electoral Commission? If anybody is going to have the power to introduce the ability to market directly to voters, it should be the Electoral Commission.

All this smacks of taking advantage of financial asymmetry. We talked about competition asymmetry with big tech when we debated the digital markets Bill; similarly, this seems a rather sneaky way of taking advantage of the financial resources one party might have versus others. It would allow it to do things other parties cannot, because it has granted itself permission to do that. The provisions should not be in the hands of any Secretary of State or governing party; if anything, they should be in entirely independent hands; but, even then, they are undesirable.

Viscount Camrose (Con)

My Lords, I thank the noble Baroness, Lady Jones, for tabling her amendments. Amendment 19 would remove processing which is necessary for the purposes of democratic engagement from the list of recognised legitimate interests. It is essential in a healthy democracy that registered political parties, elected representatives and permitted participants in referendums can engage freely with the electorate without being impeded unnecessarily by data protection legislation.

The provisions in the Bill will mean that these individuals and organisations do not have to carry out legitimate interest assessments or look for a separate legal basis. They will, however, still need to comply with other requirements of data protection legislation, such as the data protection principles and the requirement for processing to be necessary.

On the question posed by the noble Baroness about the term “democratic engagement”, it is intended to cover a wide range of political activities inside and outside election periods. These include but are not limited to democratic representation; communicating with electors and interested parties; surveying and opinion gathering; campaigning activities; activities to increase voter turnout; supporting the work of elected representatives, prospective candidates and official candidates; and fundraising to support any of these activities. This is reflected in the drafting, which incorporates these concepts in the definition of democratic engagement and democratic engagement activities.

The ICO already has guidance on the use of personal data by political parties for campaigning purposes, which the Government anticipate it will update to reflect the changes in the Bill. We will of course work with the ICO to make sure it is familiar with our plans for commencement and that the timing does not benefit any party over another.

On the point made about the appropriate age for the provisions, the age of 14 reflects the variations in voting age across the UK: in some parts of the country, such as Scotland, the voting age is 16 for some elections, and a person can register to vote at 14 as an attainer. An attainer is someone who is registered to vote in advance of being able to do so, so that they are on the electoral roll as soon as they turn the required age. Children aged 14 and over are often politically engaged and are approaching voting age. The Government consider it important that political parties and elected representatives can engage freely with this age group—

Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

I am interested in what the Minister says about the age of attainers. Surely it would be possible to remove attainers from those who could be subject to direct marketing. Given how young attainers could be, it would protect them from the unwarranted attentions of campaigning parties and so on. I do not see that as a great difficulty.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Indeed. It is certainly worth looking at, but I remind noble Lords that such communications have to be necessary, and the test of their being necessary for someone of that age is obviously more stringent.

Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

But what is the test of necessity at that age?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

The processor has to determine whether communicating with someone of that age is necessary to achieve the desired democratic engagement outcome. But I take the point: for the vast majority of democratic engagement communications, 14 would be far too young to make that a worthwhile or necessary activity.

Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

As I recall, the ages are on the electoral register.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I am not aware one way or the other, but I will happily look into that to see what further safeguards we can add so that we are not bombarding people who are too young with this material.

Lord Kamall Portrait Lord Kamall (Con)
- Hansard - - - Excerpts

May I make a suggestion to my noble friend the Minister? It might be worth asking the legal people to get the right wording, but if there are different ages at which people can vote in different parts of the United Kingdom, surely it would be easier just to relate it to the age at which they are able to vote in those elections. That would address a lot of the concerns that many noble Lords are expressing here today.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, this whole area of democratic engagement is one that the Minister will need to explain in some detail. This is an Alice in Wonderland schedule: “These words mean what I want them to mean”. If, for instance, you are engaging with the children of a voter—at 14, they are children—is that democratic engagement? You could drive a coach and horses through Schedule 1. The Minister used the word “necessary”, but he must give us rather more than that. It was not very reassuring.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

The Minister mentioned a presumption that the ICO will update its guidance. Is there a timeframe for that? Will the guidance be updated before this comes into effect? How does the age of 14 relate to the AADC, the age-appropriate design code, which sets the age of adulthood at 18?

Baroness Bennett of Manor Castle Portrait Baroness Bennett of Manor Castle (GP)
- Hansard - - - Excerpts

Before the Minister replies, we may as well do the full round. I agree with him, in that I very much believe in votes at 16 and possibly younger. I have been on many a climate demonstration with young people of 14 and under, so they can be involved, but the issue here is bigger than age. The main issue is not age but whether anybody should be subjected to a potential barrage of material in which they have not in any way expressed an interest. I am keen to make sure that this debate is not diverted to the age question and that we do not lose the bigger issue. I wanted to say that I sort of agree with the Minister on one element.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

I agree with the noble Baroness, but with one rider. We will keep coming back to the need for children to have a higher level of data protection than adults, and this is but one of many examples we will debate. However, I agree with her underlying point. The reason why I support removing both these clauses is the hubris of believing that you will engage the electorate by bombarding them with things they did not ask to receive.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

A fair number of points were made there. I will look at ages under 16 and see what further steps, in addition to the requirements of necessity and proportionality, we can consider in order to provide some reassurance. Guidance would need to be in effect before any of this is acted on by any of the political parties. I and my fellow Ministers will continue to work with the ICO—

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I am sorry to press the Minister, but does the Bill state that guidance will be in place before this comes into effect?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I am not sure whether it is written in the Bill. I will check, but the Bill would not function without the existence of the guidance.

Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

I am sorry to drag this out but, on the guidance, can we be assured that the Minister will involve the Electoral Commission? It has a great deal of experience here; in fact, it has opined in the past on votes for younger cohorts of the population. It seems highly relevant to seek out its experience and the benefits of that.

17:00
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I would of course be very happy to continue to engage with the Electoral Commission.

We will continue to work with the ICO to make sure that it is familiar with the plans for commencement and that its plans for guidance fit into that. In parts of the UK where the voting age is 18 and the age of attainment is 16, it would be more difficult for candidates and parties to show that it was necessary or proportionate to process the personal data of 14 and 15 year-olds in reliance on the new lawful ground. In this context, creating an arbitrary distinction between children at or approaching voting age and adults may not be appropriate; in particular, many teenagers approaching voting age may be more politically engaged than some adults. These measures will give parties and candidates a clear lawful ground for engaging them in the process. Accepting this amendment would remove the benefit of a more easily identified lawful ground for processing by elected representatives, candidates and registered political parties, a benefit designed to improve engagement with the electorate. I therefore hope that the noble Baroness, Lady Jones, will withdraw her amendment.

I now come to the clause stand part notice that would remove Clause 114, which gives the Secretary of State a power to make exceptions to the direct marketing rules for communications sent for the purposes of democratic engagement. As Clause 115 defines terms for the purposes of Clause 114, the noble Baroness, Lady Jones, is also seeking the removal of that clause. Under the current law, many of the rules applying to electronic communications sent for commercial marketing apply to messages sent by registered political parties, elected representatives and others for the purposes of democratic engagement. It is conceivable that, after considering the risks and benefits, a future Government might want to treat communications sent for the purposes of democratic engagement differently from commercial marketing. For example, in areas where voter turnout is particularly low or there is a need to increase engagement with the electoral process, a future Government might decide that the direct marketing rules should be modified. This clause stand part notice would remove that option.

We have incorporated several safeguards that must be met prior to regulations being laid under this clause. They include the Secretary of State having specific regard to the effect the exceptions could have on an individual’s privacy; a requirement to consult the Information Commissioner and other interested parties, as the Secretary of State considers appropriate; and the regulations being subject to parliamentary approval via the affirmative procedure.

For these reasons, I hope that the noble Baroness will agree to withdraw or not press her amendments.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I am pleased that I have sparked such a lively debate. When I tabled these amendments, it was only me and the noble Lord, Lord Clement-Jones, so I thought, “This could be a bit sad, really”, but it has not been. Actually, it has been an excellent debate and we have identified some really good issues.

As a number of noble Lords said, the expression “democratic engagement” is weasel words: what is not to like about democratic engagement? We all like it. Only when you drill down into the proposals do you realise the traps that could befall us. As noble Lords and the noble Baroness, Lady Bennett, rightly said, we have to see this in the context of some of the other moves the Government are pursuing in trying to skew the electoral rules in their favour. I am not convinced that this is as saintly as the Government are trying to pretend.

The noble Baroness, Lady Harding, is absolutely right: this is about trust. It is about us setting an example. Of all the things we can do on data protection that we have control over, we could at least show the electorate how things could be done, so that they realise that we, as politicians, understand how precious their data is and that we do not want to misuse it.

I hope we have all knocked on doors. I must say that I have never had a problem engaging with the electorate, and they have never had a problem engaging with us. This is not filling a gap that anybody has identified. We are all out there, finding ways of communicating that, by and large, the electorate finds perfectly acceptable. People talk to us, and they get the briefings through the door. That is what they expect an election campaign to be about. They do not expect, as the noble Baroness, Lady Harding, said, to go to see their MP about one thing and then suddenly find that they are being sent information about something completely different, or that assumptions are being made about them that they never intended when they gave the information in the first place. I just feel that there is something slightly seedy about all this. I am sorry that the Minister did not pick up a little more on our concerns.

There are some practical things that I think it was helpful for us to have talked about, such as the Electoral Commission. I do not think that it has been involved up to now. I would like to know in more detail what its views are on all this. It is also important that we come back to the Information Commissioner and check in more detail what his view is on all this. It would be nice to have guidance, but I do not think that that will be enough to satisfy us in terms of how we proceed with these amendments.

The Minister ultimately has not explained why this has been introduced at this late stage. He is talking about this as though conceivably, in the future, a Government might want to adopt these rules. If that is the case, I respectfully say that we should come back at that time with a proper set of proposals that go right through the democratic process that we have here in Parliament, scrutinise it properly and make a decision then, rather than being bounced into something at a very late stage.

I have to say that I am deeply unhappy at what the Minister has said. I will obviously look at Hansard, but I may well want to return to this.

Amendment 19 withdrawn.
Schedule 1 agreed.
Clause 6: The purpose limitation
Amendment 20
Moved by
20: Clause 6, page 8, leave out lines 20 to 22 and insert—
“(c) the nature of the processing, including whether it is processing described in Article 9(1) (processing of special categories of personal data) or Article 10(1) (processing of personal data relating to criminal convictions etc);”Member's explanatory statement
This technical amendment changes new Article 8A(2)(c) of the UK GDPR so that it refers to processing rather than personal data, reflecting the terms of Articles 9(1) and 10(1).
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, I rise to speak to a series of minor and technical, yet necessary, government amendments which, overall, improve the functionality of the Bill. I hope the Committee will be content if I address them together. Amendments 20, 42, 61 and 63 are minor technical amendments to references to special category data in Clauses 6 and 14. All are intended to clarify that references to special category data mean references to the scope of Article 9(1) of the UK GDPR. They are simply designed to improve the clarity of the drafting.

I turn now to the series of amendments that clarify how time periods within the data protection legal framework are calculated. For the record, these are Amendments 136, 139, 141, 149, 151, 152, 176, 198, 206 to 208, 212 to 214, 216, 217, 253 and 285. Noble Lords will be aware that the data protection legislation sets a number of time periods or deadlines for certain things to happen, such as responding to subject access requests; in other words, at what day, minute or hour the clock starts and stops ticking in relation to a particular procedure. The Data Protection Act 2018 expressly applies the EU-derived rules on how these time periods should be calculated, except in a few instances where it is more appropriate for the UK domestic approach to apply, for example, time periods related to parliamentary procedures. I shall refer to these EU-derived rules as the time periods regulation.

In response to the Retained EU Law (Revocation and Reform) Act 2023, we are making it clear that the time periods regulation continues to apply to the UK GDPR and other regulations that form part of the UK’s data protection and privacy framework, for example, the Privacy and Electronic Communications (EC Directive) Regulations 2003. By making such express provision, our aim is to ensure consistency and continuity and to provide certainty for organisations, individuals and the regulator. We have also made some minor changes to existing clauses in the Bill to ensure that application of the time periods regulation achieves the correct effect.

Secondly, Amendment 197 clarifies that the requirement to consult before making regulations that introduce smart data schemes may be satisfied by a consultation before the Bill comes into force. The regulations must also be subject to affirmative parliamentary scrutiny to allow Members of both Houses to scrutinise legislation. This will facilitate the rapid implementation of smart data schemes, so that consumers and businesses can start benefiting as soon as possible. The Government are committed to working closely with business and wider stakeholders in the development of smart data.

Furthermore, Clause 96(3) protects data holders from the levy that may be imposed to meet the expenses of persons and bodies performing functions under smart data regulations. This levy cannot be imposed on data holders that do not appear capable of being directly affected by the exercise of those functions.

Amendment 196 extends that protection to authorised persons and third-party recipients on whom the levy may also be imposed. Customers will not have to pay to access their data, only for the innovative services offered by third parties. We expect that smart data schemes will deliver significant time and cost savings for customers.

The Government are committed to balancing the incentives for businesses to innovate and provide smart data services with ensuring that all customers are empowered through their data use and do not face undue financial barriers or digital exclusion. Any regulations providing for payment of the levy or fees will be subject to consultation and to the affirmative resolution procedure in Parliament.

Amendments 283 and 285 to Schedule 15 confer a general incidental power on the Information Commission. It will have the implied power to do things incidental to or consequential upon the exercise of its functions, for example, to hold land and enter into agreements. These amendments make those implicit powers explicit, for the avoidance of doubt and in line with standard practice. They do not give the commission substantive new powers. I beg to move.

Lord Kamall Portrait Lord Kamall (Con)
- Hansard - - - Excerpts

My Lords, I know that these amendments were said to be technical amendments, so I thought I would just accept them, but when I saw the wording of Amendment 283 some alarm bells started ringing. It says:

“The Commission may do anything it thinks appropriate for the purposes of, or in connection with, its functions”.


I know that the Minister said that this is stating what the commission is already able to do, but I am concerned whenever I see those words anywhere. They give a blank cheque to any authority or organisation.

Many noble Lords will know that I have previously spoken about the principal-agent theory in politics, in which certain powers are delegated to an agency or regulator, but what accountability does it have? I worry when I see that it “may do anything … appropriate” to fulfil its tasks. I would like some assurance from the Minister that there is a limit to what the information commission can do and some accountability. At a time when many of us are asking who regulates the regulators and when we are looking at some of the arm’s-length bodies—need I mention the Post Office?—there is some real concern about accountability.

I understand the reason for wanting to clarify or formalise what the Minister believes the information commission is doing already, but I worry about this form of words. I would like some reassurance that it is not wide-ranging and that there is some limit and accountability to future Governments. I have seen this sentiment across the House; people are asking who regulates the regulators and to whom are they accountable.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I must congratulate the noble Lord, Lord Kamall. Amid a blizzard of technical and minor amendments from the Minister, he forensically spotted one to raise in that way. He is absolutely right. The Industry and Regulators Committee has certainly been examining the accountability and scrutiny devoted to regulators, so we need to be careful in the language that we use. I think we have to take a lot on trust from the Minister, particularly in Grand Committee.

I apparently failed to declare an interest at Second Reading. I forgot to state that I am a consultant to DLA Piper and the Whips have reminded me today that I failed to do so on the first day in Committee, so I apologise to the Committee for that. I am not quite sure why my consultancy with DLA Piper is relevant to the data protection Bill, but there it is. I declare it.

17:15
Lord Kamall Portrait Lord Kamall (Con)
- Hansard - - - Excerpts

I should also declare an interest. I apologise that I did not do so earlier. I worked with a think tank and wrote a series of papers on who regulates the regulators. I still have a relationship with that think tank.

Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

My Lords, I have been through this large group and, apart from my natural suspicion that there might be something dastardly hidden away in it, I am broadly content, but I have a few questions.

On Amendment 20, can the Minister confirm that the new words “further processing” have the same meaning as the reuse of personal data? Can he confirm that Article 5(1)(b) will prohibit this further processing when it is not in line with the original purpose for which the data was collected? How will the data subject know that is the case?

On Amendment 196, to my untutored eye it looks like the regulation-making power is being extended away from the data holder to include authorised persons and third-party recipients. My questions are simple enough: was this an oversight on the part of the original drafters of that clause? Is the amendment an extension of those captured by the effect of the clause? Is it designed to achieve consistency across the Bill? Finally, can I assume that an authorised person or third party would usually be someone acting on behalf of an agent of the data holder?

I presume that Amendments 198, 212 and 213 are needed because of a glitch in the drafting—similarly with Amendment 206. I can see that Amendments 208, 216 and 217 clarify when time periods begin, but why are the Government seeking to disapply time periods in Amendment 253 when surely some consistency is required?

Finally—I am sure the Minister will be happy about this—I am all in favour of flexibility, but Amendment 283 states that the Information Commissioner has the power to do things to facilitate the exercise of his functions. The noble Lord, Lord Kamall, picked up on this. We need to understand what those limits are. On the face of it, one might say that the amendment is sensible, but it seems rather general and broad in its application. As the noble Lord, Lord Kamall, rightly said, we need to see what the limits of accountability are. This is one of those occasions.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank the noble Lords, Lord Kamall and Lord Bassam, for their engagement with this group. On the questions from the noble Lord, Lord Kamall, these are powers that the ICO would already have in common law. As I am given to understand is now standard with all Bills, they are put on a statutory footing in the Bill to align with best practice. The amendment does not confer substantial new powers but clarifies the powers that the regulator already has. I can also confirm that the ICO was and remains accountable to Parliament.

Lord Kamall Portrait Lord Kamall (Con)
- Hansard - - - Excerpts

I am sorry to intervene as I know that noble Lords want to move on to other groups, but the Minister said that the ICO remains accountable to Parliament. Will he clarify how it is accountable to Parliament for the record?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

The Information Commissioner is directly accountable to Parliament in that he makes regular appearances in front of Select Committees that scrutinise the regulator’s work, including progress against objectives.

The noble Lord, Lord Bassam, made multiple important and interesting points. I hope he will forgive me if I undertake to write to him about those; there is quite a range of topics to cover. If there are any on which he requires answers right away, he is welcome to intervene.

Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

I want to be helpful to the Minister. I appreciate that these questions are probably irritating but I carefully read through the amendments and aligned them with the Explanatory Notes. I just wanted some clarification to make sure that we are clear on exactly what the Government are trying to do. “Minor and technical” covers a multitude of sins; I know that from my own time as a Minister.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Indeed. I will make absolutely sure that we provide a full answer. By the way, I sincerely thank the noble Lord for taking the time to go through what is perhaps not the most rewarding of reads but is useful none the less.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

On the question of the ICO being responsible to Parliament, in the then Online Safety Bill and the digital markets Bill we consistently asked for regulators to be directly responsible to Parliament. If that is something the Government believe they are, we would like to see an expression of it.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I would be happy to provide such an expression. I will be astonished if that is not the subject of a later group of amendments. I have not yet prepared for that group, I am afraid, but yes, that is the intention.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

In which case, you are warned.

Amendment 20 agreed.
Amendments 21 to 23 not moved.
Clause 6, as amended, agreed.
Schedule 2 agreed.
Clauses 7 and 8 agreed.
Clause 9: Vexatious or excessive requests by data subjects
Amendment 24
Moved by
24: Clause 9, page 17, leave out line 33
Member’s explanatory statement
This amendment would mean that the resources available to the controller could not be taken into account when determining whether a request by a data subject is vexatious or excessive.
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, in moving Amendment 24, I will speak also to Amendment 26. I welcome the amendments in the name of the noble Lord, Lord Clement-Jones.

Together, these amendments go to the heart of questioning why the Government have found it necessary to change the grounds for the refusal of a subject access request from “manifestly unfounded” to “vexatious or excessive”. At the moment, Article 15 of the UK GDPR gives data subjects a right of access to find out what personal information an organisation holds on them, how it is using it and whether it is sharing it. This right of access is key to transparency and often underpins people’s ability to exercise other data rights and human rights; for example, it impacts on an individual’s right to privacy under Article 8 of the ECHR and their right to non-discrimination under Article 14 of the same convention.

The Equality and Human Rights Commission has raised specific concerns about these proposals, arguing that subject access requests

“are a vital mechanism for data subjects to exercise their fundamental rights to privacy and freedom from discrimination”.

It argues that these rights will be even more vital as AI systems are rolled out, using personal information

“in ways that may be less than transparent to data subjects”.

So we must be suspicious as to why these changes are being made and whether they are likely to reduce the legitimate opportunities for data subjects to access their personal information.

This comes back to the mantra of the noble Lord, Lord Clement-Jones, regarding a number of the clauses we have dealt with and, I am sure, ones we have yet to deal with: why are these changes necessary? That is the question we pose as well. Is it simply to give greater clarity, as the Minister in the Commons claimed; or is it to lighten the burden on business—the so-called Brexit dividend—which would result in fewer applications being processed by data controllers? Perhaps the Minister could clarify whether data subject rights will be weakened by these changes.

In the Commons, the Minister, John Whittingdale, also argued that some data search requests are disproportionate when the information is of low importance or low relevance to the data subject. However, who has the right to make that decision? How is a data controller in a position to judge how important the information is to an individual? Can the Minister clarify whether the data controller would have the right to ask the data subject their reasons for requesting the information? This is not permitted under the current regime.

A number of stakeholders have argued that the new wording is too subjective and is open to abuse by data controllers who find responding to such requests, by their very nature, vexatious or excessive. For a busy data operator, any extra work could be seen as excessive. Although the Information Commissioner has said that he is clear how these words should be applied, he has also said that they are open to numerous interpretations. Therefore, there is a rather urgent need for the Information Commissioner to provide clear statutory guidance on the application of the terms, so that only truly disruptive requests can be rejected. Perhaps the Minister can clarify whether this is the intention.

In the meantime, our Amendment 24 aims to remove the easy get-out clause for refusing a request by making it clear that the resources available to the controller should not, by themselves, be a reason for rejecting an application for information. There is an inevitable cost involved in processing requests, and we need to ensure that cost does not become the standard excuse for denying data subjects their rights. Our Amendment 26 would require the data controller to produce evidence of why a request is considered vexatious or excessive if it is being denied. It should not be possible to assert this as a reason without providing the data subject with a clear and justifiable explanation. Amendment 25, from the noble Lord, Lord Clement-Jones, has a similar intent.

We remain concerned about the changes and the impact they will have on established data and human rights. As a number of stakeholders have argued, access to personal data and its uses underpins so many other rights that can be enforced by law. We should not give these rights away easily or without proper justification. I look forward to hearing what the Minister has to say, but without further clarification in the Bill, I doubt whether our concerns will be assuaged. I beg to move.

Lord Sikka Portrait Lord Sikka (Lab)
- Hansard - - - Excerpts

My Lords, I will say a little bit about my intention to delete this clause altogether. Clause 9 significantly changes the data and privacy landscape, and for the worse. The Constitution Committee’s report on the Bill, published on 25 January, noted:

“Clause 9 amends Article 12 of the UK GDPR to broaden the basis for refusal”—


not for enhancing, but for refusal—

“of a data access request by providing more leeway to ‘data controllers’”.

In the world we live in, there is a huge imbalance of power between corporations, governments, public bodies and individuals. People must have a right to know what information is held about them, and how and when it is used. It is vital in order to check abuses and hold powerful elites to account.

The request for information can, at the moment, be wholly or partly denied, depending on the circumstances. It can be refused if it is considered to be manifestly unfounded or manifestly excessive. These phrases, “manifestly unfounded” and “manifestly excessive”, are fairly well understood. There is already a lot of case law on that. Clause 9, however, lowers the threshold for refusing information from “manifestly unfounded or excessive” to “vexatious or excessive”.

17:30
As has been pointed out, under the Bill, data controllers are required to carry out only those searches they think “reasonable and proportionate”. On 29 November, the Minister in the other place pointed out that data controllers can reject inquiries that they deem to be
“of low importance or of low relevance to the data subject”.—[Official Report, Commons, 29/11/23; col. 873.]
The Bill therefore effectively allows organisations to make assumptions about the reasons for requests and then refuse to act upon requests that
“are intended to cause distress”,
or
“are not made in good faith”.
That in itself is highly problematic. It means that there will be little or no transparency about the data controllers’ decisions. The unilateral decision of the controllers cannot easily be challenged, which is a further erosion of people’s right to know.
I have no doubt that, at some point, the Minister will refer to the private costs of meeting the SARs. I would welcome some data, if the Minister has any, on how many requests for information are received each year, how many are considered “manifestly unfounded or excessive” at the outset by the data controllers, how many are rejected, how many requests go to the Information Commissioner and are rejected, how many subsequently go to tribunal, and whether the initial decision to refuse the request is accepted or rejected. If he is going to refer to any costs, I would also like to ask him some accounting questions about how the costs are computed. I hope he will be able to answer those, because I simply will not take his word that there actually is a cost. If those costs exist, who audited them and when? He has a lot of information to give on that.
In many ways, any focus on costs is primitive, because these private costs do not consider the social costs. There is a social cost associated with refusal. That social cost is highly evident from the case of the Post Office scandal. People regularly asked for information. In November 2015, for example, the Justice For Subpostmasters Alliance urged its members to submit subject access requests to find out what information the Post Office held about them. The information they received, even though only part of their request was met, helped to show that the Post Office knew about the flaws of the Horizon system. Clause 9 makes it easier for companies such as the Post Office to refuse to provide such information to people, and to refuse even to tell them what data they hold. If Clause 9 becomes law, it will help to hide wrongdoings and corrupt practices.
If, in his response, the Minister is tempted to argue that the “vexatious” threshold of the Bill will somehow be aligned with the freedom of information regime, I would remind him that the FoI regime is much broader in scope, as it enables individuals to seek access to
“information held by public authorities or by persons providing services”.
Instead, this Bill empowers individuals to make requests only in relation to their personal data. The scope of these requests is therefore much narrower—the two cannot really be compared. Of course, numerous FoI requests by sub-postmasters were also refused or only partly answered. These included a request for six months’ correspondence between the business department and Paula Vennells. That was refused on the grounds of cost, even though a previous request for a longer period of correspondence had been fulfilled.
I come to the further link between FoI and this Bill. I personally experienced such selective obstruction by the Treasury when I requested some information about the forced closure of the Bank of Credit and Commerce International in July 1991. After some five and a half years of obstruction and the legal process, three judges, in the 2011 case of Professor Prem Sikka v the Information Commissioner and the Commissioners of Her Majesty’s Treasury, ordered the Government to release a document codenamed the Sandstorm report to me. It showed that the Government were covering up fraud and money laundering on a gigantic scale. They used BCCI to fund al-Qaeda, which was created by the western powers, Saudi intelligence services, arms smugglers, criminals, murderers—all the lowlifes.
As a result, I have no idea what information the Government now hold about me. Under this clause, it would be so easy for someone to deny that information to me if I requested it. If people such as me cannot access personal data, it will be almost impossible for us to exercise our right to call for the erasure of that data. I cannot ask anyone to delete that data if someone refuses to give it to me. I urge the Minister to withdraw this clause, as it is an affront to human rights and public accountability.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, it is a pleasure to follow the noble Lord, Lord Sikka. He raised even more questions about Clause 9 than I ever dreamed of. He has illustrated the real issues behind the clause and why it is so important to debate its standing part, because, in our view, it should certainly be removed from the Bill. It would seriously limit people’s ability to access information about how their personal data is collected and used. We are back to the dilution of data subject rights, within which the rights of data subject access are, of course, vital. This includes limiting access to information about automated decision-making processes to which people are subject.

A data subject is someone who can be identified directly or indirectly by personal data, such as a name, an ID number, location data, or information relating to their physical, economic, cultural or social identity. Under existing law, data subjects have a right to request confirmation of whether their personal data is being processed by a controller, to access that personal data and to obtain information about how it is being processed. The noble Lord, Lord Sikka, pointed out that there is ample precedent for how the controller can refuse a request from a data subject only if it is manifestly unfounded or excessive. The meaning of that phrase is well established.

There are three main ways in which Clause 9 limits people’s ability to access information about how their personal data is being collected and used. First, it would lower the threshold for refusing a request from “manifestly unfounded or excessive” to “vexatious or excessive”. This is an inappropriately low threshold, given the nature of a data subject access request—namely, a request by an individual for their own data.

Secondly, Clause 9 would insert a new mandatory list of considerations for deciding whether the request is vexatious or excessive. This includes vague considerations, such as

“the relationship between the person making the request (the ‘sender’) and the person receiving it (the ‘recipient’)”.

The very fact that the recipient holds data relating to the sender means that there is already some form of relationship between them.

Thirdly, the weakening of an individual’s right to obtain information about how their data is being collected, used or shared is particularly troubling given the simultaneous effect of the provisions in Clause 10, which means that data subjects are less likely to be informed about how their data is being used for additional purposes other than those for which it was originally collected, in cases where the additional purposes are for scientific or historical research, archiving in the public interest or statistical purposes. Together, the two clauses mean that an individual is less likely to be proactively told how their data is being used, while it is harder to access information about their data when requested.

In the Public Bill Committee in the House of Commons, the Minister, Sir John Whittingdale, claimed that:

“The new parameters are not intended to be reasons for refusal”,


but rather to give

“greater clarity than there has previously been”.—[Official Report, Commons, Data Protection and Digital Information Bill Committee, 16/5/23; cols. 113-14.]

But it was pointed out by Dr Jeni Tennison of Connected by Data in her oral evidence to the committee that the impact assessment for the Bill indicates that a significant proportion of the savings predicted would come from lighter burdens on organisations dealing with subject access requests as a result of this clause. This suggests that, while the Government claim that this clause is a clarification, it is intended to weaken obligations on controllers and, correspondingly, the rights of data subjects. Is that where the Secretary of State’s £10 billion of benefit from this Bill comes from? On these grounds alone, Clause 9 should be removed from the Bill.

We also oppose the question that Clause 12 stand part of the Bill. Clause 12 provides that, in responding to subject access requests, controllers are required only to undertake a

“reasonable and proportionate search for the personal data and other information”.

This clause also appears designed to weaken the right of subject access and will lead to confusion for organisations about what constitutes a reasonable and proportionate search in a particular circumstance. The right of subject access is central to individuals’ fundamental rights and freedoms, because it is a gateway to exercising other rights, either within the data subject rights regime or in relation to other legal rights, such as the rights to equality and non-discrimination. Again, the lowering of rights compared with the EU creates obvious risks, and this is a continuing theme in relation to data adequacy.

Clause 12 does not provide a definition for reasonable and proportionate searches, but when introducing the amendment, Sir John Whittingdale suggested that a search for information may become unreasonable or disproportionate

“when the information is of low importance or of low relevance to the data subject”.—[Official Report, Commons, 29/11/23; col. 873.]

Those considerations diverge from those provided in the Information Commissioner’s guidance on the rights of access, which states that when determining whether searches may be unreasonable or disproportionate, the data controller must consider the circumstances of the request, any difficulties involved in finding the information and the fundamental nature of the right of access.

We also continue to be concerned about the impact assessment for the Bill and the Government’s claims that the new provisions in relation to subject access requests are for clarification only. Again, Clause 12 appears to have the same impact as Clause 9 in the kinds of savings that the Government seem to imagine will emerge from the lowering of subject access rights. This is a clear dilution of subject access rights, and this clause should also be removed from the Bill.

We always allow for belt and braces and if our urging does not lead to the Minister agreeing to remove Clauses 9 and 12, at the very least we should have the new provisions set out either in Amendment 26, in the name of the noble Baroness, Lady Jones of Whitchurch, or in Amendment 25, which proposes that a data controller who refuses a subject access request must give reasons for their refusal and tell the subject about their right to seek a remedy. That is absolutely the bare minimum, but I would far prefer to see the deletion of Clauses 9 and 12 from the Bill.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

As ever, I thank noble Lords for raising and speaking to these amendments. I start with the stand part notices on Clauses 9 and 36, introduced by the noble Lord, Lord Clement-Jones. Clauses 9 and 36 clarify the new threshold to refuse or charge a reasonable fee for a request that is “vexatious or excessive”. Clause 36 also clarifies that the Information Commissioner may charge a fee for dealing with, or refuse to deal with, a vexatious or excessive request made by any persons and not just data subjects, providing necessary certainty.

17:45
It is important to be clear that controllers already have the ability to refuse or charge a reasonable fee for “manifestly unfounded or excessive” data subject requests. However, the scope of the current provision is unclear, and there are a variety of circumstances where controllers would benefit from being able to confidently refuse or charge a reasonable fee for a request. The Government are introducing the new “vexatious or excessive” terminology to clarify the scope of the provision. Clause 36 amends the grounds for refusing to deal with a request to ensure consistency with this terminology and clarifies that the Information Commissioner may refuse a request by any persons, not just data subjects.
On Amendment 24, the Government believe that it is reasonable to consider
“the resources available to the controller”
as one of the new circumstances for controllers to determine “vexatious or excessive” requests. This will give controllers the confidence to focus resources on responding to reasonable requests.
Today, controllers can already consider resources when refusing or charging a reasonable fee for a request. The Government do not want to change that. The current ICO guidance sets out that controllers can consider resources as a factor when determining whether a request is excessive. We expect the new parameters to be considered individually, as well as in relation to one another. A controller should consider which parameters may be relevant when deciding how to respond to a request. Thus a controller may also consider available resources when deciding whether to respond to a request in full—for example, where the resource impact of responding would be minimal, even if a large amount of information had been requested.
I will take Amendments 25 and 26 together. They would require controllers to provide evidence for why a request is considered vexatious or excessive. In the view of the Government, these amendments are redundant, because the Bill already requires controllers to provide data subjects with reasons for why and when they have not acted on data subject requests. When a data subject is not satisfied, they have the right to complain to the controller and then to the ICO. If they are still not satisfied, the data subject can take the controller to court to attempt to resolve the dispute.
Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

From looking at the wording of the Members’ explanatory statements for wishing to leave out Clauses 9 and 36, I do not think that the Minister has addressed this, but does he accept that the Bill now provides a more lax approach? Is this a reduction of the standard expected? To me, “vexatious or excessive” sounds very different from “manifestly unfounded or excessive”. Does he accept that basic premise? That is really the core of the debate; if it is not, we have to look again at the issue of resources, which seems to be the argument to make this change.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

If that is the case and this is a dilution, is this where the Government think they will get the savings identified in the impact assessment? It was alleged in the Public Bill Committee that this is where a lot of the savings would come from—we all have rather different views. My first information was that every SME might save about £80 a year; then, suddenly, the Secretary of State started talking about £10 billion of benefit from the Bill. Clarification of that would be extremely helpful. There seems to be a dichotomy between the noble Lord, Lord Bassam, saying that this is a way to reduce the burdens on business and the Minister saying that it is all about confident refusal and confidence. He has used that word twice, which is worrying.

Lord Sikka Portrait Lord Sikka (Lab)
- Hansard - - - Excerpts

I apologise for intervening, but the Minister referred to resources. By that, he means the resources available to the controller but, as I said earlier, there is no consideration of what the social cost may be. If this Bill had already become law, how would the victims of the Post Office scandal have been able to secure any information? Under this Bill, the threshold for refusing to provide information will be much lower than it is under the current legislation. Can the Minister say something about how the controllers will take social cost into account or how the Government have taken that into account?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

First, on the point made by the noble Lord, Lord Bassam, it is not to be argumentative—I am sure that there is much discussion to be had—but the intention is absolutely not to lower the standard for a well-intended request.

Sadly, a number of requests are made that are not well intended but are cynical and designed to disrupt. I can give a few examples. For instance, some requests are deliberately made with minimal time between them. Some are made to circumvent the process of legal disclosure in a trial. Some are made for other reasons designed to disrupt an organisation. The intent of using “vexatious” is not in any way to reduce well-founded, or even partially well-founded, attempts to secure information; it is to reduce less desirable, more cynical attempts of this kind.

Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

But the two terms have a different legal meaning, surely.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

The actual application of the terms will be set out in guidance by the ICO but the intention is to filter out the more disruptive and cynical ones. Designing these words is never an easy thing but there has been considerable consultation on this in order to achieve that intention.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords—sorry; it may be that the Minister was just about to answer my question. I will let him do so.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I will have to go back to the impact assessment but I would be astonished if that was a significant part of the savings promised. By the way, the £10.6 billion—or whatever it is—in savings was given a green rating by the body that assesses these things; its name eludes me. It is a robust calculation. I will check and write to the noble Lord, but I do not believe that a significant part of that calculation leans on the difference between “vexatious” and “manifestly unfounded”.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

It would be very useful to have the Minister respond on that but, of course, as far as the impact assessment is concerned, a lot of this depends on the Government’s own estimates of what this Bill will produce—some of which are somewhat optimistic.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, can we join in with the request to see that information in a letter? We would like to see where these savings will be made and how much will, as noble Lords have said, be affected by the clauses that we are debating today.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

The noble Baroness, Lady Jones, has given me an idea: if an impact assessment has been made, clause by clause, it would be extremely interesting to know just where the Government believe the golden goose is.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I am not quite sure what is being requested because the impact assessment has been not only made but published.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

Yes, but it is a very broad impact assessment.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I see—so noble Lords would like an analysis of the different components of the impact assessment. It has been green-rated by the independent Regulatory Policy Committee. I have just been informed by the Box that the savings from these reforms to the wording of SARs are valued at less than 1% of the benefit of more than £10 billion that this Bill will bring.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

That begs the question of where on earth the rest is coming from.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Which I will be delighted to answer. Amid this interesting exchange, I have lost track of the specific questions that the noble Lord, Lord Sikka, asked, but I am coming on to some of his other ones; if I do not give satisfactory answers, no doubt he will intervene and ask again.

I appreciate the further comments made by the noble Lord, Lord Sikka, about the Freedom of Information Act. I hope he will be relieved to know that this Bill does nothing to amend that Act. On his accounting questions, he will be aware that most SARs are made by private individuals to private companies. The Government are therefore not involved in that process and do not collect the kind of information that he described.

Following the DPDI Bill, the Government will work with the ICO to update guidance on subject access requests. Guidance plays an important role in clarifying what a controller should consider when relying on the new “vexatious or excessive” provision. The Government are also exploring whether a code of practice on subject access requests can best address the needs of controllers and data subjects.

On whether Clause 12 should stand part of the Bill, Clause 12 is only putting on a statutory footing what has already been established—

Lord Sikka Portrait Lord Sikka (Lab)
- Hansard - - - Excerpts

My apologies. The Minister just said that the Government do not collect the data. Therefore, what is the basis for changing the threshold? No data, no reasonable case.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

The Government do not collect details of private interactions between those raising SARs and the companies they raise them with. The business case is based on extensive consultation—

Lord Sikka Portrait Lord Sikka (Lab)
- Hansard - - - Excerpts

I hope that the Government have some data about government departments and the public bodies over which they have influence. Can he provide us with a glimpse of how many requests are received, how many are rejected at the outset, how many go to the commissioners, what the cost is and how the cost is computed? At the moment, it sounds like the Government want to lower the threshold without any justification.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

As I say, I do not accept that the threshold is being lowered. On the other hand, I will undertake to find out what information can reasonably be provided. Again, as I said, the independent Regulatory Policy Committee gave the business case a green rating; that is a high standard and gives credibility to the business case calculations, which I will share.

The reforms keep reasonable requests free of charge and instead seek to ensure that controllers can refuse or charge a reasonable fee for requests that are “vexatious or excessive”, which can consume a significant amount of time and resources. However, the scope of the current provision is unclear and, as I said, there are a variety of circumstances where controllers would benefit from being able confidently to refuse or charge the fee.

Lord Sikka Portrait Lord Sikka (Lab)
- Hansard - - - Excerpts

The Minister used the phrase “reasonable fee”. Can he provide some clues on that, especially for the people who may request information? We have around 17.8 million individuals living on less than £12,570. So, from what perspective is the fee reasonable and how is it determined?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

“Reasonable” would be set out in the guidance to be created by the ICO but it would need to reflect the costs and affordability. The right of access remains of paramount importance in the data protection framework.

Lastly, as I said before on EU data adequacy, the Government maintain an ongoing dialogue with the EU and believe that our reforms are compatible with maintaining our data adequacy decisions.

For the reasons I have set out, I am not able to accept these amendments. I hope that noble Lords will therefore agree to withdraw or not press them.

18:00
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I thank all noble Lords who have spoken in this debate. I am grateful to my noble friend Lord Sikka for rightly sharing the Constitution Committee’s concerns that, on the face of it, it looks like this is broadening the basis for refusal of data requests. He made an important point about the costs needing to be balanced against the social costs of refusing requests and the social impact that there may be, particularly if it is to do with employment or access to public services.

At the heart of this is that we need to ensure that data controllers are not making subjective judgments about whether a request is reasonable. The Minister says that the Information Commissioner will produce guidance. This is important, as that guidance will be absolutely crucial to making a judgment about whether we think this new regime will be credible. The Minister introduced a new phrase: that the intention is to support “well-intended” requests. Well, then we need to start defining “well intended”. I think we will chase these phrases round and round before we get some proper clarification; it would have helped if it had been in the Bill.

We have also gone round and round a bit on whether the changes in the wording weaken the rights of data subjects and whether they save money. The Minister talked about the 1% saving. I am fascinated by that because it does not seem very much; if it is not very much, why are we doing it? We come back to all of this again. I do not quite know what we are hoping to achieve here.

I will need to look at what the Minister said but we need a lot more clarification on this to be reassured that data subjects will not be refused more and more access to the information they want. I was disappointed to hear the Minister say that the controller can consider resources because that seems to me to be the ultimate get-out clause: if a controller can say that they cannot afford to do the data search, does not that mean that individual rights can be ignored just on that basis? That seems too easy; if somebody does not want to do the piece of work, that is an obvious get-out clause, so I remain concerned about the Minister’s response to that amendment as well.

We have explored a lot of this in a lot of different ways and we have had a good debate. I will look again at Hansard but, for the moment, I beg leave to withdraw my amendment.

Amendment 24 withdrawn.
Amendments 25 and 26 not moved.
Clause 9 agreed.
Clause 10 agreed.
Clause 11: Information to be provided to data subjects
Amendment 27
Moved by
27: Clause 11, page 23, line 10, leave out “to the extent that” and insert “when any one or more of the following is true”
Member’s explanatory statement
This amendment would clarify that only one condition under paragraph 5 must be present for paragraphs 1 to 4 to not apply.
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

My Lords, in moving Amendment 27 in my name, I will also express my support for Amendments 28 to 34. I thank my noble friend Lord Black, the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, for supporting and signing a number of these amendments.

This is quite a specific issue compared to the matters of high policy that we have been debating this afternoon. There is a specific threat to the continuing ability of companies to use the open electoral register for marketing purposes without undue burdens. Some 37% of registered voters choose not to opt out of their data being used for direct marketing via the open electoral register, so quite a significant proportion of the population openly agrees that that data can be used for direct marketing. It is an essential resource for accurate postal addresses and for organisations such as CACI—I suspect that a number of us speaking have been briefed by it; I thank it for its briefing—and it has been used for more than 40 years without detriment to consumers and with citizens’ full knowledge. The very fact that 63% of people on the electoral register have opted out tells you that this is a conscious choice that people have knowingly made.

Why is it in doubt? A recent First-tier Tribunal ruling in a legal case stated, by implication, that every company using open electoral register data must, by 20 May 2024, notify individuals at their postal addresses whenever their data on the electoral register is used, and that cost cannot be considered “disproportionate effort”. That means that organisations that are using the electoral roll would need to contact 24.2 million individuals between now and the middle of May, making it practically and financially unviable to use the electoral register at scale.

This group of amendments to Clause 11 aims to address this issue. I fully acknowledge that we have tried to hit the target with a number of shots in this group, and I encourage the Minister, first, to acknowledge that he recognises that this is a real problem that the Bill should be able to address and, secondly, if the wording in individual amendments is not effective or has some unintended consequences that we have missed, I encourage him to respond appropriately.

To be clear, the amendments provide legal certainty about the use of the open electoral register without compromising on any aspect of the data privacy of UK citizens or risking data adequacy. The amendments specify that companies are exempt from the requirement to provide individuals with information in cases where their personal data has not been obtained from them directly if that data was obtained from the open electoral register. They provide further clarification of what constitutes “disproportionate effort” under new paragraph (e) in Article 14(5) of the GDPR. These additional criteria include the effort and cost of compliance, the damage and distress caused to the data subjects and the reasonable expectation of the data subjects, which the percentage of people not opting out shows.

Why is this a problem that we need to fix? First, if we do not fix this, we might create in the physical world the very problem that parts of the Bill are trying to address in the digital world: the bombarding of people with lots of information that they do not want to receive, lots of letters telling us that a company is using the electoral roll that we gave it permission to use in the first place. It will also inadvertently give more power to social media companies for targeting because it will make physical direct marketing much harder to target, so SMEs will be forced into a pretty oligopolistic market for social media targeting. Finally, it will mean that we lose jobs and reduce productivity at a time when we are trying to do the opposite.

This is quite a simple issue and there is cross-party support. It is not an issue of great philosophical import, but for the companies in this space, it is very real, and for the people working in this industry, it is about their jobs. Inch by inch, we need to look at things that improve productivity rather than actively destroy it, even when people have agreed to it. With that, I note the hour and I beg to move.

Lord Black of Brentwood Portrait Lord Black of Brentwood (Con)
- Hansard - - - Excerpts

My Lords, I support Amendments 27 to 34, tabled variously by my noble friend Lady Harding, and the noble Lord, Lord Clement-Jones, to which I have added my name. As this is the first time I have spoken in Committee, I declare my interests as deputy chairman of the Telegraph Media Group and president of the Institute of Promotional Marketing and note my other declarations in the register.

The direct marketing industry is right at the heart of the data-driven economy, which is crucial not just to the future of the media and communications industries but to the whole basis of the creative economy, which will power economic growth into the future. The industry has quite rightly welcomed the Bill, which provides a long-term framework for economic growth as well as protecting customers.

However, there is one area of great significance, as my noble friend Lady Harding has just eloquently set out, on which this Bill needs to provide clarity and certainty going forward, namely, the use of the open electoral register. That register is an essential resource for a huge number of businesses and brands, as well as many public services, as they try to build new audiences. As we have heard, it is now in doubt because of a recent legal ruling that could, as my noble friend said, lead to people being bombarded with letters telling them that their data on the OER has been used. That is wholly disproportionate and is not in the interests of the marketing and communications industry or customers.

These sensible amendments would simply confirm the status quo that has worked well for so long. They address the issue by providing legal certainty around the use of the OER. I believe they do so in a proportionate manner that does not in any way compromise any aspect of the data privacy of UK citizens. I urge the Minister carefully to consider these amendments. As my noble friend said, there are considerable consequences of not acting for the creative economy, jobs in direct marketing, consumers, the environment and small businesses.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I am extremely grateful to the noble Baroness, Lady Harding, and the noble Lord, Lord Black, for doing all the heavy lifting on these amendments. I of course support them, having put forward my own amendments. It is just the luck of the draw that the noble Baroness, Lady Harding, put forward her amendment along with all the others. I have very little to say in this case, and just echo what the noble Lord, Lord Black, said about the fact that the open electoral register has played an important part in the direct marketing, data-driven economy, as it is described. It is particularly interesting that he mentioned the creative industries as well.

The First-tier Tribunal precedent could impact on other public sources of data, including the register of companies, the register of judgments, orders and fines, the land register and the Food Standards Agency register. It could have quite far-reaching implications unless we manage to resolve the issue. There is a very tight timescale: the First-tier Tribunal’s ruling means that companies must notify those on the electoral register by 20 May or be at risk of breaching the law. This is really the best route for trying to resolve the issue. Secondly, the First-tier Tribunal’s ruling states that costs cannot be considered as disproportionate effort. That is why these amendments explicitly refer to that. This is no trivial matter. It is a serious area that needs curing by this Bill, which is a good opportunity to do so.

I shall speak briefly to Clause 11 as a whole standing part. That may seem a bit paradoxical, but it is designed to address issues arising in Article 13, not Article 14. Article 13 of the UK GDPR requires controllers, where they intend to process data that was collected directly from data subjects—as opposed to Article 14 obligations, which apply to personal data not obtained from the data subject—for a new purpose, to inform data subjects of various matters to the extent necessary,

“to ensure fair and transparent processing”.

Clause 11(1) removes this obligation for certain purposes where it would require disproportionate effort. The obligation is already qualified to what is necessary to make processing fair and transparent, the fundamental requirements of the GDPR. If, in these circumstances, processing cannot be made fair and transparent without disproportionate effort, then it should not take place. Clause 11(1) would sidestep the requirement and allow unfair, untransparent processing to go ahead for personal data that the data controllers had themselves collected. Perhaps I should have tabled a rather more targeted amendment, but I hope that noble Lords get the point of the difference between this in terms of Article 13 and Article 14.

18:15
Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- Hansard - - - Excerpts

My Lords, I rise briefly to support the amendments in the name of my noble friend Lady Harding and the others in this group. She has comprehensively explained their importance; they may not be philosophical, as she says, but they have practical importance. One of the most compelling reasons for us to act is as she so precisely described: if we do not, we create a situation in the real world that the Bill seeks to address in the digital world.

Although this is about direct marketing, allied to it are pressures on advertising revenues and the greater control that is being taken by the larger platforms in this area all the time. The effect that has on revenues means that this is an important issue that deserves a proper response from the Government. I hope that my noble friend the Minister acts in the way that we want by, if not accepting one of these amendments, coming forward with something from the Government.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I can also be relatively brief. I thank all noble Lords who have spoken and the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones, for their amendments, to many of which I have added my name.

At the heart of this debate is what constitutes a disproportionate effort or impossibility exemption from providing information to individuals when the data is not collected directly from data subjects. Amendments 29 to 33 provide further clarity on how exemptions on the grounds of disproportionate effort should be interpreted—for example, by taking into account whether there would be a limited impact on individuals, whether they would be caused any distress, what the exemptions were in the first place and whether the information had been made publicly available by a public body. All these provide some helpful context, which I hope the Minister will take on board.

I have also added my name to Amendments 27 and 28 from the noble Baroness, Lady Harding. They address the particular concerns about those using the open electoral register for direct marketing purposes. As the noble Baroness explained, the need for this amendment arises from the legal ruling that companies using the OER must first notify individuals at their postal addresses whenever their data is being used. As has been said, given that individuals already have an opt-out when they register on the electoral roll, it would seem unnecessary and impractical for companies using the register to follow up with individuals each time they want to access their data. These amendments seek to close that loophole and return the arrangements to the previous incarnation, which seemed to work well.

All the amendments provide useful forms of words but, as the noble Baroness, Lady Harding, said, if the wording is not quite right, we hope that the Minister will help us to craft something that is right and that solves the problem. I hope that he agrees that there is a useful job of work to be done on this and that he provides some guidance on how to go about it.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank my noble friend Lady Harding for moving this important amendment. I also thank the cosignatories—the noble Lords, Lord Clement-Jones and Lord Black, and the noble Baroness, Lady Jones. As per my noble friend’s request, I acknowledge the importance of this measure and the difficulty of judging it quite right. It is a difficult balance and I will do my best to provide some reassurance, but I welcomed hearing the wise words of all those who spoke.

I turn first to the clarifying Amendments 27 and 32. I reassure my noble friend Lady Harding that, in my view, neither is necessary. Clause 11 amends the drafting of the list of cases when the exemption under Article 14(5) applies but the list closes with “or”, which makes it clear that you need to meet only one of the criteria listed in paragraph (5) to be exempt from the transparency requirements.

I turn now to Amendments 28 to 34, which collectively aim to expand the grounds of disproportionate effort to exempt controllers from providing certain information to individuals. The Government support the use of public data sources, such as the OER, which may be helpful for innovation and may have economic benefits. Sometimes, providing this information is simply not possible or is disproportionate. Existing exemptions apply when the data subject already has the information or in cases where personal data has been obtained from someone other than the data subject and it would be impossible to provide the information or disproportionate effort would be required to do so.

We must strike the right balance between supporting the use of these datasets and ensuring transparency for data subjects. We also want to be careful about protecting the integrity of the electoral register, open or closed, to ensure that it is used within the data subject’s reasonable expectations. The exemptions that apply when the data subject already has the information or when there would be a disproportionate effort in providing the information must be assessed on a case-by-case basis, particularly if personal data from public registers is to be combined with other sources of personal data to build a profile for direct marketing.

These amendments may infringe on transparency—a key principle in the data protection framework. The right to receive information about what is happening to your data is important for exercising other rights, such as the right to object. This could be seen as going beyond what individuals might expect to happen to their data.

The Government are not currently convinced that these amendments would be sufficient to prevent negative consequences to data subject rights and confidence in the open electoral register and other public registers, given the combination of data from various sources to build a profile—that was the subject of the tribunal case being referenced. Furthermore, the Government’s view is that there is no need to amend Article 14(6) explicitly to include the “reasonable expectation of the data subjects” as the drafting already includes reference to “appropriate safeguards”. This, in conjunction with the fairness principle, means that data controllers are already required to take this into account when applying the disproportionate effort exemption.

The above notwithstanding, the Government understand that the ICO may explore this question as part of its work on guidance in the future. That seems a better way of addressing this issue in the first instance, ensuring the right balance between the use of the open electoral register and the rights of data subjects. We will continue to work closely with the relevant stakeholders involved and monitor the situation.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

I wonder whether I heard my noble friend correctly. He said “may”, “could” and “not currently convinced” several times, but, for the companies concerned, there is a very real, near and present deadline. How is my noble friend the Minister suggesting that deadline should be considered?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

On the first point, I used the words carefully because the Government cannot instruct the ICO specifically on how to act in any of these cases. The question about the May deadline is important. With the best will in the world, none of the provisions in the Bill are likely to be in effect by the time of that deadline in any case. That being the case, I would feel slightly uneasy about advising the ICO on how to act.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I am not quite getting from the Minister whether he has an understanding of and sympathy with the case that is being made or whether he is standing on ceremony on its legalities. Is he saying, “No, we think that would be going too far”, or that there is a good case and that guidance or some action by the ICO would be more appropriate? I do not get the feeling that somebody has made a decision about the policy on this. It may be that conversations with the Minister between Committee and Report would be useful, and it may be early days yet until he hears the arguments made in Committee; I do not know, but it would be useful to get an indication from him.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Yes. I repeat that I very much recognise the seriousness of the case. There is a balance to be drawn here. In my view, the best way to identify the most appropriate balancing point is to continue to work closely with the ICO, because I strongly suspect that, at least at this stage, it may be very difficult to draw a legislative dividing line that balances the conflicting needs. That said, I am happy to continue to engage with noble Lords on this really important issue between Committee and Report, and I commit to doing so.

On the question of whether Clause 11 should stand part of the Bill, Clause 11 extends the existing disproportionate effort exemption to cases where the controller collected the personal data directly from the data subject and intends to carry out further processing for research purposes, subject to the research safeguards outlined in Clause 26. This exemption is important to ensure that life-saving research can continue unimpeded.

Research holds a privileged position in the data protection framework because, by its nature, it is viewed as generally being in the public interest. The framework has various exemptions in place to facilitate and encourage research in the UK. During the consultation, we were informed of various longitudinal studies, such as those into degenerative neurological conditions, where it is impossible or nearly impossible to recontact data subjects. To ensure that this vital research can continue unimpeded, Clause 11 provides a limited exemption that applies only to researchers who are complying with the safeguards set out in Clause 26.

The noble Lord, Lord Clement-Jones, raised concerns that Clause 11 would allow unfair processing. I assure him that this is not the case, as any processing that uses the disproportionate effort exemption in Article 13 must comply with the overarching data protection principles, including lawfulness, fairness and transparency. Even if data controllers rely on this exemption, they should still consider other ways to make the processing they undertake as fair and transparent as possible.

Finally, returning to EU data adequacy, the Government recognise its importance and, as I said earlier, are confident that the proposals in Clause 11 are complemented by robust safeguards, which reinforces our view that they are compatible with EU adequacy. For the reasons that I have set out, I am unable to accept these amendments, and I hope that noble Lords will not press them.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

My Lords, I am not quite sure that I understand where my noble friend the Minister is on this issue. The noble Lord, Lord Clement-Jones, summed it up well in his recent intervention. I will try to take at face value my noble friend’s assurances that he is happy to continue to engage with us on these issues, but I worry that he sees this as two sides of an issue—I hear from him that there may be some issues and there could be some problems—whereas we on all sides of the Committee have set out a clear black and white problem. I do not think they are the same thing.

I appreciate that the wording might create some unintended consequences, but I have not really understood what my noble friend’s real concerns are, so we will need to come back to this on Report. If anything, this debate has made it even clearer to me that it is worth pushing for clarity on this. I look forward to ongoing discussions with a cross-section of noble Lords, my noble friend and the ICO to see if we can find a way through to resolve the very real issues that we have identified today. With that, and with thanks to all who have spoken in this debate, I beg leave to withdraw my amendment.

Amendment 27 withdrawn.
Amendments 28 to 34 not moved.
Clause 11 agreed.
Clauses 12 and 13 agreed.
Amendment 35 not moved.
18:30
Clause 14: Automated decision-making
Amendment 36
Moved by
36: Clause 14, page 26, line 10, after “processing” insert “, including profiling,”
Member’s explanatory statement
This amendment, and another in the name of Baroness Jones of Whitchurch to the proposed new Article 22A of the UK GDPR, would make clear that protection is offered for profiling operations leading to decisions.
Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

My Lords, this is the first group of amendments covering issues relating to automated decision-making, one of the most interesting areas of data use but also one of the most contested and, for the public at large, one of the most controversial and difficult to navigate. The development of AI and data systems that easily enable automatable decisions could offer huge efficiencies for consumers of public services. Equally, the use of such systems can, if used and regulated in the wrong way, have a devastating impact on people’s lives. If we have learned one thing from the Horizon scandal it is simply that, in the wrong hands and with the wrong system in place, the misuse of data can destroy lives and livelihoods.

Our country has a massive social security system, which includes everything from pension payments to disability income support and, of course, the universal credit system, which covers people entitled to in-work and out-of-work benefits. Over 22 million people receive DWP benefits of one sort or another. If automated decisions make errors in this field the potential to damage lives is enormous, as I am sure the Minister will appreciate.

I turn to the four amendments in the group in the name of my noble friend Lady Jones. Amendments 36 and 37 seek to amend new Article 22A of the UK GDPR and make it clear that protection is provided for profiling operations that lead to decisions. This is important, not least because the clause further reduces the scope for the human review of automated decision-making. Profiling is used as part of this process, and these amendments seek to protect individual data subjects from its effect. We take the view that it is essential that human interaction is involved in making subject access decisions.

Amendment 40 also makes it clear that, in the context of the new Article 22A, for human involvement to be considered meaningful, the review of the decision must be completed by a competent person. One of the positive changes made by the Bill is the introduction of the concept of “meaningful human involvement” in a decision. Meaningful human review is a key component for achieving an appropriate level of oversight over automated decision-making, for protecting individuals from unfair treatment and for offering an avenue for redress. The aim of the amendment is to bring more clarity around what “meaningful human involvement” should consist of. It would require that a review needs to be performed by a person with the necessary competence, training and understanding of the data, and, of course, the authority to alter the decision.

Our Amendment 109 is not so much about building protections as introducing something new and adding to the strength of what is already there. Users have never been able to get personalised explanations of automated decisions but, given the impact that these can have, we feel that systems should be in place for people to understand why a computer has simply said yes or no.

As it stands, the Bill deletes Section 14 of the Data Protection Act 2018 in its entirety. Our amendment would undo that and then add personalisation in. The amendment would retain Section 14 of that Act, which is where most automated decision-making safeguards are currently detailed in law. It would introduce an entitlement for data subjects to receive a personalised explanation of an automated decision made about them. This is based on public attitudes research conducted by the Ada Lovelace Institute, which shows a clear demand for greater transparency over these sorts of decisions.

The amendment also draws on independent legal analysis commissioned by the Ada Lovelace Institute, which found that the generic nature of explanations provided under current law is insufficient for individuals to understand how they have been affected by automated decision-making. This was considered to be a major barrier to meaningful protection from and redress for harms caused by AI. As many noble Lords have made clear in these debates, we have put building trust at the heart of how we get the most from AI and, more particularly, ADM systems.

I turn to the amendments in the name of the noble Lord, Lord Clement-Jones. In essence, they are about—as the noble Lord will, I am sure, explain better than I possibly could—the level of human engagement in automated decision-making processes affecting data subjects. The common thread through the amendments is that they raise the bar in terms of the safeguards for data subjects’ rights and freedoms. We have joined the noble Lord, Lord Clement-Jones, on Amendment 47, and might equally have added our names to the other amendments in the group as we broadly support those too.

Amendment 38A, in the name of the noble Baroness, Lady Bennett, would place an additional requirement under new Article 22A to ensure human engagement in the automated decision-making processes.

I am sure the Committee will want more than warm words from the Minister when he comes to wind up the debate. For all of us, ADM is the here and now; it shapes how we use and consume public services and defines what and who we are. Reducing our protections from its downsides is not to be done lightly and we cannot easily see how that can be justified. I want to hear from the Minister how the Government came to conclude that this was acceptable, not least because, as we will hear in later debates on the Bill, the Government are seeking powers that provide for invasive bulk access to potentially every citizen’s bank accounts. I beg to move the amendments in the name of the noble Baroness, Lady Jones.

Baroness Bennett of Manor Castle Portrait Baroness Bennett of Manor Castle (GP)
- Hansard - - - Excerpts

My Lords, it is a pleasure to follow the noble Lord, Lord Bassam, who has already set out very clearly what the group is about. I will chiefly confine myself to speaking to my Amendment 38A, which seeks to put in the Bill a clear idea of what having a human in the loop actually means. We need to have a human in the loop to ensure that a human interpreted, assessed and, perhaps most crucially, was able to intervene in the decision and any information on which it is based.

Noble Lords will be aware of many situations that have already arisen in which artificial intelligence is used—I would say that what we are currently describing is artificial intelligence but, in real terms, it is not truly that at all. What we have is a very large use of big data and, as the noble Lord, Lord Bassam, said, big data can be a very useful and powerful tool to be used for many positive purposes. However, we know that the quality of decision-making often depends on the quality of the data going in. A human is able to see whether something looks astray or wrong; there is a kind of intelligence that humans apply to this, which machines simply do not have the capacity for.

I pay tribute to Justice, the law reform and human rights organisation, which produced an excellent briefing on the issues around Clause 14. It asserts that, as currently written, the clause inadequately protects individuals from automated harm.

The noble Lord, Lord Bassam, referred to the Horizon case in the UK; that is the obvious example but, while we may think of some of the most vulnerable people in the UK, the Robodebt case in Australia is another case where crunching big data, and then crunching down on individuals, had truly awful outcomes. We know that there is a real risk of unfairness and discrimination in the use of these kinds of tools. I note that the UK has signed the Bletchley declaration, which says that

“AI should be designed, developed, deployed, and used, in a manner that is … human-centric, trustworthy and responsible”.

I focus particularly on “human-centric”: human beings can sympathise with and understand other human beings in a way that big data simply does not.

I draw a parallel with something covered by a special Select Committee of your Lordships’ House, last year: lethal autonomous weapon systems, or so-called killer robots. This is an obvious example of where there is a very strong argument for having a human in the loop, as the terminology goes. From the last I understood and heard about this, I am afraid that the UK Government are not fully committed to a human in the loop in the case of killer robots, but I hope that we get to that point.

When we talk about how humans’ data is used and managed, we are also talking about situations that are—almost equally—life and death: whether people get a benefit, whether they are fairly treated and whether they do not suddenly disappear off the system. Only this morning, I was reading a case study of a woman aged over 80, highlighting how she had been through multiple government departments, but could not get her national insurance number. Without a national insurance number, she could not get the pension to which she was entitled. If there is no human in the loop to cut through those kinds of situations, there is a real risk that people will find themselves just going around and around machines—a circumstance with which we are personally all too familiar, I am sure. My amendment is an attempt to put a real explanation in the Bill for having that human in the loop.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, the number of amendments proposed to Clause 14 reflects the Committee’s very real concern about the impact of automated decision-making on the privacy, safety and prospects of UK data subjects. I have specific amendments in groups 7 and 8, so I will speak to the impact of Clause 14 on children later. I will again be making arguments about the vulnerability of these systems in relation to the Government’s proposals on the DWP.

Without repeating the arguments made, I associate myself with most of the proposals and the intention behind them—the need to safeguard the prospects of a fair outcome when algorithms hold sway over a person’s future. It seems entirely logical that, if the definition of solely automated decision-making requires “no meaningful human involvement”, we should be clear, as Amendment 40 proposes, about what is considered “meaningful”, so that the system cannot be gamed by providing human involvement that provides an ineffective safeguard and is therefore not meaningful.

I have sympathy with many of these amendments—Amendments 38A, 39, 47, 62, 64 and 109—and ultimately believe, as was suggested by the noble Lord, Lord Bassam, that it is a matter of trust. I refer briefly to the parliamentary briefing from the BMA, which boldly says that:

“Clause 14 risks eroding trust in AI”.


That would be a very sad outcome.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, we have heard some powerful concerns on this group already. This clause is in one of the most significant parts of the Bill for the future. The Government’s AI policy is of long standing. They started it many years ago, then had a National AI Strategy in 2021, followed by a road map, a White Paper and a consultation response to the White Paper. Yet this part of the Bill, which is overtly about artificial intelligence and automated decision-making, does not seem to be woven into their thinking at all.

18:45
All the Government have decided to do is to water down the original Article 22 provisions in the GDPR. I find that somewhat baffling, particularly as we have heard about the importance of artificial intelligence and the algorithmic and automated tools that are increasingly being used across the public sector, in particular. The noble Baroness, Lady Bennett, talked about how it impacts on the welfare system, healthcare, policing, immigration and so on—many sensitive areas of individuals’ lives—and in the private sector, in really sensitive areas such as financial services.
I do not think that, nowadays, we are unaware of the problems that arise in relation to decisions that are made solely by automated means. I think we are all aware, not just in the context of lethal autonomous weapons, that human oversight can help guard against the machine’s errors, mitigate risks such as encoded bias, and ensure that there are robust and rational reasons behind a decision.
It is important that we hold at the front of our mind that we are trying to ensure that there are core ethical duties in place when we encounter artificial intelligence. We now know that that is going to become even more important following the recent Budget, with considerable capital expenditure promised to public services. The Secretary of State for DSIT has stated that the Government intend to revolutionise our public services using AI. That makes this kind of provision and the changes being made to Article 22 of even greater significance. As the noble Lord, Lord Bassam, mentioned, the Post Office Horizon scandal has demonstrated the disastrous consequences that can occur when faulty technology is not used responsibly and safely by the humans involved. I was very interested in the example from Australia raised by the noble Baroness, Lady Bennett.
The Government’s data consultation response acknowledged that,
“the right to human review of an automated decision was a key safeguard”.
Currently, Sections 49 and 50 of the DPA and Article 22 of the UK GDPR provide a right not to be subject to a decision based solely on automated processing, with some narrow exemptions. Even that is pretty limited; it is “solely” and obviously we are going to have a bit more argument about whether it should be a bit wider than that. Despite that, Clause 14 would reverse the presumption against solely automated decision-making and would permit ADM—as I think we should call it—in a much wider range of contexts. This is being done without any evidence that the current prohibition on ADM is not working as intended, when we should be enhancing rights in the first place.
Clause 14 would mean that solely automated decision-making would be allowed, unless it is a significant decision and is based on special categories of personal data, in which case specified conditions must be met. The conditions are that the automated decision-making is required or authorised by law or that the data subject has explicitly consented. As part of this change, solely automated decisions that do not involve sensitive personal data are now permissible, so that is quite a change. The burden is now shifted to the individual to complain, requiring controllers to provide information and have measures in place which enable individuals to contest and make representations to a human.
Automated decisions can have significant effects on people’s lives without involving sensitive personal data. Examples include decisions concerning access to financial products, educational decisions such as the A-level algorithm scandal, or the SyRI case in the Netherlands, where innocuous datasets such as household water usage were used to accuse individuals of benefit fraud.
It is also unclear what will meet the threshold of a “significant decision”. Big Brother Watch has identified local authorities that use predictive models to identify children deemed at high risk of committing crimes and include them on a database. Whether a decision to include someone on a database meets the threshold of a significant decision is simply not known, leading to uncertainty for decision-makers and data subjects.
These changes would mean that solely automated decision-making is permitted in a much wider range of contexts. It is especially concerning given that many high-impact algorithmic decisions do not involve processing of special categories of personal data, which is a narrow and specific category. Further, the proposed changes would mean that Article 22 will no longer be cast as a right not to be subject to solely automated decision-making, but rather as a restriction on solely automated decision-making.
There are quite a number of amendments in this group. Many of them speak for themselves, but the whole idea, particularly of the amendments relating to “meaningful automated processing”, is to try to reverse the way that Clause 14 operates so that, if there is meaningful involvement by automated decision-making, these rights arise under the clause. The amendments seek not only to maintain but to improve the current level of protection, so that public authorities that use automation even partially to make decisions must ensure that safeguards for data subjects’ rights and freedoms are in place.
I do not think that I can read out all the amendments, but a number of them would ensure that those decisions are qualified by this concept of “meaningful” automated processing. The review must not be superficial, and the person performing it must have appropriate training, competency and authority to change the decision.
Amendment 43 seeks to ensure that restrictions on automated decision-making in Clause 14 apply to all categories of personal data, not just sensitive personal data. The amendment would ensure similar levels of protection around automated decision-making to those we currently have. It would do this by widening the scope of the restrictions in new Article 22B so that they restrict automation based on all kinds of personal data, not just automation based on special category data.
We very much support Amendments 36 and 37, proposed by the noble Baroness, Lady Jones, on profiling. My 10 minutes are running out very quickly so, sadly, I must leave it there.
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

As ever, I thank the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, for their detailed consideration of Clause 14, and all other noble Lords who spoke so well. I carefully note the references to the DWP’s measure on fraud and error. For now, I reassure noble Lords that a human will always be involved in all decision-making relating to that measure, but I note that this Committee will have a further debate specifically on that measure later.

The Government recognise the importance of solely automated decision-making to the UK’s future success and productivity. These reforms ensure that it can be responsibly implemented, while any such decisions with legal or similarly significant effects have the appropriate safeguards in place, including the right to request a review and to have that review carried out by a human. These reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles. In doing so, they will provide confidence to organisations looking to use these technologies in a responsible way while driving economic growth and innovation.

The Government also recognise that AI presents huge opportunities for the public sector. It is important that AI is used responsibly and transparently in the public sector; we are already taking steps to build trust and transparency. Following a successful pilot, we are making the Algorithmic Transparency Reporting Standard—the ATRS—a requirement for all government departments, with plans to expand this across the broader public sector over time. This will ensure that there is a standardised way for government departments proactively to publish information about how and why they are using algorithms in their decision-making. In addition, the Central Digital and Data Office—the CDDO—has already published guidance on the procurement and use of generative AI for the UK Government and, later this year, DSIT will launch the AI management essentials scheme, setting a minimum good practice standard for companies selling AI products and services.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, could I just interrupt the Minister? It may be that he can get an answer from the Box to my question. One intriguing aspect is that, as the Minister said, the pledge is to bring the algorithmic recording standard into each government department and there will be an obligation to use that standard. However, what compliance mechanism will there be to ensure that that is happening? Does the accountable Permanent Secretary have a duty to make sure that that is embedded in the department? Who has the responsibility for that?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

That is a fair question. I must confess that I do not know the answer. There will be mechanisms in place, department by department, I imagine, but one would also need to report on it across government. Either it will magically appear in my answer or I will write to the Committee.

The CDDO has already published guidance on the procurement and use of generative AI for the Government. We will consult on introducing this as a mandatory requirement for public sector procurement, using purchasing power to drive responsible innovation in the broader economy.

I turn to the amendments in relation to meaningful involvement. I will first take together Amendments 36 and 37, which aim to clarify that the safeguards mentioned under Clause 14 are applicable to profiling operations. New Article 22A(2) already clearly sets out that, in cases where profiling activity has formed part of the decision-making process, controllers have to consider the extent to which a decision about an individual has been taken by means of profiling when establishing whether human involvement has been meaningful. Clause 14 makes clear that a solely automated significant decision is one without meaningful human involvement and that, in these cases, controllers are required to provide the safeguards in new Article 22C. As such, we do not believe that these amendments are necessary; I therefore ask the noble Baroness, Lady Jones, not to press them.

Turning to Amendment 38, the Government are confident that the existing reference to “data subject” already captures the intent of this amendment. The existing definition of “personal data” makes it clear that a data subject is a person who can be identified, directly or indirectly. As such, we do not believe that this amendment is necessary; I ask the noble Lord, Lord Clement-Jones, whether he would be willing not to press it.

Amendments 38A and 40 seek to clarify that, for human involvement to be considered meaningful, the review must be carried out by a competent person. We feel that these amendments are unnecessary as meaningful human involvement may vary depending on the use case and context. The reformed clause already introduces a power for the Secretary of State to provide legal clarity on what is or is not to be taken as meaningful human involvement. This power is subject to the affirmative procedure in Parliament and allows the provision to be future-proofed in the wake of technological advances. As such, I ask the noble Baronesses, Lady Jones and Lady Bennett, not to press their amendments.

19:00
Amendments 39, 47, 51, 56, 60 and 64 to 68 appear to restrict public authorities’ use of automated decision-making by introducing a new definition of decisions that meaningfully involve automated processing. Our reforms clarify that a solely automated decision is one that is taken without any meaningful human involvement going beyond a cursory or rubber-stamping exercise. These amendments seek to bring in an entirely separate threshold for the use of automated decision-making by public authorities and controllers acting on their behalf. We consider this unnecessary as the Article 22 safeguards, as they apply to solely automated decisions, are robust and provide strong protections to all data subjects. These safeguards are applicable to all controllers whether they are or act for a public authority. As such, we believe that the reforms in the Bill will benefit society by allowing public authorities to use automated decision-making with appropriate safeguards in place.
Amendments 43 and 62 seek to extend the limitations on the use of special categories of data to all automated decision-making. Such restrictions would be unnecessary and would impede the use that controllers can make of this technology. We believe that the safeguards set out under Article 22C and Section 50C, which entitle data subjects to information about decisions taken about them, to make representations to contest decisions and to obtain human review, provide sufficient protection for personal, non-sensitive data. As such, we do not believe that these amendments are necessary; I ask the noble Lord, Lord Clement-Jones, not to press them.
Amendment 109 seeks to preserve and amend Section 14 of the Data Protection Act relating to automated decision-making authorised by law. The Government believe that the same uniform safeguards should be applicable across all processing conditions, including the processing of special category data, to simplify and clarify the obligations of controllers. Having different safeguards and obligations depending on the lawful ground of processing would lead to uncertainty among controllers as well as among data subjects. The Government aim to simplify and clarify the rules to ensure clear understanding of organisations’ obligations to protect data subjects’ rights. Furthermore, this amendment would also require data subjects to receive a personalised explanation of decisions reached following the automated processing of their data. Article 22C(2)(a) already requires controllers to provide data subjects with information about decisions taken about them. As such, we do not believe that this amendment is necessary; I ask the noble Baroness, Lady Jones, not to press it.
I shall return briefly to the rollout of the Algorithmic Transparency Reporting Standard. To date, we have taken a deliberately iterative and agile approach to ATRS development and rollout, with the intention of generating buy-in from departments, gathering feedback, informing the evidence base, and improving and adapting the standard.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

That means no compliance mechanism.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I am not sure I agree with that characterisation. The ATRS is a relatively new development. It needs time to bed in and needs to be bedded in on an agile basis in order to ensure not only quality but speed of implementation. That said, I ask the noble Lord to withdraw his amendment.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

The Minister has taken us through what Clause 14 does and rebutted the need for anything other than “solely”. He has gone through the sensitive data and the special category data aspects, and so on, but is he reiterating his view that this clause is purely for clarification; or is he saying that it allows greater use of automated decision-making, in particular in public services, so that greater efficiencies can be found and therefore it is freeing up the public sector at the expense of the rights of the individual? Where does he sit in all this?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

As I said, the intent of the Government is: yes to more automated data processing to take advantage of emerging technologies, but also yes to maintaining appropriate safeguards. The safeguards in the present system consist—if I may characterise it in a slightly blunt way—of providing quite a lot of uncertainty, so that people do not take the decision to positively embrace the technology in a safe way. By bringing in this clarity, we will see an increase not only in the safety of their applications but in their use, driving up productivity in both the public and private sectors.

Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

My Lords, I said at the outset that I thought this was the beginning of a particular debate, and I was right, looking at the amendments coming along. The theme of the debate was touched on by the noble Baroness, Lady Bennett, when she talked about these amendments, in essence, being about keeping humans in the loop and the need for them to be able to review decisions. Support for that came from the noble Baroness, Lady Kidron, who made some important points. The point the BMA made about the risk of eroding trust cut to the heart of what we have been talking about all afternoon: trust in these processes.

The noble Lord, Lord Clement-Jones, talked about this effectively being the watering down of Article 22, and the need for some core ethical principles in AI use and for the Government to ensure a right to human review. Clause 14 reverses the presumption in favour of that human reviewing process, so that solely automated decision-making will be more widely allowed, as the Minister argued.

However, I am not satisfied by the responses, and I do not think other Members of your Lordships’ Committee will be either. We need more safeguards. We have moved from one clear position to another, which can be described as watering down or shifting the goalposts; I do not mind which, but that is how it seems to me. Of course, we accept that there are huge opportunities for AI in the delivery of public services, particularly in healthcare and the operation of the welfare system, but we need to ensure that citizens in this country have a higher level of protection than the Bill currently affords them.

At one point I thought the Minister said that a solely automated decision was a rubber-stamped decision. To me, that gave the game away. I will have to read carefully what he said in Hansard, but that is how it sounded, and it really gets our alarm bells ringing. I am happy to withdraw my amendment, but we will come back to this subject from time to time and throughout our debates on the rest of the Bill.

Amendment 36 withdrawn.
Amendments 37 to 40 not moved.
Amendment 41
Moved by
41: Clause 14, page 26, line 21, at end insert—
“A1. The data subject may not be subject to any decision based on data processing which contravenes a requirement of the Equality Act 2010.”
Member’s explanatory statement
This amendment to new Article 22B of the UK GDPR aims to make clear that data processing which contravenes any part of the Equality Act 2010 is prohibited.
Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

My Lords, this group, in which we have Amendments 41, 44, 45, 49, 50, 98A and 104A and have cosigned Amendments 46 and 48, aims to further the protections that we discussed in the previous group. We are delighted that the noble Lord, Lord Clement-Jones, and others joined us in signing various of these amendments.

The first amendment, Amendment 41, is a straight prohibition of any data processing that would contravene the Equality Act 2010. All legislation should conform to the terms of the Equality Act, so I expect the Minister to confirm that he is happy to accept that amendment. If he is not, I think the Committee will want to understand better why that is the case.

Amendment 44 to new Article 22B of the UK GDPR is, as it says, designed,

“to prevent data subjects from becoming trapped in unfair agreements and being unable to exercise their data rights”,

because of the contract terms. One might envisage some sensitive areas where the exercise of these rights might come into play, but there is nothing that I could see, particularly in the Explanatory Notes, which seeks to argue that point. We have no knowledge of when this might occur, and I see no reason why the legislation should be changed to that effect. Special category data can be used for automated decision-making only if certain conditions are met. It involves high-risk processing and, in our view, requires explicit consent.

The amendments remove performance of a contract as one of the requirements that allows the processing of special category data for reaching significant decisions based on automated processing. It is difficult to envisage a situation where it would be acceptable to permit special category data to be processed in high-risk decisions on a purely automated basis, simply pursuant to a contract where there is no explicit consent.

Furthermore, relying on performance of a contract for processing special category data removes the possibility for data subjects to exercise their data rights, such as the right to object and the ability to withdraw consent, and could trap individuals in unfair agreements. There is an implicit power imbalance between data subjects and data controllers when entering a contract, and people are often not given meaningful choices or options to negotiate the terms. It is usually a take-it-or-leave-it approach. Thus, removing the criterion of performance of a contract reduces the risks associated with ADM and creates a tighter framework for protection. This also aligns with the current wording of Article 9 of the UK GDPR.

Amendment 45 changes the second condition to include only decisions that are required or authorised by law, with appropriate safeguards, and that are necessary for reasons of substantial public interest. The safeguards are retained from Section 14 of the DPA 2018, with amendments to strengthen transparency provisions.

Amendment 49 seeks to ensure that the protections conferred by Article 22C of the UK GDPR would apply to decisions “solely or partly” based on ADM rather than just “solely”. This would help to maximise the protections that data subjects currently enjoy.

Amendment 50 is another strengthening measure, which would make sure that the safeguards in the new Article 22C apply alongside, rather than instead of, those contained in Articles 12 to 15.

Our Amendment 104A would insert a new Section into the 2018 Act, requiring data controllers who undertake high-risk processing in relation to work-related decisions or activities to carry out an additional algorithmic impact assessment and make reasonable mitigations in response to the outcome of that assessment.

I ought to have said earlier that Amendment 98A is a minor consequential amendment.

An improved workplace-specific algorithmic impact assessment is the best way to remedy clear deficiencies in Clause 20 as drafted, and it signals Labour’s leadership and alignment with international regulatory and AI ethics initiatives. These are moving towards the pre-emptive evaluation of significant social and workplace impacts by responsible actors, combined with a procedure for ongoing monitoring, which is not always possible. It also moves towards our commitment to algorithmic assurance and will help to ensure that UK businesses are not caught up in what is sometimes described as the “Brussels effect”.

19:15
The impact assessment should cover known impacts on work and workers’ rights and the exercise of those, combining the best of audit technology and legal impact assessments. There would also be a duty to respond appropriately to the findings of that assessment. One of the simplest and most effective ways to boost transparency and consultation provisions is to attach them to these improved impact assessments by requiring disclosure of the assessment, at least in summary form, and permitting requests for additional information relevant to that assessment.
In our view, the definition of “high risk” in the Bill should be deemed to include significant impacts on work and workers. For clarity, this includes: any impact on equal opportunities or outcomes of work, access to employment, pay, contractual status, terms and conditions of employment, health and well-being, lawful association rights, and associated training. This could be done by a discrete deeming provision at several places in the Bill. These factors would also provide a threshold for the more rigorous workplace assessment.
In our view, the core components of that assessment are: a requirement to establish a process for undertaking impact assessments; a requirement to assess significant impacts on work and employees; a requirement to involve those affected, including employees, workers and official representatives; a requirement to take appropriate steps in response, or, in other words, to mitigate and impose safeguards; and a requirement to disclose metrics, methods and mitigation taken.
In many ways, Amendment 104A is a continuation of the debates on the DMCC Bill on changing uses of technology in workplaces and the potential for workers to be disadvantaged by the decisions produced by software. Given the risks, we feel that there should be more protections in data legislation rather than fewer, and transparency and consultation are key.
We support Amendment 46 because it offers a further measure of protection to children’s rights. We believe that, in this area, we should retain the existing legislative framework from the 2018 Act, and we cannot see any case for weakening those protections. Amendment 48 largely echoes our Amendment 49. The amendment of the noble Lord, Lord Holmes, is in this group, although he is not here. To our way of looking at things, it seems eminently sensible. I look forward to the opportunity to listen to him speak to it at a later stage of the Bill. I beg to move.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, the amendments in this group highlight that Clause 14 lacks the necessary checks and balances to uphold equality legislation, individual rights and freedoms, data protection rights, access to services, fairness in the exercise of public functions and workers’ rights. I add my voice to that of the noble Lord, Lord Clement-Jones, in his attempt to make Clause 14 not stand part, which he will speak to in the next group.

I note, as the noble Lord, Lord Bassam, has, that all the current frameworks have fundamental rights at their heart, whether it is the White House blueprint, the UN Secretary-General’s advisory body on AI, with which I am currently involved, or the EU’s AI Act. I am concerned that the UK does not want to work within this consensus.

With that in mind, I particularly note the importance of Amendment 41. As the noble Lord said, we are all supposed to adhere to the Equality Act 2010. I support Amendments 48 and 49, which are virtually interchangeable in wanting to ensure that the standard of decisions being “solely” based on automated decision-making cannot be gamed by adding a trivial human element to avoid that designation.

Again, I suggest that the Government cannot have it both ways—with nothing diminished but everything liberated and changed—so I find myself in agreement with Amendment 52A and Amendment 59A, which is in the next group, from the noble Lord, Lord Holmes, who is not in his place. These seek clarity from the Information Commissioner.

I turn to my Amendment 46. My sole concern is to minimise the impact of Clause 14 on children’s safety, privacy and life chances. The amendment provides that a significant decision about a data subject must not be based solely on automated processing if

“the data subject is a child or may be a child unless the provider is satisfied that the decision is in, and compatible with, the best interests of a child”,

taking into account the full gamut of their rights and development stage. Children have enhanced rights under the UNCRC, to which the UK is a signatory. Due to their evolving capacities as they make the journey from infancy to adulthood, they need special protections. If their rights are diminished in the digital world, their rights are diminished full stop. Algorithms determine almost every aspect of a child’s digital experience, from the videos they watch to their social network and from the sums they are asked to do in their maths homework to the team they are assigned when gaming. We have seen young boys wrongly profiled as criminal and girls wrongly associated with gangs.

In a later group, I will speak to a proposal for a code of practice on children and AI, which would codify standards and expectations for the use of AI in all aspects of children’s lives, but for now, I hope the Minister will see that, without these amendments to automated decision-making, children’s data protection will be clearly weakened. I hope he will agree to act to make true his earlier assertion that nothing in the Bill will undermine child protection. The Minister is the Minister for AI. He knows the impact this will have. I understand that, right now, he will probably stick to the brief, but I ask him to go away, consider this from the perspective of children and parents, and ask, “Is it okay for children’s life chances to be automated in this fashion?”

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I will speak to my Amendment 48. By some quirk of fate, I failed to sign up to the amendments that the noble Lord, Lord Bassam, so cogently introduced. I would have signed up if I had realised that I had not, so to speak.

It is a pleasure to follow the noble Baroness, Lady Kidron. She has a track record of being extremely persuasive, so I hope the Minister pays heed in what happens between Committee and Report. I very much hope that there will be some room for manoeuvre and that there is not just permanent push-back, with the Minister saying that everything is about clarifying and us saying that everything is about dilution. There comes a point when we have to find some accommodation on some of these areas.

Amendments 48 and 49 are very similar—I was going to say, “Great minds think alike”, but I am not sure that my brain feels like much of a great mind at the moment. “Partly” or “predominantly” rather than “solely”, if you look at it the other way round, is really the crux of what I think many of us are concerned about. It is easy to avoid the terms of Article 22 just by slipping in some sort of token human involvement. Defining “meaningful” is so difficult in these circumstances. I am concerned that we are opening the door to something that could be avoided. Even then, the terms of the new clause—we will have a clause stand part debate on Wednesday, obviously—put all the onus on the data subject, whereas that was not the case previously under Article 22. The Minister has not really explained why that change has been made.

I conclude by saying that I very much support Amendment 41. This whole suite of amendments is well drafted. The point about the Equality Act is extremely well made. The noble Lord, Lord Holmes, also has a very good amendment here. It seems to me that involving the ICO right in the middle of this will be absolutely crucial—and we are back to public trust again. If nothing else, I would like explicitly to include that under Clause 14 in relation to Article 22 by the time this Bill goes through.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank noble Lords and the noble Baroness for their further detailed consideration of Clause 14.

Let me take first the amendments that deal with restrictions on and safeguards for ADM and the degree of ADM involved. Amendment 41 aims to make clear that solely automated decisions that contravene any part of the Equality Act 2010 are prohibited. We feel that this amendment is unnecessary for two reasons. First, this is already the case under the Equality Act, which is reinforced by the lawfulness principle under the present data protection framework, meaning that controllers are already required to adhere to the Equality Act 2010. Secondly, explicitly stating in the legislation that contravening one type of legislation—in this case, the Equality Act 2010—is prohibited, without referring to other legislation whose contravention is equally prohibited, will lead to an inconsistent approach. As such, we do not believe that this amendment is necessary; I ask the noble Baroness, Lady Jones, to withdraw it.

Amendment 44 seeks to limit the conditions for special category data processing for this type of automated decision-making. Again, we feel that this is not needed, given that a set of conditions already provides enhanced levels of protection for the processing of special category data, as set out in Article 9 of the UK GDPR. In order to lawfully process special category data, you must identify both a lawful basis under Article 6 of the UK GDPR and a separate condition for processing under Article 9. Furthermore, where an organisation seeks to process special category data under solely automated decision-making on the basis that it is necessary for the performance of a contract, in addition to the Article 6 and Article 9 lawful bases they would also have to demonstrate that the processing was necessary for reasons of substantial public interest.

Similarly, Amendment 45 seeks to apply safeguards when processing special category data; however, these are not needed, as the safeguards in new Article 22C already apply to all forms of processing, including the processing of special category data, and provide sufficient protection for data subjects’ rights, freedoms and legitimate interests. As such, we do not believe that these amendments are necessary; I ask the noble Baroness, Lady Jones, not to press them.

19:30
Amendment 46, in the name of the noble Baroness, Lady Kidron, intends to prevent solely automated decisions that have significant effects on children unless they are in a child’s best interest. I absolutely recognise the intent behind this amendment; indeed, the Government agree with the noble Baroness that all organisations must take great care when making solely automated decisions about the use of children’s data.
The Bill already includes a range of safeguards relating to solely automated decision-making that would protect children and adults alike, including ensuring that children and their parents are provided with information related to significant decisions that have been taken about them through solely automated means and given the opportunity to make representations and seek human review of those decisions. Where the processing involves children, organisations will need to provide the information in a clear, age-appropriate manner to ensure that they comply with their transparency obligations. The Government do not want solely ADM to be used when it negatively impacts children, nor do they believe that it should be; this is in line with lawfulness, fairness and the other data protection principles.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

Can the Minister give me an indication of the level at which that kicks in? For example, say there is a child in a classroom and a decision has been made about their ability in a particular subject. Is it automatic that the parent and the child get some sort of read-out on that? I would be curious to know where the Government feel that possibility starts.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

In that example, where a child was subject to a solely ADM decision, the school would be required to inform the child of the decision and the reasons behind it. The child and their parent would have the right to seek a human review of the decision.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

We may come on to this when we get to edtech but a lot of those decisions are happening automatically right now, without any kind of review. I am curious as to why it is on the school whereas the person actually doing the processing may well be a technology company.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

It may be either the controller or the processor, but for any legal or similarly significant decision there is already such a requirement right now—today—before the Bill comes into effect. That requirement is retained by the Bill.

In line with ICO guidance, children need particular protection when organisations collect and process their personal data because they may be less aware of the risks involved. If organisations process children’s personal data they should think about the need to protect them from the outset and should design their systems and processes with this in mind. This is the case for organisations processing children’s data during solely automated decision-making, just as it is for all processing of children’s data.

Building on this, the Government’s view is that automated decision-making has an important role to play in protecting children online, for example with online content moderation. The current provisions in the Bill will help online service providers understand how they can use these technologies and strike the right balance between enabling the best use of automated decision-making technology and continuing to protect the rights of data subjects, including children. As such, we do not believe that the amendment is necessary; I ask the noble Baroness if she would be willing not to press it.

Amendments 48 and 49 seek to extend the Article 22 provisions to “predominantly” and “partly” automated decision-making. These types of processing already involve meaningful human involvement. In such instances, other data protection requirements, including transparency and fairness, continue to apply and offer relevant protections. As such, we do not believe that these amendments are necessary; I ask the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, if they would be willing not to press them.

Amendment 50 seeks to ensure that the Article 22C safeguards will apply alongside, rather than instead of, the transparency obligations in the UK GDPR. I assure the noble Baroness, Lady Jones, that the general transparency obligations in Articles 12 to 15 will continue to apply and thus will operate alongside the safeguards in the reformed Article 22. As such, we do not believe that this amendment is necessary; I ask the noble Baroness if she would be willing not to press it.

The changes proposed by Amendment 52A are unnecessary as Clause 50 already provides for an overarching requirement for the Secretary of State to consult the ICO and other persons that the Secretary of State considers appropriate before making regulations under the UK GDPR, including for the measures within Article 22. Also, any changes to the regulations are subject to the affirmative procedure so must be approved by both Houses of Parliament. As with other provisions of the Bill, the ICO will seek to provide organisations with timely guidance and support to assist them in interpreting and applying the legislation. As such, we do not believe that this amendment is necessary and, if he were here, I would ask my noble friend Lord Holmes if he would be willing not to press it.

Amendments 98A and 104A are related to workplace rights. Existing data protection legislation and our proposed reforms provide sufficient safeguards for automated decision-making where personal data is being processed, including in workplaces. The UK’s human rights law, and existing employment and equality laws, also ensure that employees are informed and consulted about any workplace developments, which means that surveillance of employees is regulated. As such, we do not believe that these amendments are necessary and I ask the noble Baroness not to move them.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I hear what the Minister said about the workplace algorithmic assessment. However, if the Government believe it is right to have something like an algorithmic recording standard in the public sector, why is it not appropriate to have something equivalent in the private sector?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I would not say it is not right, but if we want to make the ATRS a standard, we should make it a standard in the public sector first and then allow it to be adopted as a means for all private organisations using ADM and AI to meet the transparency principles that they are required to adopt.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

So would the Minister not be averse to it? It is merely that the public sector is ahead of the game, allowing it to show the way, and then there may be a little bit of regulation for the private sector.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I am not philosophically averse to such regulation. As to implementing it in the immediate future, however, I have my doubts about that possibility.

Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

My Lords, this has been an interesting and challenging session. I hope that we have given the Minister and his team plenty to think about—I am sure we have. A lot of questions remain unanswered, and although the Committee Room is not full this afternoon, I am sure that colleagues reading the debate will be studying the responses that we have received very carefully.

I am grateful to the noble Baroness, Lady Kidron, for her persuasive support. I am also grateful to the noble Lord, Lord Clement-Jones, for his support for our amendments. It is a shame the noble Lord, Lord Holmes, was not here this afternoon, but I am sure we will hear persuasively from him on his amendment later in Committee.

The Minister is to be congratulated on his consistency. I think I heard the phrase “not needed” or “not necessary” pretty constantly this afternoon, but particularly with this group of amendments. He probably topped the lot with his response on the Equality Act on Amendment 41.

I want to go away with my colleagues to study the responses to the amendments very carefully. That being said, however, I am happy to withdraw Amendment 41 at this stage.

Amendment 41 withdrawn.
Amendment 42
Moved by
42: Clause 14, page 26, line 22, leave out from “on” to “may” in line 23 and insert “processing described in Article 9(1) (processing of special categories of personal data)”
Member's explanatory statement
This technical amendment adjusts the wording of new Article 22B(1) of the UK GDPR to reflect the terms of Article 9(1).
Amendment 42 agreed.
Amendments 43 to 52 not moved.
Committee adjourned at 7.40 pm.