(1 week, 3 days ago)
Lords Chamber
My Lords, I declare an interest as the chair of the Authors’ Licensing and Collecting Society. We should all be grateful to the noble Lord, Lord Berkeley, for the very gracious way he introduced his amendment, particularly given the history of this inter-House discussion.
Whether it is betrayal, disrespect, negligence, bloody-mindedness, a bad dream or tone-deafness, whatever the reality, we find ourselves once again in this Chamber debating an issue that should have been settled long ago. I share the profound anger and frustration expressed by the noble Baroness, Lady Kidron, and admire her unwavering determination, even if she, for very honourable reasons, will not be voting today. As she pointed out, the Prime Minister, who entertained the tech industry at Chequers and Downing Street, is complicit in the situation we are in today.
We are here today because the Government have point-blank refused to move, repeatedly presenting the same proposition on three occasions while this House, by contrast, has put forward a series of genuine solutions in an attempt to find a way forward, as the noble Lord, Lord Forsyth, pointed out. The only new element seems to be a promise of a cross-party parliamentary working party, but what is so enticing about merely more talking when action is desperately needed?
Amendment 49U, tabled by the noble Lord, Lord Berkeley, and designed to amend the Copyright, Designs and Patents Act 1988, is a reasoned compromise. It requires identifying the copyrighted works and the means by which they were accessed, unless the developer has obtained a licence. That seems to be a fair trade-off. The noble Lord also pointed out that Minister Bryant has rather inadvertently made it clear that today’s amendment does not invoke financial privilege on this occasion. The Government argue that legislating piecemeal would be problematic, but the historical precedent of the Napster clause in the Digital Economy Act 2010 demonstrates that Parliament can and should take powers to act when a sector is facing an existential threat. There is an exact parallel with where we are today.
This is not about picking a side between AI and creativity, as we have heard across the House today. It is about ensuring that both can thrive through fair collaboration based on consent and compensation. We must ensure that the incentive remains for the next generation of creators and innovators. Given how Ministers have behaved in the face of the strength of feeling of the creative industries, how can anyone in those industries trust this Government and these Ministers ever again? Will they trust their instincts to appease big tech? I suspect not. I do not regard the noble Baroness, Lady Jones, as personally liable in this respect, but I hope she feels ashamed of her colleagues in the Commons, of the behaviour of her department and of her Government. In this House we will not forget.
There is still time for the Government to listen, to act and to secure a future where human creativity is not plundered but valued and protected. If the noble Lord, Lord Berkeley, chooses to put this to a vote, on these Benches we will support him to the hilt. I urge all noble Lords from all Benches, if he does put it to a vote, to support the UK creative industries once again.
My Lords, as everybody has said, it is deeply disappointing that we once again find ourselves in this position. The noble Baroness, Lady Kidron, has brought the concerns of copyright owners to the attention of the Government time and again. Throughout the progress of the Bill, the Government have declined to respond to the substance of those concerns and to engage with them properly. As I said in the previous round of ping-pong—I am starting to lose count—the uncertainty of the continued delay to this Bill is hurting all sides. Even businesses that are in industries far removed from concerns about AI and copyright are waiting for the data Bill. It has been delayed because of the Government’s frankly stubborn mismanagement of the Bill.
I understand completely why the noble Lord, Lord Berkeley of Knighton, feels sufficiently strongly about how the Government have acted to move his very inventive amendment. It strikes at the heart of how this Government should be treating your Lordships’ House. If Ministers hope to get their business through your Lordships’ House in good order, they will rely on this House trusting them and collaborating with them. I know that these decisions are often made by the Secretary of State. I have the highest respect for the Minister, but this is a situation of the Government’s making. I note in passing that it was very disappointing to read that the Government’s planned AI Bill will now be delayed by at least a year.
All that said, as the Official Opposition we have maintained our position, as ping-pong has progressed, that protracted rounds of disagreement between the other place and your Lordships’ House should be avoided. This situation could have been avoided if the Government had acted in good faith and sought compromise.
My Lords, I thank noble Lords for their contributions. I repeat again our absolute commitment to the creative sector and our intention to work with it to help it flourish and grow. This is London Tech Week. All Ministers, including me and my colleagues, have been involved in that, showcasing the UK’s rising tech talent to the world. I do not feel I should apologise for our involvement with the tech sector in that regard.
(2 weeks, 3 days ago)
Lords Chamber
My Lords, I once again declare an interest as chair of the Authors’ Licensing and Collecting Society, and once again give the staunch support of these Benches to the noble Baroness, Lady Kidron, on her Motion A1. She made an incontestable case once again with her clarion call.
I follow the noble Lord, Lord Russell, and others in saying that we are not in new territory. I have a treasured cartoon on my wall at home that relates to the passage of the Health and Social Care Bill as long ago as 2001, showing Secretary of State Alan Milburn recoiling from ping-pong balls. Guess who was hurling the ping-pong balls? The noble Earl, Lord Howe, that notable revolutionary, and I were engaging in rounds of parliamentary ping-pong—three, I think. Eventually, compromises were reached and the Bill received Royal Assent in April 2001.
What we have done today and what we are going to do today as a House is not unprecedented. There is strong precedent for all Benches to work together on ping-pong to rather good effect. As the noble Baroness, Lady Kidron, says, what we are proposing today will not, in the words of the Minister, “collapse” the Bill: it will be the Government’s choice what to do when the Bill goes back to the Commons. I hugely respect the noble Lord, Lord Knight, but I am afraid that he is wrong. It was not a manifesto commitment; there is no Salisbury convention that can be invoked on this occasion. It has nothing at all to do with data adequacy except that the Government feel that they have to get the Bill through in order to get the EU Commission to start its work. If anything, the Bill makes data adequacy more difficult. I say to the noble Lord, Lord Brennan, that I agree with almost everything he said: everything he said was an argument for the noble Baroness’s amendment. Once again, as ever, I agree with the noble Lord, Lord Stevenson, as I so often do on these occasions. I regard him as the voice of reason, and I very much hope that the Government will listen to what he has to say.
Compromise is entirely within the gift of the Government. The Secretary of State should take a leaf out of Alan Milburn’s book. He did compromise on an important Bill in key areas and saw his Bill go through. I am afraid to say that the letter that Peers have received from the Minister is simply a repeat of her speech on Monday, which was echoed by Minister Bryant in the Commons yesterday. The Government have tabled these new amendments, which reflect the contents of that letter. Despite those amendments, however, the Government have not offered a concession to legislate for mandated transparency provisions within the Bill, which has been the core demand of the Lords amendments championed by the noble Baroness, Lady Kidron, for the reasons set out in the speeches we have heard today.
In the view of these Benches, the noble Baroness, Lady Kidron, other Members of this House, and countless creatives have made the absolutely convincing case for a transparency duty which would not prejudge the outcome of the AI and copyright consultation. We have heard the chilling points made by the noble Lords, Lord Russell and Lord Pannick, about US policy in this area and about the attitude of the big tech companies towards copyright. We are at a vital crossroads in how we ensure the future of our creative industries. In the face of the development of AI and how it is being trained, we must take the right road, and I urge the Government to settle now.
My Lords, given where we are, I will speak very briefly, but I will make just two points. First, I think it is worth saying that the uncertainty surrounding where we are with AI and copyright is itself damaging, not just to the creative sector, not just to AI labs and big tech in general, but to all those who will themselves be impacted by the Bill’s many other provisions. Overall, I think it is worth reminding ourselves that this is an important Bill whose original conception did not even address AI and copyright. It carried very important and valuable provisions—as the Minister pointed out in her opening remarks—on digital verification services, smart data schemes, the national underground asset register and others. These can genuinely drive national productivity. Indeed, that is why my party proposed them when we were in government. It is, therefore, deeply frustrating that the Government have not yet found a way forward on this, and I am afraid that I very much agree with the noble Lord, Lord Knight. The way the Government have gone about this has been reprehensible: I think that is the word I would use.
(2 weeks, 5 days ago)
Lords Chamber
My Lords, I declare an interest as chair of the Authors’ Licensing and Collecting Society. I offer the unequivocal and steadfast support from the Liberal Democrat Benches for Motion A1 in the name of the noble Baroness, Lady Kidron, which introduces Amendment 49F in lieu of Amendment 49D.
It is absolutely clear that the noble Baroness’s speeches become better and more convincing the more we go on. Indeed, the arguments being made today for these amendments become better and more convincing as time goes on. I believe we should stand firm, as the noble Lord, Lord Berkeley, said.
Time and time again, we all have had to address the narrative stated in the consultation paper and repeated by Ministers suggesting there is uncertainty or a lack of clarity in existing UK copyright law regarding AI training. We have heard that the Secretary of State has just recently acknowledged that the existing copyright law is “very certain”, but as I said to the noble Lord, Lord Liddle, he has also stated that
“it is not fit for purpose”.—[Official Report, Commons, 22/5/25; col. 1234.]
That makes the narrative even worse than saying that copyright law is uncertain.
As the noble Baroness, Lady Kidron, has rightly asserted, we do not need to change copyright law. It is the view of many that existing law is clear and applies to the commercial use of copyrighted works for AI training. The issue is not a deficient law but rather the ability to enforce it in the current AI landscape. As the noble Baroness has also profoundly put it—I have got a number of speeches to draw on, as you can see—what you cannot see, you cannot enforce. The core problem is a lack of transparency from AI developers: without knowing what copyrighted material has been used to train models and how it was accessed, creators and rights holders are unable to identify potential infringements and pursue appropriate licensing or legal action.
In striking down previous Lords amendments, the Government have suggested that this House was at fault for using the wrong Bill. They have repeatedly claimed that it is too soon for transparency and too late to prevent stealing, and they have asserted that accepting the Lords transparency amendment would prioritise one sector over another. But that is exactly what the Government are doing. They have suggested an expert working group, an economic impact assessment, a report on the use of copyright, and then, I think, a report on progress in what the noble Baroness the Minister had to say. But, as many noble Lords have said today, none of that gives us the legislative assurance —the certainty, as the noble Lord, Lord Brennan, put it—that we need in these circumstances.
The Government have objected to being asked to introduce regulations because of financial privilege, and now, it seems—I can anticipate what the noble Baroness the Minister is going to say—are objecting to the requirement to bring forward a draft Bill with this amendment. But the Government are perfectly at liberty to bring forward their own amendment allowing for transparency via regulations, a much more expeditious and effective route that the House has already overwhelmingly supported. Transparency is the necessary foundation for a functioning licensing market, promotes trust between the AI sector and the creative industries, and allows creators to be fairly compensated when their work contributes value to AI models.
The Government have asked for a degree of trust for their plans. This amendment, while perhaps less than creators deserve—I think the noble Baroness, Lady Kidron, described it as the bare minimum—is a step that would help earn that trust. It is this Government who can do that, and I urge them to heed the words of their own Back-Benchers: the noble Lords, Lord Cashman, Lord Rooker and Lord Brennan, all asked the Government to find a compromise.
I urge all noble Lords, in the face of a lack of compromise by the Government, to support Motion A1.
My Lords, as this is the third round of ping-pong, as many noble Lords have observed, I will speak very briefly. If the noble Baroness the Minister has not by now understood how strongly noble Lords on all sides of the House feel about this issue, it may be too late anyway.
The noble Baroness, Lady Kidron, has made an increasingly powerful case for the Government to act in defence of the rights of copyright owners, and we continue to call on the Government to listen. We have of course discussed this at great length. The noble Baroness has tabled a new Motion which would require Ministers to make a Statement and bring forward a draft Bill. Given that the Minister has expressed her sympathy for the concerns of your Lordships’ House previously, surely this new Motion would be acceptable to the Government as a pathway toward resolving the problem, and we again urge the Government to accept it.
However, whatever choice the Government make—I do not think anyone could claim that any part of this is an easy problem, as my noble friend Lord Vaizey pointed out—many of us are frustrated by the absence of agility, boldness and imagination in their approach. That said, speaking at least from the Front Bench of a responsible Opposition, we take the view that we cannot engage further in protracted ping-pong. We are a revising Chamber, and, although it is right to ask the Government to think again when we believe they have got it wrong, we feel we must ultimately respect the will of the elected Chamber.
My Lords, I must once again thank all noble Lords who have spoken during this debate, and of course I continue to recognise the passion and the depth of feeling on this issue.
I did not think I needed to reiterate this, but we absolutely believe in the importance of the creative sector, and of course we want it to have a flourishing future. In previous debates, I have spelled out all the work that we are doing with the creative sector and how fundamental it is to our economic planning going forward. I do not intend to go over that, but I have said it time and again from this Dispatch Box. Our intention is to find a substantial and workable solution to this challenge that we are all facing.
I also reassure the noble Lord, Lord Forsyth, and others that we have had numerous discussions with the noble Baroness, Lady Kidron, and others and have of course taken those discussions seriously. As a result, we have come today with an honest and committed plan to work together to resolve the contentious issue of AI and copyright both quickly and effectively.
(1 month ago)
Lords Chamber
The noble Viscount will know that schools already have a policy, or are expected by the Department for Education to have one, to ensure that children do not have access to phones in schools. That is a clear policy that the Government are keen to reiterate. What we are talking about here is what children do outside the school environment. From July, the children’s code of practice will provide much greater reassurance and protection for children. Services will be expected to provide age-appropriate experiences online by protecting children from bullying, violent content, abuse and misogynistic content. In other words, there will be much more forceful regulation to specifically protect children. Obviously, we will continue to monitor the codes of practice, but there are specific new powers under the code that come into effect in July and we want to see their impact.
My Lords, I very much hope the Government are actively tracking and measuring the effects of schools’ own policies on mobile phone use during the school day. If so, what conclusions can be drawn about the wisdom of an outright ban? If they are not tracking that information, why not?
My Lords, as I said, the Department for Education’s mobile phones in schools guidance is clear that schools should prohibit the use of devices with smart technology throughout the school day, including during lessons, transitions and breaks. The Government expect all schools to take steps in line with that. Beyond that, my own department, DSIT, has commissioned a piece of research to look at young people’s use of social media and their access to it throughout the day. The outcome of the research is due very soon and we will learn the lessons from that. Up until now, the evidence has not been as clear-cut as we would like. We hope to learn on an international basis how to protect young people throughout the day, and will apply those lessons once the evidence has been assessed.
(1 month ago)
Lords Chamber
My Lords, I thank the Minister for her introduction. In view of the remarks made a week ago by the Minister, the noble Lord, Lord Vallance, who referred to government datasets from the past 15 years which mixed up sex and gender as “accurate”—or perhaps “sort of accurate”, because the exchange in the report varied slightly—do the Government defend the accuracy of those datasets, even though they were, and continue to be, muddled because no one knew what “sex” meant? Are we expected to rely on the accuracy of data which mixed up sex and gender—that is, male and female—or do the Government mean that we cannot defend those data because they were only sort of accurate? I am not entirely clear what the Government are telling us about relying on historic data.
I am also concerned about what insight this gives into what the Government intend to regard as accurate from now on. I continue to think that the Government are on quite a sticky wicket in regard to data accuracy on sex and gender and their refusal to enshrine true sex accuracy in this Bill. We continue to have a bit of a fudge, which shakes confidence in their intentions. This is a huge missed opportunity, but I realise we are not having a further vote.
I shall ask just one question. Clause 29 allows for the Secretary of State to publish supplementary codes for DVS providers. Will the Government commit to publishing a supplementary code to ensure that DVS providers understand how to verify sex accurately and avoid what has been described by the Government Benches as the “muddle” of the last 15 years?
My Lords, I thank all noble Lords who have contributed to this important debate. I will first speak to the issues around accurate recording of sex data before coming on to talk about scientific research.
Throughout the passage of the Bill, we have been clear that digital verification services will be a significant driver of data reliability and productivity. They are absolutely dependent on accurate recording and rigorous management of data. We supported my noble friend Lord Lucas in his original amendments on Report, and we tabled our own amendments from the Front Bench for Lords consideration of Commons amendments last week.
I am grateful to the Minister for her engagement on this issue, and I know she has taken our concerns seriously. That said, we remain concerned about the accurate recording and management of sex data, especially in light of the recent judgment of the Supreme Court. The Government must continue to remain vigilant and to take steps to ensure datasets held by the Government and arm’s-length bodies are, and continue to be, accurate.
My Lords, I declare an interest as chair of the Authors’ Licensing and Collecting Society.
I express the extremely strong support of all on these Benches for Motion C1, proposed by the noble Baroness, Lady Kidron. I agree with every speech that we have heard so far in today’s debate—I did not hear a single dissenting voice to the noble Baroness’s Motion. Once again, I pay tribute to her; she has fought a tireless campaign for the cause of creators and the creative industries throughout the passage of the Bill.
I will be extremely brief, given that we want to move to a vote as soon as possible. The House has already sent a clear message by supporting previous amendments put forward by the noble Baroness, and I hope that the House will be as decisive today. As we have heard this afternoon, transparency is crucial. This would enable the dynamic licensing market that is needed, as we have also heard. How AI is developed and who it benefits are two of the most important questions of our time—and the Government must get the answer right. As so many noble Lords have said, the Government must listen and must think again.
My Lords, it is probably redundant to pay tribute to the noble Baroness, Lady Kidron, for her tenacity and determination to get to a workable solution on this, because it speaks for itself. It has been equally compelling to hear such strong arguments from all sides of the House and all Benches—including the Government Benches—that we need to find a solution to this complex but critical issue.
Noble Lords will recall that, on these Benches, we have consistently argued for a pragmatic, technology-based solution to this complex problem, having made the case for digital watermarking both in Committee and on Report. When we considered the Commons amendments last week, we worked closely with the noble Baroness, Lady Kidron, to find a wording for her amendment which we could support, and were pleased to be able to do so and to vote with her.
It is important that the Government listen and take action to protect the rights of creatives in the UK. We will not stop making the case for our flourishing and important creative sector. We have put that case to Ministers, both in your Lordships’ House and at meetings throughout the passage of the Bill. As a responsible Opposition, though, it is our view that we must be careful about our approach to amendments made by the elected House. We have, I hope, made a clear case to the Government here in your Lordships’ House and the Government have, I deeply regret to say, intransigently refused to act. I am afraid that they will regret their failure to take this opportunity to protect our creative industries. Sadly, there comes a point where we have to accept that His Majesty’s Government must be carried on and the Government will get their Bill.
Before concluding, I make two final pleas to the Minister. First, as others have asked, can she listen with great care to the many artists, musicians, news organisations, publishers and performers who have called on the Government to help them more to protect their intellectual property?
Secondly, can she find ways to create regulatory clarity faster? The process that the Government envisage to resolve this issue is long—too long. Actors on all sides of the debate will be challenged by such a long period of uncertainty. I understand that the Minister is working at pace to find a solution, but not necessarily with agility. I echo the brilliant point made by my noble friend Lady Harding that agility and delivering parts of the solution are so important to pick up the pace of this, because perfect is the enemy of good in this instance. When she gets up to speak, I hope that the Minister will tell us more about the timeline that she envisages, particularly for the collaboration of DSIT and DCMS.
This is a serious problem. It continues to grow and is not going away. Ministers must grip it with urgency and agility.
My Lords, once again, I acknowledge the passion and depth of feeling from those noble Lords who have spoken and, again, I emphasise that we are all on the same side here. We all want to see a way forward that protects our creative industries, while supporting everyone in the UK to develop and benefit from AI.
Of course, we have listened, and are continuing to listen, to the views that have been expressed. We are still going through the 11,500 responses to our consultation, and I have to tell noble Lords that people have proposed some incredibly creative solutions to this problem, which also have a right to be heard.
This is not about Silicon Valley; it is about finding a solution for the UK creative and AI tech sectors that protects both. I am pleased that the noble Baroness, Lady Kidron, now endorses the Government’s reports as the right way to identify the right solutions; however, I will address some of her other points directly.
First, she talked about her amendment providing certainty to the creative industries. I can provide that certainty now, as Minister Bryant did in the other place last week. Copyright law in the UK is unchanged by this Bill. Works are protected unless one of the exemptions, which have existed for some time, such as those for teaching and research, applies, or the rights holders have granted permission for their work to be used. That is the law now and it will be the law tomorrow.
I also want to reassure my noble friend Lord Cashman and the noble Baroness, Lady Benjamin, who talked about us stripping away rights today. I want to be clear that the Government have proposed no legislation on this issue; the Bill does no such thing. The amendment from the noble Baroness, Lady Kidron, would provide no certainty other than that of more uncertainty—of continuous regulations, stacked one upon another in a pile of instruments. This cannot be what anyone desires, and it is why the Government do not agree to it.
The noble Baronesses, Lady Kidron and Lady Harding, suggested that her amendment, requiring regulations on only one issue ahead of all others and via a different process, would somehow leave Parliament free to consider all the other issues independently. I am afraid that this is not the case; this is a policy decision with many moving parts. Jumping the gun on one issue will hamstring us in reaching the best outcome on all the others, especially because, as I said earlier, this is a global issue, and we cannot ring-fence the UK from the rest of the world.
We reject the suggestion that we are being complacent on this. I say to my noble friend Lord Brennan that I of course agree that the UK should be a global leader, but we need to make sure that we have the right approach before we plant our flag on that. There is a reason that no other territory has cracked this either. The EU, for example, is still struggling to find a workable solution. It is not easy, but we are working quickly.
The noble Baroness once again raised enforcement, and she has left the mechanism to the discretion of the Government in her new amendment. While we are pleased that the noble Baroness has changed her approach on enforcement in light of the Commons reasons, we all agree that for new transparency requirements to work, enforcement mechanisms will be needed and must be effective.
The noble Baroness said she has tried everything to persuade the Government, and I would have welcomed a further meeting with her to discuss this and other aspects of her revised proposals. Unfortunately, however, that invitation was not accepted. To reiterate, in spite of all our different positions on this Bill, we are all working towards the same goal.
Following proper consideration of consultation responses and publication of our technical reports, we will bring forward comprehensive and workable proposals that will give certainty to all sides. If the House has strong views when the proposals come forward, there will of course be the opportunity for us to debate them. We have made it clear that our reports will be delivered within 12 months, and earlier if we can. I remind noble Lords that the amendments in the name of the noble Baroness, Lady Kidron, will not take effect for 18 months. There is not the instant solution that many noble Lords want to hear today; neither the noble Baroness’s amendment nor ours offers one. It will take time, and we have to recognise that.
We do not believe, in the meantime, that protracted ping-pong on this one remaining issue in the Bill is in anyone’s interest. The elected House has spoken twice and through legislative and non-legislative commitments, the Government have shown they are committed to regulating quickly and effectively. Therefore, I hope the noble Baroness and your Lordships’ House will accept these assurances and continue working with the Government to make progress on this important issue.
A lot has been said in this debate about the importance of transparency. To my noble friend Lord Brennan, I say that the Government have said from the very beginning that we will prioritise the issue of transparency in all the work we do. Transparency is essential to licensing; licensing is essential to the question of remuneration; and remuneration is essential to AI being high quality, effective and able to be deployed in the UK. These are the challenges we are facing, but all these things have to be addressed in the round and together, not in a piecemeal fashion. However, noble Lords are absolutely right to say that, without transparency, it is, of course, worth nothing.
On enforcement, the Government are sympathetic to the argument that it is a different matter for individuals to enforce their rights via the courts than it is for large creative agencies. This is the kind of thing that the working groups I have mentioned will explore. As Minister Bryant said last week, we want to make the new regime effective for everybody, large and small.
I will finish with some things I am sure we can all agree on: the urgency of the problem; the need to be evidence-based; that solutions will require collaboration between the creative and the AI sectors; and the solutions must work for everyone. I assure the noble Baroness, Lady Kidron, that everybody will have a seat at the table in the discussions. I hope noble Lords will agree with me and truly support the innovators and creators in the UK by voting with the Government on this Motion, which will deliver a full, comprehensive package that will make a difference to the creative sector for years to come in this country.
(1 month, 1 week ago)
Lords Chamber
My Lords, I will speak to some of the amendments made in the other place, starting with Amendments 1 to 31. These will ensure that smart data schemes can function optimally and that Part 1 is as clear as possible. Similarly, Amendments 35 to 42 from the other place reflect discussions on the national underground asset register with the devolved Governments. Finally, Amendments 70 to 79 make necessary consequential updates to the final provisions of the Bill and some updates to Schedules 11 and 15.
I will now speak to the amendments tabled by noble Lords, starting with those relating to sex data. Motion 32A disagrees with the amendment to remove Clause 28(3) and (4), and instead proposes changes to the initial drafting of those subsections. These would require the Secretary of State, when preparing the trust framework, to assess whether the 15 specified public authorities can reliably ascertain the data they collect, record and share. Amendment 32B limits this assessment to sex data, as defined through Amendment 32C; that definition limits sex to biological sex only and provides a definition of acquired gender.
It is also relevant to speak now to Motion 52A, which disagrees with the amendment to remove Clause 140 and, instead, suggests changes to the drafting. Clause 140, as amended by Amendment 52B, seeks, through a regulation-making power, to give the Secretary of State the ability to define sex as being only biological sex in certain areas or across public sector data processing more widely. Let me be clear that this Government accept the recent Supreme Court judgment on the definition of sex for the purposes of equality legislation. We need to work through the effects of this ruling holistically and with care, sensitivity and—dare I say it—kindness. In line with the law, we need to take care not to inappropriately extend its reach. This is not best done by giving the Secretary of State the power to define sex as biological in all cases through secondary legislation without appropriate scrutiny, given the potential impact on people’s human rights, privacy and dignity, and the potential to create legal uncertainty. Likewise, giving the Secretary of State a role in reviewing how other public authorities process sex data in all circumstances based on that definition would be inappropriate and disproportionate, and I note that the Supreme Court’s ruling relates specifically to the meaning of sex in equalities legislation.
The driver behind these amendments has been the importance of sex data being accurate when processed by public authorities. I strongly agree with that aim: accurate data is essential. This Government take data accuracy—including the existing legislation that requires personal data to be accurate—and data standards seriously. That is why we are addressing the question of sex information in public sector data. First, the EHRC is updating its statutory code of practice to support service providers in light of the Supreme Court judgment. Secondly, the Data Standards Authority is developing data standards on the monitoring of diversity information, including sex and gender data, and the effect of the Supreme Court judgment will be considered as part of that work.
Thirdly, the Office for Statistics Regulation published updated guidance on collecting and reporting data and statistics about sex and gender identity data last year. Fourthly, the Office for National Statistics published a work plan in December 2024 for developing harmonised standards on data more generally. Finally, the department is currently considering the implementation of the Sullivan review, published this year, which I welcome.
On digital verification services, I reassure noble Lords that these measures do not change the evidence that individuals rely on to prove things about themselves. The measures simply enable that to be done digitally. This Government are clear that data must be accurate for the purpose for which it is being used and must not be misleading. It should be clear to digital verification services what the information public authorities are sharing with them means. I will give an important example. If an organisation needs to know a person’s biological sex, this Government are clear that a check cannot be made against passport data, as it does not capture biological sex. DVS could only verify biological sex using data that records that attribute specifically, not data that records sex or gender more widely.
I know this is a concern of the noble Lord, Lord Arbuthnot, and I hope this provides some reassurance. The data accuracy principle of GDPR is part of existing law. That includes where data is misleading—this is a point I will return to. I hope that noble Lords find this commitment reassuring and, as such, will agree with Commons Amendment 32.
Motion 34A and Amendments 34B and 34C address the security of the national underground asset register. Security has always been at the heart of the national underground asset register. We have therefore listened to the well-thought-through concerns that prompted the amendment previously tabled by the noble Viscount, Lord Camrose, regarding cybersecurity. Following consideration, the Government are instead proposing an amendment we have drafted with the support of colleagues in the security services. We believe this addresses the intention of ensuring the security of the national underground asset register data, with three key improvements.
First, it broadens the scope from cybersecurity only to the general security of information kept in or obtained from the national underground asset register. This will ensure that front-end users have guidance on a range of measures for security good practice—for example, personnel vetting, which should be considered for implementation—while avoiding the need to publish NUAR-specific cybersecurity features that should not be in the public domain. Secondly, it specifies the audience for this guidance; namely, users accessing NUAR. Finally, it broadens the scope of the amendment to include Northern Ireland alongside England and Wales, consistent with the NUAR measures overall. Clearly, it remains the case that access to NUAR data can be approved only for specified purposes and only for eligible users, with all access controlled and auditable. As such, I hope that noble Lords will be content to support government Motion 34A and Amendments 34B and 34C.
Commons Amendment 43, made in the other place, on scientific research removes the public interest test inserted in the definition of scientific research by the noble Viscount, Lord Colville. While recognising the concern the noble Viscount raises, I want to be clear that anything that does not count as scientific research now would not do so under the Bill. Indeed, we have tightened the requirement and added a reasonableness test. The Bill contains strong safeguards. Adding precise definitions in the Bill would not strengthen these protections but would impose a significant new legal obligation on our research community at a time when, in line with the good work of the previous Government, we are trying to reduce bureaucracy for researchers, not increase it with new processes. The test proposed will lead to burgeoning bureaucracy and damage our world-leading research. This disproportionate step would chill basic and curiosity-driven research, and is not one we can support.
I beg to move that the House agree with the Commons in their Amendment 1. I have spoken to the other amendments.
My Lords, I first thank the Minister for his—as ever—clear and compelling remarks. I thank all noble Lords who have been working in a collegiate, collaborative fashion to find a way forward on the few but important remaining points of disagreement with the Government.
Before I come to the issue of accurate recording of personal data, I also thank the Minister, the noble Baroness, Lady Jones, for tabling the government amendments on the national underground asset register and her constructive engagement throughout the progress of the Bill.
As noble Lords will recall, I set out our case for stronger statutory measures to require the Secretary of State to provide guidance to relevant stakeholders on the cybersecurity measures that should be in place before they receive information from the national underground asset register. I am of course delighted that the Government have responded to the arguments that we and others made and have now tabled their own version of my amendment which would require the Secretary of State to provide guidance on the security of this data. We are happy to support them in that.
I turn to Motions 32A and 52A standing in my name, which seek to ensure that data is recorded accurately. They amend the original amendment, which my noble friends Lord Lucas and Lord Arbuthnot took through your Lordships’ House. My noble friend Lord Lucas is sadly unable to attend the House today, but I am delighted to bring these Motions forward from the Opposition Front Bench. In the other place, the Conservative Front Bench tabled new Clause 21, which would, we feel, have delivered a conclusive resolution to the problem. Sadly, the Government resisted that amendment, and we are now limited by the scope of the amendments of my noble friend Lord Lucas, so we were unable to retable the, in my view, excellent amendment in your Lordships’ House.
Moved by
32A: Leave out from “House” to end and insert “do disagree with the Commons in their Amendment 32, and do propose Amendments 32B and 32C to the words so restored to the Bill—
I thank the Minister for his very able summing up of his position, but I am afraid I cannot get past the question in my mind of how existing legacy data, even if it is managed by a DVS system going forward, will suddenly be of high quality when it is currently, as we know from the Sullivan report, in a muddle. As a result, for all his eloquence, I beg leave to test the opinion of the House.
My Lords, I thank the Minister for setting out the Government’s case so clearly. I will speak to my Amendment 46A, which seeks to improve the report that the Government brought forward in the other place. This issue is causing real concern for copyright owners and so many others in the creative industries. Let us remind ourselves that the creative industries contributed £124 billion in gross value added to the UK economy in 2023 and outperformed the UK economy between 2010 and 2023 in terms of growth. The Government are, wisely and rightly, prioritising growth over other concerns, and the creative industries will have to be an essential part of this—but only to the extent that they have a trusted and efficient marketplace for intellectual property.
Our amendment would improve the Government’s proposed report by adding consideration of extraterritorial use of creators’ copyright works by operators of web crawlers and AI systems, as well as consideration of establishing a digital watermark for the purposes of identifying licensed content. I very much take on board the Minister’s point that this must be international to work, but few countries, if any, would have better or greater convening power to initiate the process of creating such digital standards. I urge the Government to pursue that avenue.
I pay tribute to all noble Lords who have raised the issue of copyright during the passage of this Bill. I am sure that I will be joining many others in thanking the noble Baroness, Lady Kidron, who has led such a powerful and successful campaign on this issue. Throughout the passage of the Bill, we have recognised the serious concerns raised by the creative sector and, on Report, we tabled an amendment seeking to create a digital watermark to identify this content and to protect copyright owners. I am very pleased that the Government have taken the first step by amending the Bill in the other place to put a report in it. That being said, the report needs to go further. If the Government are unwilling to accept our changes, I will test the opinion of the House when my amendment is called.
I turn briefly to Motion 49A, in the name of the noble Baroness, Lady Kidron. I once again pay tribute to the work that she has done to make progress on this. While we had concerns about the drafting of her amendment on Report, I am very pleased that she has tabled her Amendment 49B today. With the additional parts of it targeted at supporting small businesses and micro-entities, we are delighted to support it. It is increasingly clear that the Government must do the right thing for our creative industries, and we are delighted to offer our support to Motion 49A. I intend to test the opinion of the House on Amendment 46A when it is called.
My Lords, I will speak to my Motion 49A and offer my support to Amendment 46A in the name of the noble Viscount, Lord Camrose. It is a sensible amendment and I hope that the Government find a way to accept it without challenge.
I start by rebutting three assertions that have been circling over the past few weeks. First, I reject the notion that those of us who have raised our voices against government plans are against technology. I quote the Secretary of State, Peter Kyle, who I am delighted to see is below the Bar this afternoon. He said to the FT that:
“Just as in every other time there is change in society, there will be some people who will either resist change or try to make change too difficult to deliver”.
Well, creative people are early adopters of technology. Their minds are curious and their practices innovative. In my former career as a film director, I watched the UK film industry transform from working on celluloid to being a world-leading centre of digital production. For the past five years at Oxford’s Institute for Ethics in AI, where I am an advisor, I have been delighted to watch the leaps and bounds of AI development. Those at the frontier of AI development are creative thinkers, and creative people are natural innovators. The Government’s attempt to divide us is wrong.
The transformational impact of technology is something that all the signatories of this weekend’s letter to the Prime Minister understand. Creators do not deny the creative and economic value of AI, but we do deny the assertion that we should have to build AI for free with our work and then rent it back from those who stole it. Ours is not an argument about progress but about value. The AI companies fiercely defend their own IP but deny the value of our work. Not everything new is progress, not everything that already exists is without value, but we, the creative industries, embody both change and tradition, and we reject the assertion that we are standing in the way of change. We are merely asserting our right to continue to exist and play our part in the UK’s future growth.
Secondly, there is no confusion about copyright law in relation to AI, nor does the phenomenal number of submissions to the consultation prove anything other than the widespread outrage of the creative industries that the Government sought to redefine theft rather than uphold their property rights. In our last debate, my noble and learned friend Lady Butler-Sloss made an unequivocal statement to that effect which has been widely supported by other legal opinion. The Government’s spokesman, who has greeted every press inquiry of the last few weeks by saying that the Government are consulting to sort out the confusion in copyright in relation to AI is, at best, misinformed. Let me be clear: the amendment would not change copyright. We do not need to change copyright law. We need transparency so that we can enforce copyright law, because what you cannot see you cannot enforce.
Thirdly, I rebut the idea that this is the wrong Bill and the wrong time. AI did not exist in the public realm until the early 2020s. The speed and scale at which copyright works are being stolen is eye-watering. Property that people have invested in, have created, have traded and that they rely on for their livelihood is being stolen at all parts of the value chain. It is an assault on the British economy, happening at scale to a sector worth £120 billion to the UK, an industry that is central to the industrial strategy and of enormous cultural import. It is happening now, and we have not even begun to catch up with the devastating consequences. The Government have taken our amendments out of the Bill and replaced them with a couple of toothless reports. Whatever these reports bring forward and whatever the consultation offers, we need the amendment in front of us today now. If this Bill does not protect copyright then, by the time that the Government work out their policy, there will be little to save.
The language of AI—scraping, training, data modules, LLMs—does not evoke the full picture of what is being done. AI corporations, many of which are seeking to entrench their existing information monopolies, are not stealing nameless data. They are stealing some of the UK’s most valuable cultural and economic assets—Harry Potter, the entire back catalogue of every music publisher in the UK, the voice of Hugh Grant, the design of an iconic handbag and the IP of our universities, great museums and library collections. Even the news is stolen in real time, all without payment, with economic benefits being taken offshore. It costs UK corporations and individuals their hard-earned wealth and the Treasury much needed revenue. It also denudes the opportunities of the next generation because, whether you are a corporation or an individual, if work is stolen at every turn, you cannot survive. The time is now, and this Bill is the vehicle.
Motion 49A replaces the previous package of Lords amendments. I pay tribute to the noble Lord, Lord Stevenson, who wishes he could be with us; the noble Lord, Lord Clement-Jones, and his colleagues, who have been uncompromising in their support; and my noble friend Lord Freyberg, who were all co-sponsors of the original amendment.
Amendment 49B would simply provide that a copyright holder be able to see who took their work, what was taken, when and why, allowing them a reasonable route to assert their moral right to determine whether they wish to have their work used, and if so, on what terms. It is a slimmer version of the previous package of amendments, but it covers the same ground and, importantly, it puts a timeline of 12 months on bringing forward these provisions and makes specific provision for SMEs and micro-entities and for UK-headquartered AI companies.
I thank the Minister for her full and detailed answer. Having heard the tone of the debate, I think it is clear that the focus and energy of the House are more on the amendment from the noble Baroness, Lady Kidron, but I am happy to take up the Minister’s offer of a further meeting.
52A: Leave out from “House” to end and insert “do disagree with the Commons in their Amendment 52, and do propose Amendments 52B and 52C to the words so restored to the Bill—
A little time has elapsed since the original debate, but I beg leave to test the opinion of the House.
(2 months, 3 weeks ago)
Grand Committee
My Lords, I thank the Minister for her introduction to this draft statutory instrument; it was brief and to the point. These penalties will be able to reach 10% of turnover or £100,000 per day for continuing breaches, so getting the calculations right is crucial. However, I have some concerns about the SI, the first of which is about timing.
I do not understand why we are looking at a three-year gap between the enabling powers and the calculation rules. The Telecommunications (Security) Act 2021, which I worked on, was presented to this House as urgent legislation to protect critical national infrastructure, yet here we are, in 2025, only now establishing how to calculate penalties for breaches in the way set out in this SI. During this period, we have had enforcement powers without the ability to properly determine penalties. As I understand it, tier 1 providers had to comply by March 2024, yet the penalty calculation mechanism will not be in place until this year—no doubt in a few weeks’ time.
Secondly, there is the absence of consultation. The Explanatory Memorandum cites the reason as the SI’s “technical nature”, but these penalties—I mentioned their size—could have major financial implications for providers. The telecoms industry has complex business structures and revenue streams. Technical expertise from the industry could have helped to ensure that these calculations are practical and comprehensive. The technical justification seems remarkably weak, given the impact these rules could have. For example, the current definition of “relevant business” for these calculations focuses on traditional network and service provision, but modern telecoms companies often have diverse revenue streams. There is no clear provision for new business models or technologies. How will we handle integrated service providers? What about international revenues? The treatment of associated services needs clarification.
Thirdly, the implementation sequence is an issue. We are being asked to approve penalty calculations before seeing the enforcement guidelines. There is no impact assessment, so we cannot evaluate potential consequences. I understand that the post-implementation review is not scheduled until 2026, and there is no clear mechanism for adjusting the framework if problems emerge. The interaction with the existing penalty regime needs clarification.
There are also technical concerns that need some attention. The switch from “notified provider” to “person” in the 2003 order, as a result of this SI, needs rather more explanation. The calculation method for continuing breaches is not fully detailed, there is no specific provision for group companies or complex corporate structures and the treatment of joint ventures and partnerships remains unclear.
Finally, I hope that, in broad terms, the Minister can give us an update on progress on the removal of equipment covered by the Telecommunications (Security) Act 2021. That was mandated by the Act; I know it is under way but it is not yet complete.
This is not merely about technical calculations but about creating an effective deterrent for the telecoms industry, while ensuring fair and practical enforcement of important security measures. Getting these rules right is essential for both national security and our telecoms sector. I look forward to the Minister’s response on these points.
My Lords, I thank the Minister for bringing this important SI forward today and for setting it out so clearly and briefly. I also thank the noble Lord, Lord Clement-Jones. He made a range of interesting points: in particular, the point on timing was well made, and I look forward to hearing the Minister’s answers on that. This instrument seeks to implement provisions relating to the enforcement of designated vendor directions—DVDs—which form part of the broader framework established under the Telecommunications (Security) Act 2021. That Act, introduced under the previous Government, was designed to strengthen the security and resilience of the UK’s telecommunications networks, particularly in response to emerging national security risks.
We all know only too well that one of the most prominent issues at the forefront of this framework has been the removal of high-risk vendors, such as Huawei, from UK telecommunications infrastructure. Huawei’s involvement in the UK’s 5G rollout has long been a point of debate, with growing concerns about national security risks tied to its equipment. This SI therefore provides a mechanism for enforcing the penalties that may be applied to public communications providers —PCPs—that fail to comply with the DVDs to ensure that the UK’s telecommunications infrastructure remains secure from undue foreign influence.
The primary change introduced by this SI is the formalisation of the penalties regime for public communications providers that fail to comply with the conditions outlined in DVDs. It establishes a framework for calculating and enforcing penalties that may be imposed by the Secretary of State. The Secretary of State retains discretion in imposing penalties, but they must be applied in a proportionate manner. In considering penalties, the severity of the breach, the culpability of the provider and the broader implications for the sector must all be taken into account. The aim is to ensure compliance with DVDs while protecting the integrity of the UK’s national infrastructure.
However, while the objectives of this instrument are understood, this debate offers a good opportunity to scrutinise some of the specifics a little, particularly with regard to the proportionality of penalties and the potential economic consequences for the sector. It is with that in mind that I shall raise questions in just three areas regarding the provisions set out in this instrument.
First, the SI grants the Secretary of State significant discretion in the imposition of penalties. Of course, we recognise the value of flexibility here, but there is legitimate concern that this discretion may result in inconsistent enforcement across different public communications providers. Can the Minister assure us that transparency and accountability will be maintained throughout this process? How will the Government ensure that the application of penalties is fair and consistent, particularly when considering the varying size and scope of telecoms providers?
Further to this, can the Minister clarify how the penalties will be calculated? I echo the questions asked by the noble Lord, Lord Clement-Jones, particularly in cases where a breach does not pose an immediate or severe national security threat. Do the Government anticipate that penalties will be tiered with lesser fines for breaches that do not substantially compromise national security? Can the Minister further explain how such decisions will be communicated to the public and to industry to ensure transparency?
Secondly, providers are required to remove Huawei equipment from the UK’s 5G networks by 2027. This is, of course, a significant and costly task for telecom providers. Given these financial challenges, will the penalties for non-compliance take into account the costs already incurred by providers in replacing Huawei’s technology? Will the penalties be adjusted to reflect the substantial financial burden that these providers are already facing in removing Huawei equipment from their networks? Thirdly, where PCPs have been issued with a DVD, this can be a long and demanding process. How are the Government going to keep track of progress? What progress reports can be shared with Parliament and the public?
Is the Minister confident that the 2027 deadline will be met; that no vendor, purchaser or telecoms company will be caught by the Act; that no fines will be levied; and that what we are talking about today is, therefore, entirely theoretical?
While the Minister is working on her answer, perhaps she could include in that something about how progress against the delivery of these objectives will be reported to Parliament and, indeed, to the public.
(3 months, 3 weeks ago)
Lords Chamber
The noble Earl is right, and we are trying to find a way to ensure that those rights are upheld. However, all these sectors need to grow in our economy. As I was just explaining, the creative sector uses AI, so it is not a simple “us and them” situation. AI is increasingly being used by all sectors across our economy. We need to find a way through this that rewards creators in the way that the noble Earl has outlined, which I think we all understand.
My Lords, I recognise of course that the task of analysing the results of the consultation still needs to go ahead. That said, does the Minister agree with us that digital watermarking is going to be a key component of the solution to the AI and copyright issue? If so, what does she make of the number of digital watermarking solutions that are now coming to market? In her view, is this to be welcomed or should we be pursuing a single standard for digital watermarks?
The noble Viscount has made an important point about watermarks, and that is certainly one solution that we are considering. The issue of transparency is crucial to the outcome of this issue, and watermarks would certainly help with that. I do not have a view as yet on whether we should have one or many, but I am hoping that the consultation will give us some guidance on that.
(6 months ago)
Grand Committee
My Lords, I very much support the thrust of these amendments and what the noble Lord, Lord Knight, said in support of and in addition to them. I declare an interest as a current user of the national pupil database.
The proper codification of safeguards would be a huge help. As the noble Baroness, Lady Kidron, said, it would give us a foundation on which to build. I hope that, if they are going to go in this direction, the Government will take an immediate opportunity to do so because what we have here, albeit much more disorganised, is a data resource equivalent to what we have for the National Health Service. If we used all the data on children that these systems generate, we would find it much easier to know what works and in what circumstances, as well as how to keep improving our education system.
The fact that this data is tucked away in little silos—it is not shared and is not something that can be used on a national basis—is a great pity. If we have a national code as to how this data is handled, we enable something like the use of educational data in the way that the NHS proposes to use health data. Safeguards are needed on that level but the Government have a huge opportunity; I very much hope that it is one they will take.
I start by thanking all noble Lords who spoke; I enjoyed the vivid examples that were shared by so many of them. I particularly enjoyed the comment from the noble Lord, Lord Russell, about the huge gulf between guidance, of which there is far too much, and a code that actually drives matters forward.
I will speak much more briefly because this ground has been well covered already. Both the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, seek to introduce codes of practice to protect the data of children in education services. Amendment 138 in the name of the noble Lord seeks to introduce a code on processing personal data in education. This includes consultation for the creation of such a code—a highly important element because the safety of this data, as well as its eventual usage, is of course paramount. Amendment 141 in the name of the noble Baroness, Lady Kidron, also seeks to set out a code of practice to provide heightened protections for children in education.
Those amendments are absolutely right to include consultation. It is a particularly important area of legislation. It is important that it does not restrict what schools can do with their data in order to improve the quality and productivity of their work. I was very appreciative of the words of the noble Lord, Lord Knight, when he sketched out some of the possibilities of what becomes educationally possible when these technologies are wisely and safely used. With individual schools often responsible for the selection of technologies and their procurement, the landscape is—at the risk of understatement—often more complex than we would wish.
Alongside that, the importance of the AI Safety Institute’s role in consultation cannot be overstated. The way in which tech and AI have developed in recent years means that its expertise on how safely to provide AI to this particularly vulnerable group is invaluable.
I very much welcome the emphasis that these amendments place on protecting children’s data, particularly in the realm of education services. Schools are a safe place. That safety being jeopardised by the rapid evolution of technology that the law cannot keep pace with would, I think we can all agree, be unthinkable. As such, I hope that the Government will give careful consideration to the points raised as we move on to Report.
My Lords, I rise to make a brief but emphatic comment from the health constituency. We in the NHS have been victims of appalling cyber-hacking. The pathology labs in south London were hacked and that cost many lives. It is an example of where the world is going in the future unless we act promptly. The emphatic call for quick action so that government keeps up with world changes is really well made. I ask the Minister to reflect on that.
My Lords, I, too, shall speak very briefly, which will save valuable minutes in which I can order my CyberUp Christmas mug.
Amendments 156A and 156B add to the definition of unauthorised access, so that it includes instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for it, and the person is not empowered to access the data by an enactment. Amendment 156B introduces defences to this new charge. Given the amount of valuable personal data held by controllers as our lives have moved increasingly online—as many speakers in this debate have vividly brought out—there is absolutely clear merit not just in this idea but in the pace implied, which many noble Lords have called for. There is a need for real urgency here, and I look forward to hearing more detail from the Minister.
My Lords, I turn to Amendments 156A and 156B, tabled by the noble Lord, Lord Holmes. I understand the strength of feeling and the need to provide legal protections for legitimate cybersecurity activities. I agree with the noble Lord that the UK should have the right legislative framework to allow us to tackle the harms posed by cybercriminals. We have heard examples of some of those threats this afternoon.
I reassure the noble Lord that this Government are committed to ensuring that the Computer Misuse Act remains up to date and effective in tackling criminality. We will continue to work with the cybersecurity industry, the National Cyber Security Centre and law enforcement agencies to consider whether there are workable proposals on this. The noble Lord will know that this is a complex and ongoing issue being considered as part of the review of the Computer Misuse Act being carried out by the Home Office. We are considering improved defences by engaging extensively with the cybersecurity industry, law enforcement agencies, prosecutors and system owners. However, engagement to date has not produced a consensus on the issue, even within the industry, and that is holding us back at this moment—but we are absolutely determined to move forward with this and to reach a consensus on the way forward.
I think the noble Lord, Lord Clement-Jones, said in the previous debate that the amendments were premature, and here that is certainly the case. The specific amendments that the noble Lord, Lord Holmes, has tabled are premature, because we need a stronger consensus on the way forward, notwithstanding all the good reasons that noble Lords have given for why it is important that we have updated legislation. With these concerns and reasons in mind, I hope that the noble Lord will feel able to withdraw his amendment.
My Lords, the trouble with this House is that some have long memories. The noble Earl, Lord Erroll, reminded us all to look back, with real regret, at the Digital Economy Act and the failure to implement Part 3. I think that that was a misstep by the previous Government.
Like all of us, I warmly welcome the inclusion of data access provisions for researchers studying online safety matters in Clause 123 of the Bill. As we heard from the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, this was very much unfinished business from the Online Safety Act. However, I believe that, in order for the Bill to be effective and have the desired effect, the Government need to accept the amendments in the names of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. In terms of timeframe, the width of research possible, enforceability, contractual elements and location, they cover the bases extremely effectively.
The point was made extremely well by the noble Lords, Lord Bethell and Lord Russell, that we should not have to rely on brave whistleblowers such as Frances Haugen. We should be able to benefit from quality researchers, whether from academia or elsewhere, in order to carry out this important work.
My Amendment 198B is intended as a probing amendment about the definition of researchers under Clause 123, which has to be carefully drawn to allow for legitimate non-governmental organisations, academics and so on, but not so widely that it can be exploited by bad actors. For example, we do not want those who seek to identify potential exploits in a platform to shield themselves simply by describing themselves as “independent researchers”. For instance, could Tommy Robinson seek to protect himself from liabilities in this way? After all, he called himself an “independent journalist” in another context when he clearly was not. I hope that when the Government come to draw up the regulations they will be mindful of the need to be very clear about what constitutes an independent or accredited researcher, or whatever phrase will be used in the context.
My Lords, although I have no amendments in this group, I will comment on some of them. I might jump around the order, so please forgive me for that.
Amendment 197 would change Clause 123 so that the Secretary of State must, as soon as reasonably practicable and no later than 12 months after the Act is passed, make regulations requiring regulated services to provide information for the purposes of research into online safety. This is clearly sensible. It would ensure that valuable research into online safety may commence as soon as possible, which would benefit us all, as speakers have made abundantly clear. To that end, Amendment 198D, which would ensure that researcher access is enforceable in the same way as other requirements under the Online Safety Act, would ensure that researchers can access valuable information and carry out their beneficial research.
I am still left with some curiosity about some of these amendments, so I will indicate where I have specific questions to those who have tabled them and hope they will forgive me if I ask to have a word with them between now and Report, which would be very helpful. In that spirit, I turn to Amendment 198B, which would allow the Secretary of State to define the term “independent researcher”. I ask the noble Lord, Lord Clement-Jones, who tabled the amendment, whether he envisages the Secretary of State taking advice before making such regulations and, if so, from whom and through what mechanism. I recognise that it is a probing amendment, but I would be keen to understand more.
I am also keen to understand further from my noble friend Lord Bethell and the noble Baroness, Lady Kidron, why, under Amendment 198A, the Secretary of State would not be able to make regulations providing for independent research into the “enforcement of requirements” under these regulations. Again, I look forward to discussing that with them.
I have some concerns about Amendment 198, which would require service providers to give information pertaining to age, stage of development, gender, race, ethnicity, disability and sexuality to researchers. I understand the importance of this but my concern is that it would require the disclosure of special category data to those researchers. I express reservations, especially if the data pertains to children. Do we have the right safeguards in place to address the obviously heightened risks here?
Additionally, I have some concerns about the provisions suggested in Amendment 198E. Should we allow researchers from outside the United Kingdom to require access to information from regulated service providers? Could this result in data being transferred into jurisdictions where there are less stringent data protection laws?
My Lords, I thank noble Lords who have welcomed the provisions in the Bill. I very much appreciate that we have taken on board the concerns that were raised in the debates on the previous legislation. I thank the noble Baroness, Lady Kidron, and the noble Lords, Lord Bethell and Lord Clement-Jones, for their amendments.
I will speak first to Amendment 197, tabled by the noble Baroness, Lady Kidron, which would compel the Secretary of State to create a framework and to do so within 12 months of passage. I understand and share her desire to ensure that a framework allowing researcher access is put in place promptly. This is precisely why we brought forward this provision. I reassure her that the department will consult on the framework as soon as possible after the publication of Ofcom’s report.
Turning to Amendments 198 and 198B, tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, respectively, Clause 123 provides the Secretary of State with the power to make regulations relating to researchers’ access to data. I can reassure noble Lords that it does not limit the regulations to the non-exhaustive list of examples provided. I agree that fair and proportionate criteria for who is considered a researcher are critical to the success of the future framework. I reassure noble Lords that in the provision as currently written the Secretary of State can include in the design of the framework the specific requirements that a person must meet to be considered a researcher.
Turning to Amendments 198A and 198D, tabled by the noble Lord, Lord Bethell, while I am sympathetic to his desire to provide a future framework with the robust enforcement powers of the OSA, I assure him that as the provision is written, the Secretary of State can already use the existing enforcement powers of the OSA to support a future framework. Furthermore, should the evidence suggest that additional or different measures would be more effective and appropriate, this provision allows the Secretary of State the flexibility to introduce them.
Turning next to Amendments 198C and 198E, tabled by the noble Lord, Lord Bethell, I understand the spirit of these amendments and note the importance of this issue, given the global nature of the online world. It is entirely reasonable to allow researchers who are not based in the UK to utilise our researcher access framework, as long as the subject of their research is the experience of UK users online. I reassure him that the provisions as drafted already allow the Secretary of State to make regulations permitting non-UK-based researchers to use the framework where appropriate. We plan to use the evidence gathered through our own means and through Ofcom’s report to set out who will be eligible to use the framework in the secondary legislation.
Finally, turning to Amendment 198F, I am aware of the concern that researchers have encountered blockages to conducting research and I am sympathetic to the intentions behind the amendment. We must ensure that researchers can use the future framework without fear of legal action or other consequences. I am conscious that the noble Baroness, Lady Kidron, asked me a specific question about legal exemptions and I will write to her to make that answer much clearer. I reassure noble Lords that the Government are considering the specific issues that the noble Lord raises. For these reasons, I ask that the amendments not be pressed while the Government consider these issues further and I am of course happy to engage with noble Lords in the meantime.
My Lords, the UK is a world leader in genomics research. This research will no doubt result in many benefits, particularly in the healthcare space. However, genomics data can be, and increasingly is, exploited for deeply concerning purposes, including geostrategic ones.
Western intelligence agencies are reportedly becoming increasingly concerned about China using genomic data and biotechnology for military purposes. The Chinese Government have made it clear that genomics plays a key part in their military-civil fusion doctrine. The 13th five-year plan for military-civil fusion calls for the cross-pollination of military and civilian technology such as biotechnology. This statement, taken in conjunction with reports that the Beijing Genomics Institute—the BGI—in collaboration with the People’s Liberation Army, is looking to make ethnically Han Chinese soldiers less susceptible to altitude sickness, makes for worrying reading. Genetically engineered soldiers appear to be moving out of fiction and towards reality.
The global genomics industry has grown substantially as a result of the Covid-19 pandemic, and gene giant BGI Group and its affiliate MGI Tech have acquired large databases of DNA. Further, I note that BGI has widespread links to the Chinese state. It operates the Chinese Government’s key laboratories and national gene bank, itself a vast repository of DNA data drawn from all over the world. A Reuters investigation found that a prenatal test, NIFTY, sold by BGI to expectant mothers, gathered millions of women’s DNA data. This prenatal test was developed in collaboration with the Chinese military.
For these reasons, I think we must become far more protective of genomic data gathered from our population. While many researchers use genomic data to find cures for terrible diseases, many others, I am afraid, would use it to do us harm. To this end, I have tabled Amendment 199 to require the Secretary of State and the Information Commissioner to conduct frequent risk assessments on data privacy associated with genomics and DNA companies headquartered in countries that are systemic competitors or hostile actors. I believe this will go some way to preventing genomic data transfer out of the UK and to countries such as China that may use it for military purposes. I beg to move.
My Lords, I strongly support this amendment. As a former Minister, I was at the front line of genomic data and know how powerful it currently is and can be in the future. Having discussed this with the UK Biobank, I know that the issue of who stores and processes genomic data in the UK is a subject of huge and grave concern. I emphasise that the American Government have moved on this issue already and emphatically. There is the possibility that we will be left behind in global standards and will one day be an outlier if we do not close this important and strategically delicate loophole. For that reason, I strongly support this amendment.
My Lords, I thank the noble Viscount, Lord Camrose, for moving this amendment, which raises this important question about our genomics databases, and for the disturbing examples that he has drawn to our attention. He is right that the opportunities from harnessing genomic data come with very real risks. This is why the Government have continued the important work of the UK Biological Security Strategy of 2023, including by conducting a full risk assessment and providing updated guidance to reduce the risks from the misuse of sensitive data. We plan to brief the Joint Committee on the National Security Strategy on the findings of the risk assessment in the new year. Following that, I look forward to engaging with the noble Viscount on its outcome and on how we intend to take these issues forward. As he says, this is a vital issue, but in the meantime I hope he is prepared to withdraw his amendment.
I thank the Minister for her answer, and I very much accept her offer of engagement. I will make a few further brief comments about the importance of this amendment, as we go forward. I hope that other noble Lords will consider it carefully before Report.
I will set out a few reasons why I believe this amendment can benefit both the Bill and this country. The first is its scope. The amendment will allow the Secretary of State and the Information Commissioner to assess data security risks across the entirety of the genomic sector, covering consumers, businesses, citizens and researchers who may be partnering with state-linked genomics companies.
The second reason is urgency. DNA is regularly described as the “new gold” and it represents our most permanent identifier, revealing physical and mental characteristics, family medical history and susceptibility to diseases. Once it has been accessed, the damage from potential misuse cannot be reversed, and this places a premium on proactively scrutinising the potential risks to this data.
Thirdly, there are opportunities for global leadership. This amendment offers the UK an opportunity to take a world-leading role and become the first European country to take authoritative action to scrutinise data vulnerabilities in this area of critical technology. Scrutinising risks to UK genomic data security also provides a foundation to foster domestic genomics companies and solutions.
Fourthly, this amendment would align the UK with key security partners, particularly, as my noble friend Lord Bethell mentioned, the United States, which has already blacklisted certain genomics companies linked to China and taken steps to protect American citizens’ DNA from potential misuse.
The fifth and final reason is protection of citizens and consumers. This amendment would provide greater guidance and transparency to citizens and consumers whose DNA data is exposed to entities linked to systemic competitors. With all of that said, I thank noble Lords for their consideration and beg leave to withdraw my amendment.
My Lords, we have had some powerful speeches in this group, not least from the noble Baronesses, Lady Kidron and Lady Owen, who drafted important amendments that respond to the escalating harms caused by AI-generated sexual abuse material relating to children and adults. The amendment from the noble Baroness, Lady Kidron, would make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material. As she outlined and the noble Lord, Lord Bethell, confirmed, it specifically would become an offence to create, train or distribute generative AI models that enable the creation of computer-generated CSAM or priority illegal content; to train AI models on CSAM or priority illegal content; or to possess AI models that produce CSAM or priority illegal content.
This amendment responds to a growing problem, as we have heard, around computer-generated sexual abuse material and a gap in the law. There is a total lack of safeguards preventing bad actors from creating sexual abuse imagery, and it is causing real harm. Sites enabling this abuse are offering tools to harm, humiliate, harass, coerce and cause reputational damage. Without robust legal frameworks, victims are left vulnerable while perpetrators operate with impunity.
The noble Lord, Lord Bethell, mentioned the Internet Watch Foundation. In its July report, One Step Ahead, it reported on the alarming rise of AI-generated CSAM. In October 2023, in How AI is Being Abused to Create Child Sexual Abuse Imagery, it made recommendations to the Government regarding legislation to strengthen legal frameworks to better address the evolving landscape of AI-generated CSAM and enhance preventive measures against its creation and distribution. It specifically recommended:
“That the Government legislates to make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material”.
The noble Baroness, Lady Kidron, tabled such an amendment to the previous Bill. As she said, she was successful in persuading the then Government to accept it; I very much hope that she will be as successful in persuading this Government to accept her amendment.
Amendments 211G and 211H in the name of the noble Baroness, Lady Owen, are a response to the extraordinary fact that one in 14 adults has experienced threats to share intimate images in England and Wales; that rises to one in seven among young women. Research from Internet Matters shows that 49% of young teenagers in the UK aged between 13 and 16—around 750,000 children—said that they were aware of a form of image-based abuse being perpetrated against another young person known to them.
We debated the first of the noble Baroness’s amendments, which is incorporated in her Bill, last Friday. I entirely agree with the noble Lord, Lord Knight; I did not find the Government’s response at all satisfactory. I hope that, in the short passage of time between then and now, they have had time to be at least a little agile, as he requested. UK law clearly does not effectively address non-consensual intimate images. It is currently illegal to share or threaten to share non-consensual intimate images, including deepfakes, but creating them is not yet illegal; this means that someone could create a deepfake image of another person without their consent and not face legal consequences as long as they do not share, or threaten to share, it.
This amendment is extremely welcome. It addresses the gap in the law by criminalising the creation of non-consensual intimate images, including deepfakes. It rightly targets deepfakes due to their rising prevalence and potential for harm, particularly towards women. Research shows that 98% of deepfake videos online are pornographic, with 99% featuring women and girls. This makes it an inherently sexist problem that is a new frontier of violence against women—words that I know the noble Baroness has used.
I also very much welcome the new amendment not contained in her Bill, responding to what the noble Baroness, Lady Gohir, said at its Second Reading last Friday about including audio deepfakes. The words “shut down every avenue”, which I think were used by the noble Baroness, Lady Gohir, are entirely apposite in these circumstances. Despite what the noble Lord, Lord Ponsonby, said on Friday, I hope that the Government will accept both these amendments and redeem their manifesto pledge to ban the creation of sexually explicit deepfakes, whether audio or video.
My Lords, the current law does not sufficiently protect children from AI-driven CSAM because it is simply such a fast-moving issue. It is a sobering thought that, of all the many wonderful developments of AI that many of us have been predicting and speculating on for so long, CSAM is really driving the technology forward. What a depressing reflection that is.
Overall, AI is developing at an extraordinarily rapid pace and has come with a number of concerning consequences that are not all yet fully understood. However, it is understood that child sexual abuse is completely unacceptable in any and all contexts, and it is right that our law should be updated to reflect the dangers that have increased alongside AI development.
Amendment 203 seeks to create a specific offence for using personal data or digital information to create or facilitate the creation of computer-generated child sexual abuse material. Although legislation is in place to address possessing or distributing such horrendous material, we must prioritise the safety of children in this country and take the law a step further to prevent its creation. Our children must be kept safe and, subject to one reservation, which I will come to in a second, I support the amendment from the noble Baroness, Lady Kidron, to further protect them.
That reservation comes in proposed new subsection 1(c), which includes in the offence the act of collating files that, when combined, enable the creation of sexual abuse material. This is too broad. A great deal of the collation of such material can be conducted by innocent people using innocent materials that are then corrupted or given more poisonous aspects by further training, fine-tuning or combination with other materials by more malign actors. I hope there is a way we can refine this proposed new paragraph on that basis.
Unfortunately, adults can also be the targets of individuals who use AI to digitally generate non-consensual explicit images or audio files of an individual, using their likeness and personal data. I am really pleased that my noble friend Lady Owen tabled Amendments 211G and 211H to create offences for these unacceptable, cruel acts. I support these amendments unambiguously.
My Lords, I thank the noble Baroness, Lady Kidron, for her Amendment 203. It goes without saying that the Government treat all child sexual abuse material with the utmost seriousness. I can therefore confirm to her and the Committee that the Government will bring forward legislative measures to address the issue in this Session and that the Home Office will make an announcement on this early in the new year.
On Amendments 211G and 211H, tabled by the noble Baroness, Lady Owen, the Government share concerns that more needs to be done to protect women from deepfake image abuse. This is why the Government committed in their manifesto to criminalise the creation of sexually explicit deepfake images of adults. I reassure the noble Baroness and the whole Committee that we will deliver on our manifesto commitment in this Session. The Government are fully committed to protecting the victims of tech-enabled sexual abuse. Tackling intimate audio would be a new area of law, but we continue to keep that legislation under review.
I also say to the noble Baroness that there is already a process under Section 153 of the Sentencing Act 2020 for the court to deprive a convicted offender of property, including images that have been used for the purpose of committing or facilitating any criminal offence. As well as images, that includes computers and mobile phones that the offender either used to commit intimate image offences or intended to use for that purpose in future. For those reasons and the reassurances I have given today, I hope that noble Lords will feel able to withdraw or not press their amendments.
(6 months ago)
Grand Committee
My Lords, the debate on this group emphasises how far behind the curve we are, whether it is by including new provisions in this Bill or by bringing forward an AI Bill—which, after all, was promised in the Government’s manifesto. It emphasises that we are not moving nearly fast enough in thinking about the implications of AI. While we are doing so, I need to declare an interest as co-chair of the All-Party Parliamentary Group on AI and a consultant to DLA Piper on AI policy and regulation.
I have followed the progress of AI since 2016 in my capacity as co-chair of the all-party group and chair of the AI Select Committee. We need to move much faster on a whole range of different issues. I very much hope that the noble Lord, Lord Vallance, will be here on Wednesday, when we discuss our crawler amendments, because although the noble Lord, Lord Holmes, has tabled Amendment 211A, which deals with personality rights, there is also extreme concern about the whole area of copyright. I was tipped off by the noble Lord, Lord Stevenson, so I was slightly surprised that he did not draw our attention to it: we are clearly due the consultation at any moment on intellectual property, but there seems to be some proposal within it for personality rights themselves. Whether that is a quid pro quo for a much-weakened situation on text and data mining, I do not know, but something appears to be moving out there which may become clear later this week. It seems a strange time to issue a consultation, but I recognise that it has been somewhat delayed.
In the meantime, we are forced to put forward amendments to this Bill trying to anticipate some of the issues that artificial intelligence is increasingly giving rise to. I strongly support Amendments 92, 93, 101 and 105 put forward by the noble Viscount, Lord Colville, to prevent misuse of Clause 77 by generative AI developers; I very much support the noble Lord, Lord Holmes, in wanting to see protection for image, likeness and personality; and I very much hope that we will get a positive response from the Minister in that respect.
We have heard from the noble Baronesses, Lady Kidron and Lady Harding, and the noble Lords, Lord Russell and Lord Stevenson, all of whom have made powerful speeches on previous Bills—the then Online Safety Bill and the Data Protection and Digital Information Bill—to say that children should have special protection in data protection law. As the noble Baroness, Lady Kidron, says, we need to move on from the AADC. That was a triumph she gained during the passage of the Data Protection Act 2018, but six years later the world looks very different and young people need protection from AI models of the kind she has set out in Amendment 137. I agree with the noble Lord, Lord Stevenson, that we need to talk these things through. If it produces an amendment to this Bill that is agreed, all well and good, but it could mean an amendment or part of a new AI Bill when that comes forward. Either way, we need to think constructively in this area because protection of children in the face of generative AI models, in particular, is extremely important.
This group, which looks ahead to further harms that AI could cause and how we can mitigate them in a number of different ways, is extremely important, despite the fact that these amendments appear to deal with quite a disparate group of issues.
My Lords, I too thank all noble Lords for their insightful contributions to this important group of amendments, even if some of them bemoaned the fact that they have had to repeat themselves over the course of several Bills. I am also very heartened to see how many people have joined us for Committee today. I have been involved in only two of these sittings, but this is certainly a record, and on present trends it is going to be standing room only, which is all to the good.
I have two observations before I start. First, we have to acknowledge that perhaps this area is among the most important we are going to discuss. The rights and protections of data subjects, particularly children, are in many ways the crux of all this and we have to get it right. Secondly, I absolutely take on board that there is a real appetite to get ahead on AI legislation. I have an amendment I am very excited about later, when we come particularly to ADM, and there will be others as well, but I absolutely take on board that we need to get going on that.
Amendment 92 in the names of the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, seeks to reduce the likelihood of the misuse of Clause 77 by AI model developers who may seek to claim that they do not need to notify data subjects of reuse for scientific purposes under that clause. This relates to the way that personal data is typically collected and processed for AI development. Amendment 93 similarly seeks to reduce the possibility of misuse of Clause 77 by model developers who could claim they do not need to notify data subjects of reuse for scientific purposes. Amendments 101 and 105 also seek to address the potential misuse of Clause 77 by developers. I strongly support the intent of the amendments from the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, in seeking to maintain and make provisions for the rights and protections of data subjects, and look forward very much to hearing the views of the Minister.
I turn to Amendment 137 in the names of the noble Lords, Lord Russell and Lord Stevenson, and the noble Baronesses, Lady Kidron and Lady Harding. This amendment would require the commissioner to prepare and produce a code of practice which ensures that data processors prioritise the interests, rights and freedoms of children. It goes without saying that the rights and protection of children are of utmost importance. Certainly, this amendment looks to me not only practical but proportionate, and I support it.
Finally, Amendment 211A in the name of my noble friend Lord Holmes ensures the prohibition of
“the development, deployment, marketing and sale of data related to an individual’s image, likeness or personality for AI training”
without that person’s consent. Like the other amendments in this group, this makes provision to strengthen the rights and protections of data subjects against the potential misuse or sale of data and seems entirely sensible. I am sure the Minister has listened carefully to all the concerns powerfully raised from all sides of the Committee today. It is so important that we do not lose sight of the importance of the rights and protection of data subjects.
My Lords, I welcome the amendments spoken to so well by the noble Baroness, Lady Harding, regarding the open electoral register. They are intended to provide legal certainty around the use of the register, without compromising on any aspect of the data privacy of UK citizens or risking data adequacy. The amendments specify that companies are exempt from the requirement to provide individuals with information in cases where their personal data has not been obtained directly from them if that data was obtained from the open electoral register. They also provide further clarification on what constitutes “disproportionate effort” under new paragraph 5(e) of Article 14 of GDPR.
The noble Baroness covered the ground so effectively that all I need to add is that the precedent established by the current interpretation by the tribunal will affect not only the open electoral register but other public sources of data, including the register of companies, the Registry of Judgments, Orders and Fines, the Land Registry and the Food Standards Agency register. Importantly, it may even prevent the important work being done to create a national data library from achieving its objectives of public sector data sharing. It will have far-reaching implications if we do not change the Bill in the way that the noble Baroness has put forward.
I thank the noble Lord, Lord Lucas, for his support for Amendment 160. I reciprocate in supporting—or, at least, hoping that we get clarification as a result of—his Amendments 158 and 161.
Amendment 159B seeks to ban what are colloquially known as cookie paywalls. As can be seen, it is the diametric opposite to Amendment 159A, tabled by the noble Viscount, Lord Camrose. For some unaccountable reason, cookie paywalls require a person who accesses a website or app to pay a fee to refuse consent to cookies being accessed from or stored on their device. Some of these sums can be extortionate and exorbitant, so I was rather surprised by the noble Viscount’s counter amendment.
Earlier this year, the Information Commissioner launched a call for views which looked to obtain a range of views on its regulatory approach to consent or pay models under data protection law. The call for views highlighted that organisations that are looking to adopt, or have already adopted, a consent-or-pay model must consider the data protection implications.
Cookie paywalls are a scam and reduce people’s power to control their data. I wonder why someone must pay if they do not consent to cookies being stored or accessed. The PEC regulations do not currently prohibit cookie paywalls. The relevant regulation is Regulation 6, which is due to be substituted by Clause 111, and is supplemented by new Schedule A1 to the PEC regulations, as inserted by Schedule 12 to the Bill. The regulation, as substituted by Clause 111 and Schedule 12, does not prohibit cookie paywalls. This comes down to the detail of the regulations, both as they currently are and as they will be if the Bill remains as drafted. It is drafted in terms that do not prevent a person signifying lack of consent to cookies, and a provider may add or set controls—namely, by imposing requirements—for how a person may signify that lack of consent. Cookie paywalls would therefore be completely legal, and they certainly have proliferated online.
This amendment makes it crystal clear that a provider must not require a person to pay a fee to signify lack of consent to their data being stored or accessed. This would mean that, in effect, cookie paywalls would be banned.
Amendment 160 is sought by the Advertising Association. It seeks to ensure that the technical storage of or access to information is considered necessary under paragraph 5 of the new Schedule A1 to the PEC regulations inserted by Schedule 12 if it would support measurement or verification of the performance of advertising services to allow website owners to charge for their advertising services more accurately. The Bill provides practical amendments to the PEC regulations through listing the types of cookies that no longer require consent.
This is important, as not all cookies should be treated the same and not all carry the same level of risk to personal privacy. Some are integral to the service and the website itself and are extremely important for subscription-free content offered by publishers, which is principally funded by advertising. Introducing specific and targeted cookie exemptions has the benefit of, first, simplifying the cookie consent banner and, secondly, increasing legal and economic certainty for online publishers. As I said when we debated the DPDI Bill, audience measurement is an important function for media owners to determine the consumption of content, to be able to price advertising space for advertisers. Such metrics are crucial to assess the effectiveness of a media channel. For sites that carry advertising, cookies are used to verify the delivery and performance of a digital advertisement—that is, confirmation that an ad has been served or presented to a user and whether it has been clicked on. This is essential information to invoice an advertiser accurately for the number of ad impressions in a digital ad campaign.
However, my reading of the Bill suggests that audience measurement cookies would be covered by the list of exemptions from consent under Schedule 12. Can the Government confirm this? Is it the Government’s intention to use secondary legislation in future to exempt ad performance cookies?
Coming to Amendment 162 relating to the soft opt-in, I am grateful to the noble Lord, Lord Black of Brentwood, and the noble Baroness, Lady Harding of Winscombe, for their support. This amendment would enable charities to communicate to donors in the same way that businesses have been able to communicate to customers since 2003. The clause will help to facilitate greater fundraising and support the important work that charities do for society. I can do no better than quote from the letter that was sent to Secretary of State Peter Kyle on 25 November, which was co-ordinated by the DMA and involved nearly 20 major charities, seeking support for reinstating the original Clause 115 of the DPDI Bill into this Bill:
“Clause 115 of the previous DPDI Bill extended the ‘soft opt-in’ for email marketing for charities and non-commercial organisations. The DMA estimates that extending the soft opt-in to charities would increase annual donations in the UK by £290 million”,
based on analysis of 13.1 million donors by the Salocin Group. The letter continues:
“At present, the DUA Bill proposals remove this. The omission of the soft opt-in will prevent charities from being able to communicate to donors in the same way as businesses can. As representatives of both corporate entities and charitable organisations, it is unclear to the DMA why charities should be at a disadvantage in this regard”.
I hope that the Government will listen to the DMA and the charities involved.
I thank noble Lords for their comments and contributions. I shall jump to Amendments 159A and 159B, one of which is in my name and both of which are concerned with cookie paywalls. I am not sure I can have properly understood the objection to cookie paywalls. Do they not simply offer users three choices: pay money and stay private; share personal data and read for free; or walk away? So many times, we have all complained about the fact that these websites harvest our data and now, for the first time, this approach sets a clear cash value on the data that they are harvesting and offers us the choice. The other day somebody sent me a link from the Sun. I had those choices. I did not want to pay the money or share my data, so I did not read the article. I feel this is a personal decision, supported by clear data, which it is up to the individual to take, not the Government. I do not think we should take away this choice.
Let me turn to some of the other amendments in this group. Amendment 161 in the name of my noble friend Lord Lucas is, if I may say so, a thoughtful amendment. It would allow pension providers to communicate information on their product. This may mean that the person who will benefit from that pension does not miss out on useful information that would benefit their saving for retirement. Given that pension providers already hold the saver’s personal data, it seems to be merely a question of whether this information is wanted; of course, if it is not, the saver can simply opt out.
Amendment 162 makes an important point: many charities rely on donations from the public. Perhaps we should consider bringing down the barriers to contacting people regarding fundraising activities. At the very least, I am personally not convinced that members of the public have different expectations around what kinds of organisation can and cannot contact them and in what circumstances, so I support any step that simplifies the—to my mind—rather arbitrary differences in the treatment of business and charity communications.
Amendment 104 certainly seems a reasonable addition to the list of what might constitute “disproportionate effort” if the information is already public. However, I have some concerns about Amendments 98 and 100 to 103. For Amendment 98, who would judge the impact on the individual? I suspect that the individual and the data controllers may have different opinions on this. In Amendment 100, the effort and cost of compliance are thorny issues that would surely be dictated by the nature of the data itself and the reason for providing it to data subjects. In short, I am concerned that the controllers’ view may be more subjective than we would want.
On Amendment 102, again, when it comes to providing information to them,
“the damage and distress to the data subjects”
is a phrase on which the subject and the controller will almost inevitably have differing opinions. How will these be balanced? Additionally, one might presume that information that is either damaging or distressing to the data subjects should not necessarily be withheld from them as it is likely to be extremely important.
My Lords, we have covered a range of issues in our debate on this grouping; nevertheless, I will try to address each of them in turn. I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Harding, for their Amendments 95, 96, 98, 100, 102 to 104 and 106 regarding notification requirements.
First, with regard to the amendments in the name of the noble Baroness, Lady Harding, I say that although the Government support the use of public data sources, transparency is a key data protection principle. We do not agree that such use of personal data should remove or undermine the transparency requirements. The ICO considers that the use and sale of open electoral register data alone is likely not to require notification. However, when the data is combined with data from other sources, in order to build an extensive profile to be sold on for direct marketing, notification may be proportionate since the processing may go beyond the individual’s reasonable expectations. When individuals are not notified about processing, it makes it harder for them to exercise their data subject rights, such as the right to object.
Adding other factors to the list of what constitutes a “disproportionate effort” for notification is unnecessary given that the list is already non-exhaustive. The “disproportionate effort” exemption must be applied according to the safeguards of the wider data protection framework. According to the fairness principle, controllers should already account for whether the processing meets the reasonable expectations of a data subject. The data minimisation and purpose limitation principles also act as an important consideration for data controllers. Controllers should continue to assess on a case-by-case basis whether they meet the threshold for the existing exemptions to notify; if not, they should notify. I hope that this helps clarify our position on that.
My Lords, I rise briefly to support my friend, the noble Lord, Lord Clement-Jones, and his string of amendments. He made the case clearly: it is simply about access, the right to redress and a clear pathway to that redress, a more efficient process and clarity and consistency across this part of our data landscape. There is precious little point in having obscure remedies or rights—or even, in some cases, as we have discussed in our debates on previous groups, no right or obvious pathways to redress. I believe that this suite of amendments addresses that issue. Again, I full-throatedly support them.
My Lords, I address the amendments tabled by the noble Lord, Lord Clement-Jones. These proposals aim to transfer jurisdiction from courts to tribunals; to establish a new right of appeal against decisions made by the Information Commissioner; and to grant the Lord Chancellor authority to implement tribunal procedure rules. I understand and recognise the noble Lord’s intent here, of course, but I have reservations about these amendments and urge caution in accepting them.
The suggestion to transfer jurisdiction from courts to tribunals raises substantial concerns. Courts have a long-standing authority and expertise in adjudicating complex legal matters, including data protection cases. By removing these disputes from the purview of the courts, the risk is that we undermine the depth and breadth of legal oversight required in such critical areas. Tribunals, while valuable for specialised and expedited decisions, may not provide the same level of rigorous legal analysis.
Cases such as those cited by the noble Lord, Lord Clement-Jones—Killock and another v the Information Commissioner and Delo v the Information Commissioner—demonstrate to me the intricate interplay between data protection, administrative discretion and broader legal principles. It is questionable whether tribunals, operating under less formal procedures, can consistently handle such complexities without diminishing the quality of justice. Further, I am not sure that the claim that this transfer will streamline the system and reduce burdens on the courts is fully persuasive. Shifting cases to tribunals does not eliminate complexity; it merely reallocates it, potentially at the expense of the detailed scrutiny that these cases demand.
I turn to the right of appeal against the commissioner’s decisions. Although the introduction of a right of appeal against these decisions may seem like a safeguard, it risks creating unnecessary layers of litigation. The ICO already operates within a robust framework of accountability, including judicial review for cases of legal error or improper exercise of discretion. Adding a formal right of appeal risks encouraging vexatious challenges, overwhelming the tribunal system and diverting resources from addressing genuine grievances.
I think we in my party understand the importance of regulatory accountability. However, creating additional mechanisms should not come at the expense of efficiency and proportionality. The existing legal remedies are designed to strike an appropriate balance, and further appeals risk creating a chilling effect on the ICO’s ability to act decisively in protecting data rights.
On tribunal procedure rules and centralised authority, the proposed amendment granting the Lord Chancellor authority to set tribunal procedure rules bypasses the Tribunal Procedure Committee, an independent body designed to ensure that procedural changes are developed with judicial oversight. This move raises concerns about the concentration of power and the erosion of established checks and balances. I am concerned that this is a case of expediency overriding the principles of good governance. While I acknowledge that consultation with the judiciary is included in the amendment, it is not a sufficient substitute for the independent deliberative processes currently in place. The amendment risks undermining the independence of our legal institutions and therefore I have concerns about it.
These amendments overall, while presented as technical fixes (and I certainly recognise the problem and the intent), would have far-reaching consequences for our data protection framework. The vision of my party for governance is one that prioritises stability, legal certainty and the preservation of integrity. We must avoid reforms that, whatever their intent, introduce confusion or inefficiency or undermine public trust in our system. Data protection is, needless to say, a cornerstone of our modern economy and individual rights. As such, any changes to its governance must be approached with the utmost care.
I thank the noble Lord, Lord Clement-Jones, for his Amendments 108, 146 to 153 and 157, and I am grateful for the comments by the noble Lord, Lord Holmes, and the noble Viscount, Lord Camrose.
The effect of this group of amendments would be to make the First-tier Tribunal and the Upper Tribunal responsible for all data protection cases. They would transfer ongoing as well as future cases out of the court system to the relevant tribunals and, as has been alluded to, may cause more confusion in doing so.
As the noble Lord is aware, there is currently a blend of jurisdiction under the data protection legislation for both tribunals and courts according to the nature of the proceedings in question. This is because certain types of cases are appropriate to fall under tribunal jurisdiction while others are more appropriate for court settings. For example, claims by individuals against organisations for breaches of legal requirements can result in awards of compensation for the individuals and financial and reputational damage for the organisations. It is appropriate that such cases are handled by the courts, in conformity with their strict procedural and evidential rules. Indeed, in the Killock and Delo cases it was noted that there could be additional confusion about the ability to move between those two routes if such matters went solely to one of the tribunals.
On the transfer of responsibility for making tribunal procedural rules from the Tribunal Procedure Committee to the Lord Chancellor, we think that would be inappropriate. The committee comprises legal experts appointed or nominated by senior members of the judiciary or the Lord Chancellor. This committee is best placed to make rules to ensure that tribunals are accessible and fair and that cases are dealt with quickly and efficiently. It keeps the rules under constant review to ensure that they are fit for purpose in line with new appeal rights and the most recent legislative changes.
Amendment 151 would also introduce a statutory appeals procedure for tribunals to determine the merits of decisions made by the Information Commissioner. Data subjects and controllers alike can already challenge the merits of the Information Commissioner’s decisions by way of judicial review in a way that would preserve the discretion and independence of the Information Commissioner’s decision-making, so no statutory procedure is needed. The Government therefore believe that the current jurisdictional framework is well-balanced and equitable, and that it provides effective and practical routes of redress for data subjects and controllers as well as appropriate safeguards to ensure compliance by organisations. For these reasons, I hope the noble Lord will not press his amendments.
My Lords, in speaking to this group of amendments I must apologise to the Committee that, when I spoke last week, I forgot to mention my interests in the register, specifically as an unpaid adviser to the Startup Coalition. For Committee, noble Lords will realise that I have confined myself to amendments that may be relevant to healthcare and to improving it.
I will speak to Amendments 111 and 116 in the names of my noble friends Lord Camrose and Lord Markham, and Amendment 115 from my noble friend Lord Lucas and the noble Lords, Lord Clement-Jones and Lord Knight of Weymouth, as well as other amendments, including from my noble friend Lord Holmes—I will probably touch on most amendments in this group. To illustrate my concerns, I return to two personal experiences that I shared during debate on the Data Protection and Digital Information Bill. I apologise to noble Lords who have heard these examples previously, but they illustrate the points being made in discussing this group of amendments.
A few years ago, when I was supposed to be travelling to Strasbourg, my train to the airport got delayed. My staff picked me up, booked me a new flight and drove me to the airport. I got to the airport with my new boarding pass and scanned it to get into the gate area, but as I was about to get on the flight, I scanned my pass again and was not allowed on the flight. No one there could explain why, having been allowed through security, I was not allowed on the flight. To cut a long story short, after two hours of being gaslighted by four or five staff, who would not even say that they could not explain things to me, I eventually had to return to the check-in desk—this was supposed to be avoided by all the automation—to ask what had happened. The airline claimed that it had sent me an email that day. The next day, it admitted that it had not sent me an email. It then explained what had happened by saying that a flag had gone off in its system. That was the only explanation offered.
This illustrates the point about human intervention, but it is also about telling customers and others what happens when something goes wrong. The company clearly had not trained its staff in how to speak to customers or in transparency. Companies such as that airline get away with this sort of disgraceful behaviour all the time, but imagine if such technology were being used in the NHS. Imagine the same scenario: you turn up for an operation, and you scan your barcode to enter the hospital—possibly even the operating theatre—but you are denied access. There must be accountability, transparency and human intervention, and, in these instances, there has to be human intervention immediately. These things are critical.
I know that this Bill makes some sort of differentiation between more critical and less critical ADM, but let me illustrate my point with another example. A few years ago, I paid for an account with one of those whizzy fintech banks. Its slogan was: “We are here to make money work for everyone”. I downloaded the app and filled out the fields, then a message popped up telling me, “We will get back to you within 48 hours”. Two weeks later, I got a message on the app saying that I had been rejected and that, by law, the bank did not have to explain why. Once again, I ask noble Lords to imagine. Imagine Monzo’s technology being used on the NHS app, which many people currently use for repeat prescriptions or booking appointments. What would happen if you tried to book an appointment but you received a message saying, “Your appointment has been denied and, by law, we do not have to explain why”? I hope that we would have enough common sense to ensure that there is human intervention immediately.
I realise that the noble Lord, Lord Clement-Jones, has a Private Member’s Bill on this issue—I am sorry that I have not been able to take part in those debates—but, for this Bill, I hope that the two examples I have just shared illustrate the point that I know many noble Lords are trying to make in our debate on this group of amendments. I look forward to the response from the Minister.
I thank all noble Lords who have spoken. I must confess that, of all the groups we are looking at today, I have been particularly looking forward to this one. I find this area absolutely fascinating.
Let me begin in that spirit by addressing an amendment in my name and that of my noble friend Lord Markham; I ask the Government and all noble Lords to give it considerable attention. Amendment 111 seeks to insert the five principles set out in the AI White Paper published by the previous Government and to require all those participating in ADM—indeed, all forms of AI—to have due regard to them. They are:
“safety, security and robustness, appropriate transparency and explainability, fairness, accountability and governance, and contestability and redress”.
These principles for safe AI are based on those originally developed with the OECD and have been the subject of extensive consultation. They have been refined and very positively received by developers, public sector organisations, private sector organisations and civil society. They offer real safeguards against the risks of AI while continuing to foster innovation.
I will briefly make three points to commend their inclusion in the Bill, as I have described. First, the Bill team has argued throughout that these principles are already addressed by the principles of data protection and so are covered in the Bill. There is overlap, of course, but I do not agree that they are equivalent. Data protection is a significant concern in AI but the risks and, indeed, the possibilities of AI go far further than data protection. We simply cannot entrust all our AI risks to data protection principles.
Secondly, I think the Government will point to their forthcoming AI Bill and suggest that we should wait for that before we move significantly on AI. However, in practice all we have to go on about that Bill—I recognise that Ministers cannot describe much of it now—is that it will focus on the largest AI labs and the largest models. I assume it will place existing voluntary agreements on a statutory footing. In other words, we do not know when the Bill is coming, but this approach will allow a great many smaller AI fish to slip through the net. If we want to enshrine principles into law that cover all use of AI here, this may not quite be the only game in town, but it is certainly the only all-encompassing, holistic one likely to have a positive impact. I look forward to the Minister's comments on this point.
The Secretary of State can help describe specific cases in the future but, on the point made by my noble friend Lord Knight, the ICO guidance will clarify some of that. There will be prior consultation with the ICO before that guidance is finalised, but if noble Lords are in any doubt about this, I am happy to write and confirm that in more detail.
Amendment 115 in the names of the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Knight, and Amendment 123A in the name of the noble Lord, Lord Holmes, seek to ensure that individuals are provided with clear and accessible information about solely automated decision-making. The safeguards set out in Clause 80, alongside the wider data protection framework’s safeguards, such as the transparency principle, already achieve this purpose. The UK GDPR requires organisations to notify individuals about the existence of automated decision-making and provide meaningful information about the logic involved in a clear and accessible format. Individuals who have been subject to solely automated decisions must be provided with information about the decisions.
On Amendment 116 in the names of the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, I reassure noble Lords that Clause 69 already provides a definition of consent that applies to all processing under the law enforcement regime.
On Amendment 117 in the names of the noble Viscount, Lord Camrose, the noble Lord, Lord Markham, and my noble friend Lord Knight, I agree with them on the importance of protecting the sensitive personal data of children processed by law enforcement agencies, and there is extensive guidance on this issue. However, consent is rarely used as the basis for processing law enforcement data. Other law enforcement purposes, such as the prevention, detection and investigation of crime, are quite often used instead.
I will address Amendment 118 in the name of the noble Viscount, Lord Camrose, and Amendment 123B in the name of the noble Lord, Lord Holmes, together, as they focus on obtaining human intervention for a solely automated decision. I agree that human intervention should be carried out competently and by a person with the authority to correct a wrongful outcome. However, the Government believe that there is currently no need to specify the qualifications of human reviewers as the ICO’s existing guidance explains how requests for human review should be managed.
Does the Minister agree that the crux of this machinery is solely automated decision-making as a binary thing—it is or it is not—and, therefore, that the absolute key to it is making sure that the humans involved are suitably qualified and finding some way to do so, whether by writing a definition or publishing guidelines?
On the question of qualification, the Minister may wish to reflect on the broad discussions we have had in the past around certification and the role it may play. I gently take her back to what she said on Amendment 123A about notification. Does she see notification as the same as a personalised response to an individual?
My Lords, I had expected the noble Baroness, Lady Owen of Alderley Edge, to be in the Room at this point. She is not, so I wish to draw the Committee’s attention to her Amendment 210. On Friday, many of us were in the Chamber when she made a fantastic case for her Private Member’s Bill. It obviously dealt with a much broader set of issues but, as we have just heard, the overwhelming feeling of the House was to support her. I think we would all like to see the Government wrap it up, put a bow on it and give it to us all for Christmas. But, given that that was not the indication we got, I believe that the noble Baroness’s intention here is to deal with the fact that the police are giving phones and devices back to perpetrators with the images remaining on them. That is an extraordinary revictimisation of people who have been through enough. So, whether or not this is the exact wording or way to do it, I urge the Government to look on this carefully and positively to find a way of allowing the police the legal right to delete data in those circumstances.
My Lords, none of us can be under any illusion about the growing threats of cyberattacks, whether from state actors, state-affiliated actors or criminal gangs. It is pretty unusual nowadays to find someone who has not received a phishing email, had hackers target an account or been promised untold riches by a prince from a faraway country. But, while technology has empowered these criminals, it is also the most powerful tool we have against them. To that end, we must do all we can to assist the police, the NCA, the CPS, the SIS and their overseas counterparts in countries much like our own. That said, we must also balance this assistance with the right of individuals to privacy.
Regarding the Clause 81 stand part notice from the noble Lord, Lord Clement-Jones, I respectfully disagree with this suggestion. If someone within the police were to access police records in an unauthorised capacity or for malign reasons, I simply doubt that they would be foolish enough to enter their true intentions into an access log. They would lie, of course, rendering the log pointless, so I struggle to see—we had this debate on the DPDI Bill—how this logging system would help the police to identify unauthorised access to sensitive data. It would simply eat up hours of valuable police time. I remember from our time working on the DPDI Bill that the police supported this view.
As for Amendment 124, which allows for greater collaboration between the police and the CPS when making charging decisions, there is certainly something to be said for this principle. If being able to share more detailed information would help the police and the CPS come to the best decision for victims, society and justice, then I absolutely support it.
Amendments 126, 128 and 129 seek to keep the UK in close alignment with the EU regarding data sharing. EU alignment or non-alignment is surely a decision for the Government of the day alone. We should not look to bind a future Administration to the EU.
I understand that Amendment 127 looks to allow data transfers to competent authorities—that is, law enforcement bodies in other countries—that may have a legitimate operating need. Is this not already the case? Are there existing provisions in the Bill to facilitate such transfers and, if so, does this not therefore duplicate them? I would very much welcome the thoughts of both the Minister and the noble Lord, Lord Clement-Jones, when he sums up at the end.
Amendment 156A would add to the definition of “unauthorised access” so that it includes instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for the access, and the person is not empowered to access it by an enactment. Given the amount of valuable personal data held by controllers as our lives continue to move online, there is real merit to this idea from my noble friend Lord Holmes, and I look forward to hearing the views of the Minister.
Finally, I feel Amendment 210 from my noble friend Lady Owen—ably supported in her unfortunate absence by the noble Baroness, Lady Kidron—is an excellent amendment as it prevents a person convicted of a sexual offence from retaining the images that breached the law. This will prevent them from continuing to use the images for their own ends and from sharing them further. It would help the victims of these crimes regain control of these images which, I hope, would be of great value to those affected. I hope that the Minister will give this serious consideration, particularly in light of noble Lords’ very positive response to my noble friend’s Private Member’s Bill at the end of last week.
I think the noble Viscount, Lord Camrose, referred to Amendment 156A from the noble Lord, Lord Holmes—I think he will find that is in a future group. I saw the Minister looking askance because I doubt whether she has a note on it at this stage.
I thank the noble Lord, Lord Clement-Jones; let me consider it a marker for future discussion.
I thank the noble Lord, Lord Clement-Jones, for coming to my rescue there.
I turn to the Clause 81 stand part notice tabled by the noble Lord, Lord Clement-Jones, which would remove Clause 81 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record their processing activities, including their reasons for accessing and disclosing personal information. Entering a justification manually was intended to help detect unauthorised access. The noble Lord was right that the police do sometimes abuse their power; however, I agree with the noble Viscount, Lord Camrose, that the reality is that anyone accessing the system unlawfully is highly unlikely to record that, making this an ineffective safeguard.
Meanwhile, the position of the National Police Chiefs’ Council is that this change will not impede any investigation concerning the unlawful processing of personal data. Clause 81 does not remove the strong safeguards that ensure accountability for data use by law enforcement; these include the requirement to record the time and date of access and, where possible, who has accessed the data, which are far more effective in monitoring potential data misuse. We would argue that the requirement to manually record a justification every time case information is accessed places a considerable burden on policing. I think the noble Lord himself said that we estimate that this clause may save approximately 1.5 million policing hours, equivalent to a saving in the region of £42.8 million a year.
These four technical government amendments do not, we believe, have a material policy effect but will improve the clarity and operation of the Bill text.
Amendment 133 amends Section 199 of the Investigatory Powers Act 2016, which provides a definition of “personal data” for the purposes of bulk personal datasets. This definition cross-refers to Section 82(1) of the Data Protection Act 2018, which is amended by Clauses 88 and 89 of the Bill, providing for joint processing by the intelligence services and competent authorities. This amendment will retain the effect of that cross-reference to ensure that processing referred to in Section 199 of the IPA remains that done by an intelligence service.
Amendment 136 concerns Clause 92 and ICO codes of practice. Clause 92 establishes a new procedure for panels to consider ICO codes of practice before they are finalised. It includes a regulation-making power for the Secretary of State to disapply or modify that procedure for particular codes or amendments to them. Amendment 136 will enable the power to be used to disapply or modify the panel’s procedure for specific amendments or types of amendments to a code, rather than for all amendments to it.
Finally, Amendments 213 and 214 will allow for changes made to certain immigration legislation and the Online Safety Act 2023 by Clauses 55, 122 and 123 to be extended via existing powers in those Acts, exercisable by Orders in Council, to Guernsey and the Isle of Man, should they seek this.
I beg to move.
My Lords, I will keep my comments brief as these are all technical amendments to the Bill. I understand that Amendments 133 and 136 are necessary for the functioning of the law and therefore have no objection. As for Amendment 213, extending immigration legislation amended by Clause 55 of this Bill to the Bailiwick of Guernsey or the Isle of Man, this is a sensible measure. The same can be said for Amendment 214, which extends the provision of the Online Safety Act 2023, amended by this Bill, to the Bailiwick of Guernsey or the Isle of Man.
My Lords, given the hour, I will try to be as brief as possible. I will start by speaking to the amendments tabled in my name.
Amendment 142 seeks to prevent the Information Commissioner’s Office sending official notices via email. Official notices from the ICO will not be trivial: they relate to serious matters of data protection, such as monetary penalty notices or enforcement notices. My concern is that it is all too easy for an email to be missed. An email may be filtered into a spam folder, where it sits for weeks before being picked up. It is also possible that an email may be sent to a compromised email address, meaning one that the holder has lost control of due to a hacker. These concerns led me also to table Amendment 143, which removes the assumption that a notice sent by email had been received within 48 hours of being sent.
Additionally, I suspect I am right in saying that a great many people expect official correspondence to arrive via the post. I wonder, therefore, whether there might be a risk that people ignore an unexpected email from the ICO, concerned that it might well be a scam or a hack of some description. I, for one, am certainly deeply suspicious of unexpected but official-looking messages that arrive. I believe that official correspondence which may have legal ramifications should really be sent by post.
On some of the other amendments tabled, Amendment 135A, which seeks to introduce a measure from the DPDI Bill, makes provision for the introduction of a statement of strategic priorities by the Secretary of State that sets out the Government’s data protection priorities, to which the commissioner must have regard, and the commissioner’s duties in relation to the statement. Although I absolutely accept that this measure would create more alignment and efficiency in the way that data protection is managed, I understand the concerns that it would undermine the independence of the Information Commissioner’s Office. That in itself, of course, would tend to bear on the adequacy risk.
I do not support the stand part notices on Clauses 91 and 92. Clause 91 requires the Information Commissioner to prepare codes of practice for the processing of data, which seems a positive measure. It provides guidance to controllers, helping them to follow best practice when processing data, and is good for data subjects, as it is more likely that their data will be processed in an appropriate manner. As for Clause 92, which would effectively increase expert oversight of codes of practice, surely that would lead to more effective codes, which will benefit both controllers and data subjects.
I have some concerns about Amendment 144, which limits the Information Commissioner to sending only one reprimand to a given controller during a fixed period. If a controller or processor conducts activities that infringe the provisions of the GDPR and does so repeatedly, why should the commissioner be prevented from issuing reprimands? Indeed, what incentive does that give people to commit a minor sin first and then a major one later?
I welcome Amendment 145, in the name of the noble Baroness, Lady Kidron, which would ensure that the ICO’s annual report records activities and action taken by the ICO in relation to children. This would clearly give the commissioner, parliamentarians and the data and tech industry as a whole a better understanding of how policies are affecting children and what changes may be necessary.
Finally, I turn my attention to many of the amendments tabled by the noble Lord, Lord Clement-Jones, which seek to remove the involvement of the Secretary of State from the functions of the commissioner and transfer the responsibility from government to Parliament. I absolutely understand the arguments the noble Lord advances, as persuasively as ever, but I am concerned even so that the Secretary of State for the relevant department is the best person to work with the commissioner to ensure both clarity of purpose and rapidity of decision-making.
I wanted to rise to my feet in time to stop the noble Viscount leaping forward as he gets more and more excited as we reach—I hope—possibly the last few minutes of this debate. I am freezing to death here.
I wish only to add my support to the points of the noble Baroness, Lady Kidron, on Amendment 145. It is a much overused saw, but if it is not measured, it will not get reported.