Data (Use and Access) Bill [HL] Debate
Lords Chamber

My Lords, I was IP Minister for nearly three years and I am a long-standing member of the APPG on IP. It is a great pleasure to speak from the Back Benches and to support the Motion in the name of the noble Baroness, Lady Kidron, and my noble friend Lord Camrose’s amendment.
What concerns me is that we are witnessing an assault on a sector worth £160 billion to the UK, as we have heard. Actually, I suspect that may be an underestimate, because IP and copyright are to be found in the nooks and crannies of so much of our life and our industry. There has been a lot of mention of music and media. Nobody has mentioned breeding and performance data on racehorses, information on art and antiques, or—close to my heart—the design, by young graduates, of gorgeous new clothing and fancy footwear of the kind that I wear. It is the small operators that are most at risk. That is why I am speaking today.
We are going too slowly. Amendments have been knocked back. The noble Baroness, Lady Kidron, has been trying her hardest, with a great deal of support from right across Britain. As time goes by, AI and LLMs are stealing more of our creativity, hitting UK growth. I believe that the Government must get on. It is not easy, but it is a challenge they have to rise to, and very quickly.
My Lords, I support Motion 49A from the noble Baroness, Lady Kidron. I will also address a claim that we have heard repeatedly in these debates: that transparency over AI training data is technically unfeasible. This claim, forcefully pushed by technology giants such as Google, is not only unsupported by evidence but deliberately misleading.
As someone with a long-standing background in the visual arts, and as a member of DACS—the Design and Artists Copyright Society—I have witnessed first-hand how creators’ works are being exploited without consent or compensation. I have listened carefully to the concerns expressed by the noble Lord, Lord Tarassenko, in both his email to colleagues today and the letter from entrepreneurs to the Secretary of State. Although I deeply respect their expertise and commitment to innovation, I must firmly reject their assessment, which echoes the talking points of trillion-dollar tech corporations.
The claims by tech companies that transparency requirements are technically unfeasible have been thoroughly debunked. The LAION dataset already meticulously documents over 5 billion images in granular detail, and services already operate over that dataset which allow individual rights holders to identify their own images within it. This demonstrates irrefutably that transparency at scale is not only possible but already practised when it suits corporate interests.
Let us be clear about what is happening: AI companies are systematically ingesting billions of copyrighted works without permission or payment, then claiming it would be too difficult to tell creators which works have been taken. This is theft on an industrial scale, dressed up as inevitable technological progress.
The claim from the noble Lord, Lord Tarassenko, that these amendments would damage UK AI start-ups while sparing US technology giants is entirely backwards. Transparency would actually level the playing field, benefiting innovative British companies while preventing larger firms from exploiting creative works without permission. I must respectfully suggest that concerns about potential harm to AI start-ups should be balanced against the devastating impact on our creative industries, thousands of small businesses and individual creators whose livelihoods depend on proper recognition and compensation for their work. Their continued viability depends fundamentally on protecting intellectual property rights. Without transparency, how can creators even begin to enforce those rights? The question answers itself.
This is not about choosing between technology and creativity; it is about ensuring that both sectors can thrive through fair collaboration based on consent and compensation. Transparency is not an obstacle to innovation; it is the foundation on which responsible, sustainable innovation is built.
Google’s preferred approach would reverse the fundamental basis of UK copyright law by placing an unreasonable burden on rights holders to opt out of having their work stolen. This approach is unworkable and would, effectively, legalise mass copyright theft to benefit primarily American technology corporations.
Rather than waiting for a consultation outcome that may take years, while creative works continue to be misappropriated, Motion 49A offers a practical step forward that would benefit both sectors while upholding existing law. I urge the House to support it.
My Lords, it has been a privilege to listen to today’s debate. The noble Baroness, Lady Kidron, really has opened the floodgates to expressions of support for human creativity. I thank her for tabling her Motion. I also thank the Minister for setting out the Government’s position and their support for the creative industries.
I suppose I straddle the world of AI and creativity as much as anybody in this House. I co-founded the All-Party Group on Artificial Intelligence and I have been a member of the All-Party Group on Intellectual Property for many years. That is reflected in my interests, both as an advisor to DLA Piper on AI policy and regulation, and as the newly appointed chair of the Authors’ Licensing and Collecting Society. I declare those interests, which are more than merely formal.
The subject matter of the amendments in this group is of profound importance for the future of our creative industries and the development of AI in the UK: the critical intersection of AI training and copyright law and, specifically, the urgent need for transparency. As the noble Baroness, Lady Kidron, described, the rapid development of AI, particularly large language models, relies heavily on vast volumes of data for training. This has brought into sharp focus the way copyright law applies to such activity. It was impossible to miss the letter over the weekend from some 400 leading creatives and media and creative business leaders urging support for her Motion 49A. Rights holders, from musicians and authors to journalists and visual artists, are rightly concerned about the use of their copyrighted material to train AI models, often without permission or remuneration, as we have heard. They seek greater control over their content, and remuneration when it is used for this purpose, alongside greater transparency.
Like others, I pay tribute to the noble Baroness, Lady Kidron, who has brilliantly championed the cause of creators and the creative industries throughout the passage of this Bill, tabling a series of crucial amendments. Her original amendments on Report, passed in this House but deleted by the Government in the Commons and then retabled there on Report by my honourable friends, aimed to make existing UK copyright law enforceable in the age of generative AI. The core argument behind Amendment 49B, which encapsulates the essence of the previous amendments, is that innovation in the AI field should not come at the expense of the individuals and industries creating original content.
The central plank of the noble Baroness’s proposals, and one these Benches strongly support, is the requirement for transparency from AI developers regarding the copyrighted material used in their training data. Her Amendment 49B specifically requires the Secretary of State to make regulations setting out strict transparency requirements for web crawlers and general-purpose AI models. This would include disclosing the identity and purpose of the crawlers used, identifying their owners and, crucially, keeping records of where and when copyrighted material is gathered. This transparency is vital for ensuring accountability and enabling copyright holders to identify potential infringements and enforce their rights.
The Minister described the process in the consultation on AI and copyright, published last December. That consultation proposed a text and data mining exception that would allow AI developers to train on material unless the rights holder expressly reserved their rights or opted out. The arguments against this proposed opt-out mechanism are compelling; they have been made by many noble Lords today and have been voiced by many outside, as we have heard. This mechanism shifts the burden on to creators to police the use of their work and actively opt out, placing an undue responsibility on them.
This approach undermines the fundamental principles of copyright, effectively rewarding the widespread harvesting or scraping of copyrighted material that has occurred without permission or fair remuneration. The Government’s proposed text and data mining exception, which they now appear to have stepped back from (as the noble Lord, Lord Brennan, asked, perhaps the Minister can clarify the Government’s position and confirm that that is indeed the case), risks harming the creative sectors for minimal gain to a small group of global tech companies and could erode public trust in the AI sector. As the noble Baroness observed, this approach is selling the creative industries down the river. The voluntary transparency measures proposed by the Government are insufficient; clear legal obligations are needed.