Data (Use and Access) Bill [Lords]

Debate between Victoria Collins and Samantha Niblett
Wednesday 7th May 2025


Commons Chamber
Victoria Collins

That has been my challenge to the tech companies, which I absolutely support in innovating and driving this—but if they are saying that it would be easy for creatives to do this, why is it not easy for big tech companies with power and resources to lead the way?

Amendments 41 to 44 would ensure that the decisions made about people, whether through data profiling, automated systems or algorithms, are fair. They would clarify that meaningful human involvement in automated decision making must be real, competent and capable of changing the outcome, not just a box-ticking exercise.

The amendments before us offer a clear choice to protect our children and creators or to continue to delay while harm grows—the choice to build a future in which technology either builds trust or destroys it. We have the evidence and the solutions, and the time for action is now. Let us choose a future in which technology empowers, rather than exploits—one that is good for society and for business. I urge all Members to support our amendments, which would put people and the wellbeing of future generations first.

Samantha Niblett

I am pleased to speak in this debate in support of new clause 14, in the name of my hon. Friend the Member for Leeds Central and Headingley (Alex Sobel), to which I have added my name. The clause would give our media and creative sectors urgently needed transparency over the use of copyright works by AI models. I am sure that my speech will come as no surprise to the Minister.

I care about this issue because of, not in spite of, my belief in the power of AI and its potential to transform our society and our economy for the better. I care because the adoption of novel technologies by businesses and consumers requires trust in the practices of firms producing the tech. I care about this issue because, as the Creative Rights in AI Coalition has said:

“The UK has the potential to be the global destination for generative firms seeking to license the highest-quality creative content. But to unlock that immense value, we must act now to stimulate a dynamic licensing market: the government must use this legislation to introduce meaningful transparency provisions.”

Although I am sure that the Government’s amendments are well meant, they set us on a timeline for change to the copyright framework that would take us right to the tail end of this Parliament. Many in this House, including myself, do not believe that an effective opt-out mechanism will ever develop; I know it is not in the Bill right now, but it was proposed in the AI and copyright consultation. Even if the Government insist on pursuing this route, it would be a dereliction of duty to fail to enforce our existing laws in the intervening period.

Big tech firms claim that transparency is not feasible, but that is a red herring. These companies are absolutely capable of letting rights holders know whether their individual works have been used, as OpenAI has been ordered to do in the Authors Guild v. OpenAI copyright case. Requiring transparency without the need for a court order will avoid wasting court time and swiftly establish a legal precedent, making the legal risk of copyright infringement too great for AI firms to continue with the mass theft that has taken place. That is why big tech objects to transparency, just as it objects to any transparency requirements, whether they are focused on online safety, digital competition or copyright. It would make it accountable to the individuals and businesses that it extracts value from.

The AI companies further argue that providing transparency would compromise their trade secrets, but that is another red herring. Nobody is asking for the specific recipe of how the models are trained: they are asking only to be able to query the ingredients that have gone into them. Generative AI models are made up of billions of data points, and it is the weighting of data that is a model’s secret sauce.

The Government can do myriad things around skills, access to finance, procurement practices and energy costs to support AI firms building and deploying models in the UK. They insist that they do not see the AI copyright debate as a zero-sum game, but trading away the property rights of 2.4 million UK creatives—70% of whom live outside London—to secure tech investment would be just that.

There are no insurmountable technical barriers to transparency, just as there are no workable opt-outs. The key barrier to transparency is the desire of tech firms to obscure their illegal behaviour. It has been shown that Meta employees proactively sought, in their own words,

“to remove data that is clearly marked as pirated/stolen”

from the data that they used from the pirate shadow library, LibGen. If they have the technical means to identify copyrighted content to cover their own backs, surely they have the technical means to be honest with creators about the use of their valuable work.

I say to the Minister, who I know truly cares about the future of creatives and tech businesses in the UK—that is absolutely not in question—that if he cannot accept new clause 14 as tabled, he should take the opportunity as the Bill goes back to the Lords to bring forward clauses that would allow him to implement granular transparency mechanisms in the next six to 12 months. I and many on the Labour Benches—as well as the entire creative industries and others who do not want what is theirs simply to be taken from them—stand ready to support the development of workable solutions at pace. It can never be too soon to protect the livelihoods of UK citizens, nor to build trust between creators and the technology that would not exist without their hard work.