Data (Use and Access) Bill [Lords] Debate
Victoria Collins (Liberal Democrat - Harpenden and Berkhamsted)
Commons Chamber

I completely understand and, in large measure, share those concerns. We wanted to ensure, in this fast-changing world, that the creative industries in the United Kingdom could be remunerated for the work they had produced. We are not in the business of giving away other people’s work to third parties for nothing: that would be to sell our birthright for a mess of pottage, to use a term from an old translation of the Bible, and we are determined not to do it. As my hon. Friend—and several other Members—will have heard me say many times before, we would only proceed with the package of measures included in the consultation if we believed that we were advancing the cause of the creative industries in the UK, rather than putting them in danger or legal peril.
I think that some of the things I will say in a moment will be of assistance. We want to reach a point at which it is easier for the creative industries—whether they are large businesses with deep pockets and able to use lawyers, or very small individual photographers or painters—to assert and protect their rights, and to say, if they wish, “No, you cannot scrape my material for the purpose of large language model learning, unless you remunerate me.” That remuneration might happen via a collective licensing scheme, or it might happen individually. Either way, we want to get to more licensing rather than less. As, again, I have said several times at this Dispatch Box, we have looked at what has happened in the European Union and what is happening in the United States of America, and we believe that although the EU said that its package was designed to deliver more licensing, it has not led to more licensing or to more remuneration of the creative industries, and we want to avoid that pitfall.
As I have said, I take the concerns of the creative industries seriously, both as a DSIT Minister and as a DCMS Minister; of course I do. I agree—we, the Government, agree—that transparency is key. We want to see more licensing of content. We believe that the UK is a creative content superpower, and we want UK AI companies to flourish on the basis of high-quality data. I have spoken to a fair number of publishing companies, in particular UK companies such as Taylor & Francis, a largely academic publisher. As Members will know, the UK is the largest exporter of books in the world. Those companies are deliberately trying to get all their material licensed to AI companies, for two reasons: first, they want to be remunerated for the work that they have provided, and secondly, just as importantly, they want AI to come up with good answers. If you put dirty water into a pipe, dirty water will come out at the other end, and if you put good data into AI, good answers will come out of AI. That is an important part of why we want to ensure that we have strong AI based on high-quality data, and much of that is premium content from our creative industries.
We also agree that the Government must keep an open mind, and must take full account of the economic evidence. That is why we have tabled new clauses 16 and 17, which set out binding commitments to assess the impact of any and all proposals and to consider and report on the key areas raised in debate. That includes any and all of the options that were involved in the consultation that we published after the amendments were tabled in the House of Lords. As the Government take forward the commitments made by these amendments, they will consider all potential policy options. I must emphasise that the Government have not prejudged the outcome of the consultation, and take the need to consider and reflect on the best approach for all parties very seriously.
Members will, I am sure, have read new clause 17; it requires the Government to report on four matters. First, there is the issue of technical solutions that would enable copyright owners to control whether their copyright works could be used to develop AI.
Will the hon. Lady just let me finish this paragraph, because it might read better in Hansard? Actually, I have now added that bit, so it is ruined, and I might as well give way to her.
The question of technical solutions is very important, but my challenge is this. I have spoken to representatives of some of the big tech companies who are pushing for that, and who are saying that it is hard for them to do it at scale but creatives can do it. Why can the tech companies not be leading on an opt-in system for creatives? Let me hand that back to the Minister.
I should point out that the hon. Lady, as the spokesperson for the Liberal Democrat party, will be speaking very shortly.
I call the Liberal Democrat spokesperson.
Thank you for calling me, Madam Deputy Speaker, and for your patience regarding my earlier intervention. I am very passionate about all elements of the Bill.
On Second Reading, I said:
“Data is the new gold”—[Official Report, 12 February 2025; Vol. 762, c. 302.]
—a gold that could be harnessed to have a profound impact on people’s daily lives, and I stand by that. With exponential advances in innovation almost daily, this has never been truer, so we must get this right.
I rise today to speak to the amendments and new clauses tabled in my name specifically, and to address two urgent challenges: protecting children in our digital world and safeguarding the rights of our creative industry in the age of artificial intelligence. The Bill before us represents a rare opportunity to shape how technology serves people, which I firmly believe is good for both society and business. However, I stand here with mixed emotions: pride in the cross-party work we have accomplished, including with the other place; hope for the progress we can still achieve; but also disappointment that we must fight so hard for protections that should be self-evident.
New clause 1 seeks to raise the age of consent for social media data processing from 13 to 16 years old. We Liberal Democrats are very clear where we stand on this. Young minds were not designed to withstand the psychological assault of today’s social media algorithms. By raising the age at which children can consent to have their data processed by social media services, we can take an important first step towards tackling those algorithms at source. This is a common-sense measure, bringing us in line with many of our European neighbours.
The evidence before us is compelling and demands our attention. When I recently carried out a safer screens tour of schools across Harpenden and Berkhamsted to hear exactly what young people think about the issue, I heard that they are trapped in cycles of harmful content that they never sought out. Students spoke of brain rot and described algorithms that pushed them towards extreme content, despite their efforts to block it.
The evidence is not just anecdotal; it is overwhelming. Child mental health referrals have increased by 477% in just eight years, with nearly half of teenagers with problematic smartphone use reporting anxiety. One in four children aged 12 to 17 have received unwanted sexual images. We know that 82% of parents support Government intervention in this area, while a Liberal Democrat poll showed that seven in 10 people say the Government are not doing enough to protect children online.
I welcome new clause 1, tabled by my hon. Friend. Does she agree that raising the age of consent for processing personal data from 13 to 16 will help reduce the use of smartphones in schools by reducing their addictiveness, thereby also improving concentration and educational performance?
That is exactly what is at the heart of this matter—the data that drives that addictiveness and commercialises our children’s attention is not the way forward.
Many amazing organisations have gathered evidence in this area, and it is abundantly clear that the overuse of children’s data increases their risk of harm. It powers toxic algorithms that trap children in cycles of harmful content, recommender systems that connect them with predators, and discriminatory AI systems that are used to make decisions about them that carry lifelong consequences. Health Professionals for Safer Screens—a coalition of child psychiatrists, paediatricians and GPs—is pleading for immediate legislative action.
This is not a partisan issue. So many of us adults can relate to the feeling of being drawn into endless scrolling on our devices—I will not look around the Chamber too much. Imagine how much more difficult it is for developing minds. This is a cross-party problem, and it should not be political, but we need action now.
Let me be absolutely clear: this change is not about restricting young people’s digital access or opposing technology and innovation; it is about requiring platforms to design their services with children’s safety as the default, not as an afterthought. For years we have watched as our children’s wellbeing has been compromised by big tech companies and their profits. Our call for action is supported by the National Society for the Prevention of Cruelty to Children, 5Rights, Health Professionals for Safer Screens, Girlguiding, Mumsnet and the Online Safety Act Network. This is our chance to protect our children. The time to act is not 18 months down the line, as the Conservatives suggest, but now. I urge Members to support new clause 1 and take the crucial steps towards creating a digital world where children can truly thrive.
To protect our children, I have also tabled amendment 45 to clause 80, which seeks to ensure that automated decision-making systems cannot be used to make impactful decisions about children without robust safeguards. The Bill must place a child’s best interests at the heart of any such system, especially where education or healthcare are concerned.
We must protect the foundational rights of our creators in this new technological landscape, which is why I have tabled new clause 2. The UK’s creative industries contribute £126 billion annually to our economy and employ more than 2.3 million people—they are vital to our economy and our cultural identity. These are the artists, musicians, writers and creators who inspire us, define us and proudly carry British creativity on to the global stage. Yet today, creative professionals across the UK watch with mounting alarm as AI models trained on their life’s work generate imitations without permission, payment or even acknowledgment.
New clause 2 would ensure that operators of web crawlers and AI models comply with existing UK copyright law, regardless of where they are based. This is not about stifling innovation; it is about ensuring that innovation respects established rights and is good for everyone. Currently, AI companies are scraping creative works at an industrial scale. A single AI model may be trained on thousands of copyrighted works without permission or compensation.
The UK company Polaron is a fantastic example, creating AI technology to help engineers to characterise materials, quantify microstructural variation and optimise microstructural designs faster than ever before. Why do I bring up Polaron? It is training an AI model built from scratch without using copyright materials.
I am emphatically on the hon. Lady’s side in her intent to protect British creativity, but how does she respond to the implicit threat from artificial intelligence providers, in response to this and other elements of the Bill, effectively to deny AI to the UK if they find the regulations too difficult to deal with?
We have a thriving innovation sector in the UK, so those companies are not going anywhere—they want to work with the UK. We already have a fantastic creative industry, and we have innovation and business coming in. There are many ways to incentivise that. I talk a lot about money, skills and infrastructure—that is what these innovative companies are looking for. We can make sure the guardrails are right so that it works for everyone.
By ensuring that operators of web crawlers and AI models comply with existing UK copyright law, we are simply upholding established rights in a new technological context. The UK led the world in establishing trustworthy financial and legal services, creating one of the largest economies by taking a long-term view, and we can do the same with technology. By supporting new clause 2, we could establish the UK as a base for trustworthy technology while protecting our creative industries.
Finally, I will touch on new clause 4, which would address the critical gap in our approach to AI regulation: the lack of transparency regarding training data. Right now, creators have no way of knowing if their work has been used to train AI models. Transparency is the foundation of trust. Without it, we risk not only exploiting creators, but undermining public confidence in these powerful new technologies. The principle is simple: if an AI system is trained using someone’s creative work, they deserve to know about it and to have a say in how it is used. That is not just fair to creators, but essential for building an AI ecosystem that the public trust. By supporting new clause 4, we would ensure that the development of AI happens in the open, allowing for proper compensation, attribution and accountability. That is how we will build responsible AI that serves everyone, not just the tech companies.
On the point of transparency, I will touch briefly on a couple of other amendments. We must go further in algorithmic decision making. That is why I have tabled amendment 46, which would ensure that individuals receive personalised explanations in plain language when an automated decision system affects them. We cannot allow generic justifications to stand in for accountability.
I will support the hon. Lady’s new clause 2 tonight, if she pushes it to a vote, and I encourage her also to push new clause 4 to a vote. This is a most important issue. We must ensure that transparency is available to all artists and creators. Does she agree that there is no good technological barrier to having transparency in place right now?
That has been my challenge to the tech companies, which I absolutely support in innovating and driving this—but if they are saying that it would be easy for creatives to do this, why is it not easy for big tech companies with power and resources to lead the way?
Amendments 41 to 44 would ensure that the decisions made about people, whether through data profiling, automated systems or algorithms, are fair. They would clarify that meaningful human involvement in automated decision making must be real, competent and capable of changing the outcome, not just a box-ticking exercise.
The amendments before us offer a clear choice: to protect our children and creators, or to continue to delay while harm grows—the choice of whether we build a future in which technology builds trust or destroys it. We have the evidence and the solutions, and the time for action is now. Let us choose a future in which technology empowers, rather than exploits—one that is good for society and for business. I urge all Members to support our amendments, which would put people and the wellbeing of future generations first.
I am pleased to speak in this debate in support of new clause 14, in the name of my hon. Friend the Member for Leeds Central and Headingley (Alex Sobel), to which I have added my name. The clause would give our media and creative sectors urgently needed transparency over the use of copyright works by AI models. I am sure that my speech will come as no surprise to the Minister.
I care about this issue because of, not in spite of, my belief in the power of AI and its potential to transform our society and our economy for the better. I care because the adoption of novel technologies by businesses and consumers requires trust in the practices of firms producing the tech. I care about this issue because, as the Creative Rights in AI Coalition has said:
“The UK has the potential to be the global destination for generative firms seeking to license the highest-quality creative content. But to unlock that immense value, we must act now to stimulate a dynamic licensing market: the government must use this legislation to introduce meaningful transparency provisions.”
Although I am sure that the Government’s amendments are well meant, they set us on a timeline for change to the copyright framework that would take us right to the tail end of this Parliament. Many in this House, including myself, do not believe that an effective opt-out mechanism will ever develop; I know it is not in the Bill right now, but it was proposed in the AI and copyright consultation. Even if the Government insist on pursuing this route, it would be a dereliction of duty to fail to enforce our existing laws in the intervening period.
Big tech firms claim that transparency is not feasible, but that is a red herring. These companies are absolutely capable of letting rights holders know whether their individual works have been used, as OpenAI has been ordered to do in the Authors Guild v. OpenAI copyright case. Requiring transparency without the need for a court order will avoid wasting court time and swiftly establish a legal precedent, making the legal risk of copyright infringement too great for AI firms to continue with the mass theft that has taken place. That is why big tech objects to transparency, just as it objects to any transparency requirements, whether they are focused on online safety, digital competition or copyright. It would make it accountable to the individuals and businesses that it extracts value from.
The AI companies further argue that providing transparency would compromise their trade secrets, but that is another red herring. Nobody is asking for the specific recipe of how the models are trained: they are asking only to be able to query the ingredients that have gone into them. Generative AI models are made up of billions of data points, and it is the weighting of data that is a model’s secret sauce.
The Government can do myriad things around skills, access to finance, procurement practices and energy costs to support AI firms building and deploying models in the UK. They insist that they do not see the AI copyright debate as a zero-sum game, but trading away the property rights of 2.4 million UK creatives—70% of whom live outside London—to secure tech investment would be just that.
There are no insurmountable technical barriers to transparency in the same way that there are no opt-outs. The key barrier to transparency is the desire of tech firms to obscure their illegal behaviour. It has been shown that Meta employees proactively sought, in their own words,
“to remove data that is clearly marked as pirated/stolen”
from the data that they used from the pirate shadow library, LibGen. If they have technical means to identify copyright content to cover their own backs, surely they have the technical means to be honest with creators about the use of their valuable work.
I say to the Minister, who I know truly cares about the future of creatives and tech businesses in the UK—that is absolutely not in question—that if he cannot accept new clause 14 as tabled, he should take the opportunity as the Bill goes back to the Lords to bring forward clauses that would allow him to implement granular transparency mechanisms in the next six to 12 months. I and many on the Labour Benches—as well as the entire creative industries and others who do not want what is theirs simply to be taken from them—stand ready to support the development of workable solutions at pace. It can never be too soon to protect the livelihoods of UK citizens, nor to build trust between creators and the technology that would not exist without their hard work.
I would just like to clarify that we have thought long and hard about this Bill, along with many organisations and charities, to get it right.