Data (Use and Access) Bill [Lords]

Debate between Victoria Collins and David Davis
Wednesday 7th May 2025


Commons Chamber
Victoria Collins

That is exactly what is at the heart of this matter—the data that drives that addictiveness and commercialises our children’s attention is not the way forward.

Many amazing organisations have gathered evidence in this area, and it is abundantly clear that the overuse of children’s data increases their risk of harm. It powers toxic algorithms that trap children in cycles of harmful content, recommender systems that connect them with predators, and discriminatory AI systems that are used to make decisions about them that carry lifelong consequences. Health Professionals for Safer Screens—a coalition of child psychiatrists, paediatricians and GPs—is pleading for immediate legislative action.

This is not a partisan issue. So many of us adults can relate to the feeling of being drawn into endless scrolling on our devices—I will not look around the Chamber too much. Imagine how much more difficult it is for developing minds. This is a cross-party problem, and it should not be political, but we need action now.

Let me be absolutely clear: this change is not about restricting young people’s digital access or opposing technology and innovation; it is about requiring platforms to design their services with children’s safety as the default, not as an afterthought. For years we have watched as our children’s wellbeing has been compromised by big tech companies and their profits. Our call for action is supported by the National Society for the Prevention of Cruelty to Children, 5Rights, Health Professionals for Safer Screens, Girlguiding, Mumsnet and the Online Safety Act network. This is our chance to protect our children. The time to act is not 18 months down the line, as the Conservatives suggest, but now. I urge Members to support new clause 1 and take the crucial steps towards creating a digital world where children can truly thrive.

To protect our children, I have also tabled amendment 45 to clause 80, which seeks to ensure that automated decision-making systems cannot be used to make impactful decisions about children without robust safeguards. The Bill must place a child’s best interests at the heart of any such system, especially where education or healthcare are concerned.

We must protect the foundational rights of our creators in this new technological landscape, which is why I have tabled new clause 2. The UK’s creative industries contribute £126 billion annually to our economy and employ more than 2.3 million people—they are vital to our economy and our cultural identity. These are the artists, musicians, writers and creators who inspire us, define us and proudly carry British creativity on to the global stage. Yet today, creative professionals across the UK watch with mounting alarm as AI models trained on their life’s work generate imitations without permission, payment or even acknowledgment.

New clause 2 would ensure that operators of web crawlers and AI models comply with existing UK copyright law, regardless of where they are based. This is not about stifling innovation; it is about ensuring that innovation respects established rights and is good for everyone. Currently, AI companies are scraping creative works at an industrial scale. A single AI model may be trained on thousands of copyrighted works without permission or compensation.

The UK company Polaron is a fantastic example, creating AI technology to help engineers to characterise materials, quantify microstructural variation and optimise microstructural designs faster than ever before. Why do I bring up Polaron? It is training an AI model built from scratch without using copyright materials.

David Davis

I am emphatically on the hon. Lady’s side in her intent to protect British creativity, but how does she respond to the implicit threat from artificial intelligence providers, in response to this and other elements of the Bill, effectively to deny AI to the UK if they find the regulations too difficult to deal with?

Victoria Collins

We have a thriving innovation sector in the UK, so those companies are not going anywhere—they want to work with the UK. We already have a fantastic creative industry, with innovation and business coming in, and there are many ways to incentivise that. I talk a lot about money, skills and infrastructure—that is what these innovative companies are looking for. We can make sure the guardrails are right so that it works for everyone.

By ensuring that operators of web crawlers and AI models comply with existing UK copyright law, we are simply upholding established rights in a new technological context. The UK led the world in establishing trustworthy financial and legal services, creating one of the largest economies by taking a long-term view, and we can do the same with technology. By supporting new clause 2, we could establish the UK as a base for trustworthy technology while protecting our creative industries.

Finally, I will touch on new clause 4, which would address the critical gap in our approach to AI regulation: the lack of transparency regarding training data. Right now, creators have no way of knowing if their work has been used to train AI models. Transparency is the foundation of trust. Without it, we risk not only exploiting creators, but undermining public confidence in these powerful new technologies. The principle is simple: if an AI system is trained using someone’s creative work, they deserve to know about it and to have a say in how it is used. That is not just fair to creators, but essential for building an AI ecosystem that the public trust. By supporting new clause 4, we would ensure that the development of AI happens in the open, allowing for proper compensation, attribution and accountability. That is how we will build responsible AI that serves everyone, not just the tech companies.

On the point of transparency, I will touch briefly on a couple of other amendments. We must go further in algorithmic decision making. That is why I have tabled amendment 46, which would ensure that individuals receive personalised explanations in plain language when an automated decision system affects them. We cannot allow generic justifications to stand in for accountability.