Data (Use and Access) Bill [Lords] Debate
Freddie van Mierlo (Liberal Democrat - Henley and Thame)
Commons Chamber
Thank you for calling me, Madam Deputy Speaker, and for your patience regarding my earlier intervention. I am very passionate about all elements of the Bill.
On Second Reading, I said:
“Data is the new gold”—[Official Report, 12 February 2025; Vol. 762, c. 302.]
—a gold that could be harnessed to have a profound impact on people’s daily lives, and I stand by that. With exponential advances in innovation almost daily, this has never been truer, so we must get this right.
I rise today to speak to the amendments and new clauses tabled in my name specifically, and to address two urgent challenges: protecting children in our digital world and safeguarding the rights of our creative industry in the age of artificial intelligence. The Bill before us represents a rare opportunity to shape how technology serves people, which I firmly believe is good for both society and business. However, I stand here with mixed emotions: pride in the cross-party work we have accomplished, including with the other place; hope for the progress we can still achieve; but also disappointment that we must fight so hard for protections that should be self-evident.
New clause 1 seeks to raise the age of consent for social media data processing from 13 to 16 years old. We Liberal Democrats are very clear where we stand on this. Young minds were not designed to withstand the psychological assault of today’s social media algorithms. By raising the age at which children can consent to have their data processed by social media services, we can take an important first step towards tackling those algorithms at source. This is a common-sense measure, bringing us in line with many of our European neighbours.
The evidence before us is compelling and demands our attention. When I recently carried out a safer screens tour of schools across Harpenden and Berkhamsted to hear exactly what young people think about the issue, I heard that they are trapped in cycles of harmful content that they never sought out. Students spoke of brain rot and described algorithms that pushed them towards extreme content, despite their efforts to block it.
The evidence is not just anecdotal; it is overwhelming. Child mental health referrals have increased by 477% in just eight years, with nearly half of teenagers with problematic smartphone use reporting anxiety. One in four children aged 12 to 17 have received unwanted sexual images. We know that 82% of parents support Government intervention in this area, while a Liberal Democrat poll showed that seven in 10 people say the Government are not doing enough to protect children online.
I welcome new clause 1, tabled by my hon. Friend. Does she agree that raising the age of consent for processing personal data from 13 to 16 will help reduce the use of smartphones in schools by reducing their addictiveness, thereby also improving concentration and educational performance?
That is exactly what is at the heart of this matter—the data that drives that addictiveness and commercialises our children’s attention is not the way forward.
Many amazing organisations have gathered evidence in this area, and it is abundantly clear that the overuse of children’s data increases their risk of harm. It powers toxic algorithms that trap children in cycles of harmful content, recommender systems that connect them with predators, and discriminatory AI systems that are used to make decisions about them that carry lifelong consequences. Health Professionals for Safer Screens—a coalition of child psychiatrists, paediatricians and GPs—is pleading for immediate legislative action.
This is not a partisan issue. So many of us adults can relate to the feeling of being drawn into endless scrolling on our devices—I will not look around the Chamber too much. Imagine how much more difficult it is for developing minds. This is a cross-party problem, and it should not be political, but we need action now.
Let me be absolutely clear: this change is not about restricting young people’s digital access or opposing technology and innovation; it is about requiring platforms to design their services with children’s safety as the default, not as an afterthought. For years we have watched as our children’s wellbeing has been compromised by big tech companies and their profits. Our call for action is supported by the National Society for the Prevention of Cruelty to Children, 5Rights, Health Professionals for Safer Screens, Girlguiding, Mumsnet and the Online Safety Act network. This is our chance to protect our children. The time to act is not 18 months down the line, as the Conservatives suggest, but now. I urge Members to support new clause 1 and take the crucial steps towards creating a digital world where children can truly thrive.
To protect our children, I have also tabled amendment 45 to clause 80, which seeks to ensure that automated decision-making systems cannot be used to make impactful decisions about children without robust safeguards. The Bill must place a child’s best interests at the heart of any such system, especially where education or healthcare are concerned.
We must protect the foundational rights of our creators in this new technological landscape, which is why I have tabled new clause 2. The UK’s creative industries contribute £126 billion annually to our economy and employ more than 2.3 million people—they are vital to our economy and our cultural identity. These are the artists, musicians, writers and creators who inspire us, define us and proudly carry British creativity on to the global stage. Yet today, creative professionals across the UK watch with mounting alarm as AI models trained on their life’s work generate imitations without permission, payment or even acknowledgment.
New clause 2 would ensure that operators of web crawlers and AI models comply with existing UK copyright law, regardless of where they are based. This is not about stifling innovation; it is about ensuring that innovation respects established rights and is good for everyone. Currently, AI companies are scraping creative works at an industrial scale. A single AI model may be trained on thousands of copyrighted works without permission or compensation.
The UK company Polaron is a fantastic example, creating AI technology that helps engineers to characterise materials, quantify microstructural variation and optimise microstructural designs faster than ever before. Why do I bring up Polaron? Because it is training an AI model built from scratch without using copyrighted materials.