Data (Use and Access) Bill [Lords]

Ben Spencer Excerpts
Wednesday 7th May 2025


Commons Chamber
Madam Deputy Speaker (Nusrat Ghani)

I call the shadow Minister.

Dr Ben Spencer (Runnymede and Weybridge) (Con)

It is a privilege to respond to this debate on behalf of His Majesty’s official Opposition, and to speak to the new clauses and amendments. This is an ambitious piece of legislation, which will enable us to harness data—the currency of our digital age—and use it in a way that drives the economy and enhances the delivery of public services. Since its original inception under the Conservatives in the last Parliament, the Bill has also become the platform for tackling some of the most pressing social and technological issues of our time. Many of these are reflected in the amendments to the Bill, which are the subject of debate today.

I start with new clause 20. How do we regulate the interaction of AI models with creative works? I pay tribute to the work of many Members on both sides of this House, and Members of the other place, who have passionately raised creatives’ concerns and the risks posed to their livelihoods by AI models. Conservative Members are clear that this is not a zero-sum game. Our fantastic creative and tech industries have the potential to turbocharge economic growth, and the last Government rightly supported them. The creative and technology sectors need and deserve certainty, which provides the foundation for investment and growth. New clause 20 would achieve certainty by requiring the Government to publish a series of plans on the transparency of AI models’ use of copyrighted works, on removing market barriers for smaller AI market entrants and on digital watermarking, and, most important of all, a clear restatement of the application of copyright law to AI-modelling activities.

I cannot help but have a sense of déjà vu in relation to Government new clause 17: we are glad that the Government have taken up several of the actions we called for in Committee, but once again they have chosen PR over effective policy. Amid all the spin, the Government have in effect announced a plan to respond to their own consultation—how innovative!

What is starkly missing from the Government new clauses is a commitment to make it clear that copyright law applies to the use of creative content by AI models, which is the primary concern raised with me by industry representatives. The Government have created uncertainty about the application of copyright law to AI modelling through their ham-fisted consultation. So I offer the Minister another opportunity: will he formally confirm that copyright law applies to the use of creative works by AI models, and will he provide legal certainty and send a strong signal to our creative industries that they will not be asked to pay the price for AI growth?

Dr Spencer

I thank the Minister for making that statement at the Dispatch Box. As he knows, we need to have that formally, in writing, as a statement from the Government to make it absolutely clear, given that the consultation has muddied the waters.

Chris Bryant

I am sorry, but I said that in my speech, and I have said it several times in several debates previously.

Dr Spencer

I would therefore be grateful if the Minister said why there remains uncertainty among creatives about the application of copyright in this area. Is that not why we need to move this forward?

I now turn to Government amendment 34 and others. I congratulate my noble Friend Baroness Owen on the tremendous work she has done in ensuring that clauses criminalising the creation of and request for sexually explicit deepfake images have made it into the Bill. I also thank the Government for the constructive approach they are now taking in this area.

Chris Bryant

I should have said earlier that, as the shadow Minister knows, in Committee we changed the clause on “soliciting” to one on “requesting” such an image, because in certain circumstances soliciting may require the exchange of money. That is why we now have the requesting offence.

Dr Spencer

I thank the Minister for his clarification and reiteration of that point, and again for his work with colleagues to take forward the issue, on which I think we are in unison across the House.

New clause 21 is on directions to public authorities on recording of sex data. One does not need to be a doctor to know that data accuracy is critical, particularly when it comes to health, research or the provision of tailored services based on protected characteristics such as sex or age. The accuracy of data must be at the heart of this Bill, and nowhere has this been more high-profile or important than in the debate over the collection and use of sex and gender data. I thank the charity Sex Matters and the noble Lords Arbuthnot and Lucas for the work they have done to highlight the need for accurate data and its relevance for the digital verification system proposed in the Bill.

Samantha Niblett (South Derbyshire) (Lab)

The recent decision by the Supreme Court that “sex” in the Equality Act 2010 refers to biological sex at birth, regardless of whether someone holds a gender recognition certificate or identifies as of a different gender, has already left many trans people feeling hurt and unseen. Does the shadow Minister agree with me that any ID and digital verification service must consider trans people, not risk making them more likely to feel that their country is forgetting who they are?

Dr Spencer

I thank the hon. Member for her intervention, and I will shortly come on to the impact on all people of the decision of the Supreme Court. Our new clause’s focus and scope are simple. The Supreme Court ruling made it clear that public bodies must collect data on biological sex to comply with their duties under the Equality Act. The new clause ensures that this data is recorded and used correctly in accordance with the law. This is about data accuracy, not ideology.

New clause 21 is based in part on the work of Professor Alice Sullivan, who conducted a very important review, with deeply concerning findings on inaccurate data collection and the conflation of gender identity with biological sex data. She found people missed off health screening, risks to research integrity, inaccurate policing records and management through the criminal justice system, and many other concerns. These concerns present risks to everyone, irrespective of biological sex, gender identity or acquired gender. Trans people, like everyone else, need health screening based on their biological sex. Trans people need protecting from sexual predators, too, and they have the right to dignity and respect.

The Sullivan report shows beyond doubt that the concerns of the last Government and the current Leader of the Opposition were entirely justified. The Government have had Professor Sullivan’s report since September last year, but the Department for Science, Innovation and Technology has still not made a formal statement about it or addressed the concerns raised, which is even more surprising given its relevance to this Bill. The correction of public authority data on sex is necessary and urgent, but it is made even more critical by the implementation of the digital verification services in the Bill.

Tonia Antoniazzi (Gower) (Lab)

I appreciate that the shadow Minister is making an important point on the Sullivan review and the Supreme Court judgment, but there are conversations in Government and with Labour Members to ensure that the Supreme Court judgment and the Sullivan review are implemented properly across all Departments, and I hope to work with the Government on that.

Dr Spencer

I thank the hon. Member for her intervention, and for all the work that she and colleagues on both sides of the House are doing in this area. I hope that the findings of the Sullivan report are implemented as soon as possible, and part of that implementation would be made possible if Members across the House supported our new clause.

For the digital verification services to be brought in, it is important that the data used to inform them is accurate and correct. Digital verification could be used to access single-sex services, so it needs to be correct, and if sex and gender data are conflated, as we know they are in many datasets, a failure to act will bring in self-ID by the back door. To be clear, that has never been the legal position in the UK, and it would conflict with the ruling of the Supreme Court. Our new clause 21 is simple and straightforward. It is about the accurate collection and use of sex data, and rules to ensure that data is of the right standard when used in digital verification services so that single-sex services are not undermined.

New clause 19 is on the Secretary of State’s duty to review the age of consent for data processing under the UK GDPR. What can or should children be permitted to consent to when using or signing up to online platforms and social media? How do we ensure children are protected, and how do we prevent harms from the use of inappropriate social media itself, separate from the content provided? How do we help our children in a world where social media can import the school, the playground, the changing room, the influencer, the stranger, the groomer, the radical and the hostile state actor all into the family home?

Our children are the first generation growing up in the digital world, and they are exposed to information and weaponised algorithms on a scale that simply did not exist for their parents. In government, we took measures to improve protections and regulate harmful content online, and I am delighted to see those measures now coming into force. However, there is increasing evidence that exposure to inappropriate social media platforms is causing harm, and children as young as 13 may not be able to regulate and process this exposure to such sites in a safe and proportionate way.

I am sure every Member across the House will have been contacted by parents concerned about the impact of social media on their children, and we recognise that this is a challenging area to regulate. How do we define and target risky and inappropriate social media platforms, and ensure that education and health tech—or, indeed, closed direct messaging services—do not fall within scope? How effective are our provisions already, and can age verification be made to work for under-16s? What IDs are available to use? What will the impact of the Online Safety Act 2023 be now that it is coming into force? What are the lessons from its implementation, and where does it need strengthening? Finally, how do we support parents and teachers in educating and guiding children so they are prepared to enter the digital world at whatever age they choose and are able to do so?

The Government must take action to ensure appropriate safeguards are in place for our children, not through outright bans or blanket restrictions but with an evidence-based approach that takes into account the recent legal changes and need for effective enforcement, including age verification for under-16s. Too often in this place we focus on making more things illegal rather than on the reasons for lack of enforcement in the first place. There is no point in immediate restrictions if they cannot be implemented.

Munira Wilson (Twickenham) (LD)

I agree with all the points the shadow Minister is making about keeping our children safe online, so why does new clause 19 commit only to a review of the digital age of data consent, the age at which parental consent is no longer required, and of raising it from 13 to 16? Why does he not support the Liberal Democrats’ new clause 1 that would start to implement this change? We can still, through implementation, do all the things the hon. Gentleman proposes to do, so why the delay?

Dr Spencer

There are a few issues with new clause 1. One is the scope in terms of the definition of networking services and ensuring platforms such as WhatsApp are not captured within it. Looking at new clause 19, there are challenges to implementation in this area. There is no point in clicking our fingers and saying, “Let’s change the age of digital consent,” without understanding the barriers to implementation, and without understanding whether age verification can work in this context. We do not want to create a system and have people just get around it quite simply. We need the Government to do the work in terms of setting it up so that we can move towards a position of raising the age from 13 to 16.

Max Wilkinson (Cheltenham) (LD)

The press have obviously been briefed by Conservatives that the Conservatives are pushing for a ban on social media for under-16s, but it seems that what is actually being suggested is a review of the digital age of consent with a view to perhaps increasing it to 16. The two positions are very different, and I wonder whether the tough talk in the press matches what is actually being proposed by the Opposition today.

Dr Spencer

I have been very clear on this, and it is important in such a complex area to look at the detail and nuance of the challenges around—(Interruption.) Well, it is very easy to create a new clause where we click our fingers and say, “Let’s make this more illegal; let’s bring in x, y or z restriction.” As a responsible Opposition, we are looking at the detail and complexities around implementing something like this. [Interruption.] I have been asked a few questions and the hon. Member for Cheltenham (Max Wilkinson) might want to listen to the rationale of our approach.

One question is how to define social media. Direct messaging services such as WhatsApp and platforms such as YouTube fall in the scope of social media. There are obviously social media platforms that I think all of us are particularly concerned about, including Snapchat and TikTok, but by changing the age of digital consent we do not want to end up capturing lower-risk social media platforms that we recognise are clearly necessary or beneficial, such as education technology or health technology platforms. And that is before we start looking at whether age verification can work, particularly in the 13-to-16 age group.

Chris Bryant

Sorry, I am getting a bit lost. Does the Minister think, and does the Conservative party think, that the digital age of consent should rise from 13 to 16 or not?

Dr Spencer

rose—

Madam Deputy Speaker (Ms Nusrat Ghani)

Order. I point out to Mr Bryant that Dr Ben Spencer is the shadow Minister.

Dr Spencer

I think that was wishful thinking by the Minister in this debate.

Our new clause says that we need to look at the desirability of raising the digital age of consent for data processing from 13 to 16, particularly in terms of its impact on issues such as the social and educational development of children, but also at the viability of doing so in terms of the fallout and the shaking out of the Online Safety Act and with regard to age verification services. Should there be no evidence to demonstrate that it is unnecessary, we would then raise the digital age of consent from 13 to 16. It might be the case that, over the next six months, the shaking out of the Online Safety Act demonstrates that this intervention is not necessary. Perhaps concerns around particular high-risk social media platforms will change as technology evolves. We are saying that the Government should do the work with a view to raising the age in 18 months unless there is evidence to prove the contrary. [Interruption.] I have made this crystal clear, and if the Minister would choose to look at the new clause, rather than chuckling away in the corner, he might see the strategy we are proposing.

Max Wilkinson

I thank the shadow Minister for giving way. As ever, he is extremely polite in his presentation and in his dealing with interventions, but I am not sure that he dealt with my intervention, which was basically asking whether the Conservative party position is as it has briefed to the press—that it wishes to ban social media for under-16s—or that it wishes to have a review on raising the age of data consent. It cannot be both.

Dr Spencer

I say again that the position is that, following a careful look at the evidence regarding the desirability and validity of doing so—taking into account findings regarding the impact and implementation of the Online Safety Act, age verification and how one defines social media, particularly high-risk platforms—unless there is direct evidence to show that raising the age from 13 to 16 is unnecessary, which there may be, we should raise it from 13 to 16. If that has not provided clarity, the hon. Gentleman is very welcome to intervene on me again and I will try to explain it a third time, but I think Members have got a grasp now.

This new clause will also tackle some of the concerns at the heart of the campaign for Jools’ law, and I pay tribute to Ellen Roome for her work in this area. I am very sympathetic to the tragic circumstances leading to this campaign and welcome the additional powers granted to coroners in the Bill, but I know that they do not fully address Ellen Roome’s concerns. The Government need to explain how they can be sure that data will be retained in the context of these tragedies, so that a coroner will be able to access it even if there are delays. If the Minister could provide an answer to that in his winding-up speech, and detail any further work in the area, that would be welcome.

On parental access to children’s data more broadly, there are difficult challenges in terms of article 8 rights on privacy and transparency, especially for children aged 16 to 17 as they approach adulthood. Our new clause addresses some of these concerns and would also put in place the groundwork to, de facto, raise the digital age of consent for inappropriate social media to 16 within 18 months, rendering the request for parental access to young teenage accounts obsolete.

I urge colleagues across the House to support all our amendments today as a balanced, proportionate and effective response to a generational challenge. The Bill and the votes today are an opportunity for our Parliament, often referred to as the conscience of our country, to make clear our position on some of the most pressing social and technological issues of our time.

Madam Deputy Speaker (Ms Nusrat Ghani)

I call the Chair of the Science, Innovation and Technology Committee.

--- Later in debate ---
Madam Deputy Speaker (Caroline Nokes)

I call the shadow Minister.

Dr Spencer

It has been a pleasure to hear the speeches of Members from across the House. I pay tribute to my hon. Friend the Member for Gosport (Dame Caroline Dinenage) and my right hon. Friend the Member for Maldon (Sir John Whittingdale), who spoke with passion about the protection of copyright in AI. I suspect that my right hon. Friend is looking forward to seeing the back of the Bill, and hoping that it does not return in a future iteration. My right hon. Friend the Member for Chingford and Woodford Green (Sir Iain Duncan Smith) spoke of the importance of ensuring that data does not fall victim to hostile states and hostile state actors. My right hon. Friend the Member for East Hampshire (Damian Hinds) spoke with knowledge and authority about this important issue, and the challenges and practicalities involved in ensuring that we get it right for our children.

I will return to the three themes that we have put forward. The Minister has repeatedly given assurances on the application of copyright with regard to AI training, but the Secretary of State created uncertainty by saying in the AI copyright consultation:

“At present, the application of UK copyright law to the training of AI models is disputed.”

When we create that level of uncertainty, we need at least an equal level of clarity to make amends, and that is partly what our new clause 20 calls for: among other things, a formal statement from the Intellectual Property Office or otherwise. I do not see why it is a challenge for the Government to put that forward and deliver.

--- Later in debate ---
Victoria Collins

I would just like to clarify that we have thought long and hard about this Bill, along with many organisations and charities, to get it right.

Dr Spencer

That is good to hear.

Max Wilkinson

I will try a third time, because we tried earlier. The Conservatives have clearly briefed the press that they are angling for a ban on social media for under-16s—it has been reported in multiple places. Can the shadow Minister confirm whether that is the Conservatives’ position or not?

Dr Spencer

For the fourth time, and as I have said, new clause 19 would effectively create a de facto position whereby there are restrictions on the use of inappropriate social media services by children. It seeks to tackle the challenges of implementation, age verification and the scope of social media. It says that there needs to be work to make sure that we can actually do so and that, when we can, we should move in that direction, unless there is overwhelming evidence that it is not needed, such as with the shaking out of the Online Safety Act.

Finally, I return to new clause 21. Sadly, it has been widely misrepresented. The laws in this area are clear: the Equality Act puts in place obligations in relation to protected characteristics. The Supreme Court says that “sex” means biological sex, and that public authorities must collect data on protected characteristics to meet their duties under the Equality Act. The new clause would put that clear legal obligation into effect, and build in data minimisation principles to preserve privacy. There would be no outing of trans people through the new clause, but where public authorities collect and use sex data, it would need to be biological sex data.

Chris Bryant

As ever, it is good to see you in the Chair, Madam Deputy Speaker. I thank all right hon. and hon. Members who have taken part in the debate. If I do not manage to get to any of the individual issues that have been raised, and to which people want answers, I am afraid that is because of a shortness of time, and I will seek to write to them. I thank the officials who helped to put the Bill together, particularly Simon Weakley—not least because he worked not only on this Bill, but on all the previous versions in the previous Parliament. He deserves a long-service medal, if not something more important.

I will start with the issues around new clauses 1, 11, 12 and 13, and amendment 9. The Government completely share the concern about the vulnerability of young people online, which lots of Members have referred to. However, the age of 13 was set in the Data Protection Act 2018—I remember, because I was a Member at the time. It reflects what was considered at the time to be the right balance between enabling young people to participate online and ensuring that their data is protected. Some change to protecting children online is already in train. As of last month, Ofcom finalised the child safety codes, a key pillar of the Online Safety Act. Guidance published at the same time started a three-month period during which all in-scope services likely to be accessed by children will be required to assess the risk of harm their services pose to them.

From July, the Act will require platforms to implement measures to protect children from harm, and this is the point at which we expect child users to see a tangible, positive difference to their online experiences. I wish it had been possible for all this to happen earlier—I wish the Act had been in a different year—but it is the Act it is. The new provisions include highly effective age checks to prevent children encountering the most harmful content, and the adjustment of algorithms to reduce exposure to harmful content. Services will face tough enforcement from Ofcom if they fail to comply.

The Act very much sets the foundation for protecting children online. The Government continue to consider further options in pursuit of protecting children online, which is why the Department for Science, Innovation and Technology commissioned a feasibility study to understand how best to investigate the impact of smartphones and social media on children’s wellbeing. This will form an important part of our evidence base.