Data (Use and Access) Bill [Lords] Debate
There are a few issues with new clause 1. One is the scope in terms of the definition of networking services and ensuring platforms such as WhatsApp are not captured within it. Looking at new clause 19, there are challenges to implementation in this area. There is no point in clicking our fingers and saying, “Let’s change the age of digital consent,” without understanding the barriers to implementation, and without understanding whether age verification can work in this context. We do not want to create a system and have people just get around it quite simply. We need the Government to do the work in terms of setting it up so that we can move towards a position of raising the age from 13 to 16.
The press have obviously been briefed by Conservatives that the Conservatives are pushing for a ban on social media for under-16s, but it seems that what is actually being suggested is a review of the digital age of consent with a view to perhaps increasing it to 16. The two positions are very different, and I wonder whether the tough talk in the press matches what is actually being proposed by the Opposition today.
I have been very clear on this, and it is important in such a complex area to look at the detail and nuance of the challenges around—[Interruption.] Well, it is very easy to create a new clause where we click our fingers and say, “Let’s make this more illegal; let’s bring in x, y or z restriction.” As a responsible Opposition, we are looking at the detail and complexities around implementing something like this. [Interruption.] I have been asked a few questions and the hon. Member for Cheltenham (Max Wilkinson) might want to listen to the rationale of our approach.
One question is how to define social media. Direct messaging services such as WhatsApp and platforms such as YouTube fall in the scope of social media. There are obviously social media platforms that I think all of us are particularly concerned about, including Snapchat and TikTok, but by changing the age of digital consent we do not want to end up capturing lower-risk social media platforms that we recognise are clearly necessary or beneficial, such as education technology or health technology platforms. And that is before we start looking at whether age verification can work, particularly in the 13-to-16 age group.
I think that was wishful thinking by the Minister in this debate.
Our new clause says that we need to look at the desirability of raising the digital age of consent for data processing from 13 to 16, particularly in terms of its impact on issues such as the social and educational development of children, and at the viability of doing so in the light of the shaking out of the Online Safety Act and the state of age verification services. Should there be no evidence to demonstrate that it is unnecessary, we would then raise the digital age of consent from 13 to 16. It might be the case that, over the next six months, the shaking out of the Online Safety Act demonstrates that this intervention is not necessary. Perhaps concerns around particular high-risk social media platforms will change as technology evolves. We are saying that the Government should do the work with a view to raising the age in 18 months unless there is evidence to prove the contrary. [Interruption.] I have made this crystal clear, and if the Minister would care to look at the new clause, rather than chuckling away in the corner, he might see the strategy we are proposing.
I thank the shadow Minister for giving way. As ever, he is extremely polite in his presentation and in his dealing with interventions, but I am not sure that he dealt with my intervention, which was basically asking whether the Conservative party position is as it has briefed to the press—that it wishes to ban social media for under-16s—or that it wishes to have a review on raising the age of data consent. It cannot be both.
I say again that the position is that, following a careful look at the evidence regarding the desirability and validity of doing so—taking into account findings regarding the impact and implementation of the Online Safety Act and age verification and how one defines social media, particularly high-risk platforms—unless there is direct evidence to show that raising the age from 13 to 16 is unnecessary, which there may be, then we should raise it from 13 to 16. If that has not provided clarity, the hon. Gentleman is very welcome to intervene on me again and I will try and explain it a third time, but I think Members have got a grasp now.
This new clause will also tackle some of the concerns at the heart of the campaign for Jools’ law, and I pay tribute to Ellen Roome for her work in this area. I am very sympathetic to the tragic circumstances leading to this campaign and welcome the additional powers granted to coroners in the Bill, but I know that they do not fully address Ellen Roome’s concerns. The Government need to explain how they can be sure that data will be retained in the context of these tragedies, so that a coroner is able to access it even if there are delays. If the Minister could provide an answer to that in his winding-up speech, and detail any further work in the area, that would be welcome.
On parental access to children’s data more broadly, there are difficult challenges in terms of article 8 rights on privacy and transparency, especially for children aged 16 to 17 as they approach adulthood. Our new clause addresses some of these concerns and would also put in place the groundwork to, de facto, raise the digital age of consent for inappropriate social media to 16 within 18 months, rendering the request for parental access to young teenage accounts obsolete.
I urge colleagues across the House to support all our amendments today as a balanced, proportionate and effective response to a generational challenge. The Bill and the votes today are an opportunity for our Parliament, often referred to as the conscience of our country, to make clear our position on some of the most pressing social and technological issues of our time.
Order. From the next speaker, there will be a five-minute time limit.
As many Members will be aware, my constituent Ellen Roome knows only too well the tragedies that can take place as a result of social media. I am pleased that Ellen joins us in the Gallery to hear this debate in her pursuit of Jools’ law.
In 2022, Ellen came home to find her son Jools not breathing. He had tragically lost his life, aged just 14. In the following months, Ellen battled the social media giants—and she is still battling them—to try to access his social media data, as she sought answers about what had happened leading up to his death. I am grateful to the shadow Minister, the hon. Member for Runnymede and Weybridge (Dr Spencer), for raising this in his speech. In her search for answers, Ellen found herself blocked by social media giants that placed process ahead of compassion. The police had no reason to suspect a crime, so they did not see any reason to undertake a full investigation into Jools’ social media. The inquest did not require a thorough analysis of Jools’ online accounts. None of the social media companies would grant Ellen access to Jools’ browsing data, and a court order was needed to access the digital data, which required eye-watering legal fees.
The legal system is unequipped to tackle the complexities of social media. In the past, when a loved one died, their family would be able to find clues among their possessions—perhaps in children’s diaries, in school books or in cupboards. However, now that so much of our lives is spent online, personal data is kept by the social media giants. New clause 11 in my name would change that, although I understand that there are technical and legal difficulties.
The Minister and the Secretary of State met Ellen and me this morning, along with the hon. Member for Darlington (Lola McEvoy), and we are grateful for the time they gave us. My new clause will not go to a vote today, but we will keep pushing because Ellen and other parents like her should not have to go through this to search for answers when a child has died. I understand that there are provisions in the Bill that will be steps forward, but we will keep pushing and we will hold the Government’s and all future Governments’ feet to the fire until we get a result.
It was great to meet this morning, although I am sorry it was so late and so close to Report stage; I wish it had been earlier. We were serious in the meeting this morning: we will do everything we possibly can to make sure that coroners understand both their powers and their duties in this regard, and how they should be operating with families and the prosecuting authorities as well if necessary. We will also do everything we can to ensure that the technology companies embrace the point that they need to look after the families of those who have lost loved ones when they are young.
I thank the Minister for his intervention. He is absolutely right. There are clear issues of process here. There are differential approaches across the country—different coroners taking different approaches and different police forces taking different approaches. The words of Ministers have weight and I hope that coroners and police forces are taking note of what needs to happen in the future so that there are proper investigations into the deaths of children who may have suffered misadventure as a result of social media.
On related matters, new clause 1 would gain the support of parents like Ellen up and down this country. We need to move further and faster on this issue of social media and online safety—as this Government promised on various other things—and I am pleased that my party has a very clear position on it.
I will now turn to the issue of copyright protections. I held a roundtable with creatives in Cheltenham, which is home to many tech businesses and AI companies. The creative industries in my town are also extremely strong, and I hear a lot of concern about the need to protect copyright for our creators. The industry, which is worth £124 billion or more every year, remains concerned about the Government’s approach. The effects of these issues on our culture should not be underestimated.
We would be far poorer both culturally and financially if our creatives were unable to make a living from their artistic talents. I believe there is still a risk of the creative industry being undermined if the Government remove protections to the benefit of AI developers. I trust that Ministers are listening, and I know that they have been listening over the many debates we have had on this issue. If they were to remove those protections, they would tip the scales in favour of AI companies at the cost of the creative industry. When we ask AI companies and people in tech where the jobs are going to come from, the answers are just not there.
The amendments tabled by my hon. Friend the Member for Harpenden and Berkhamsted (Victoria Collins) would reinstate copyright protections at all levels of AI development and reinforce the law as it currently stands. It is only fair that when creative work is used for AI development, the creator is properly compensated. The Government have made positive noises on this issue in multiple debates over the last few months. That is a positive sign, and I think that in all parts of this House we have developed a consensus on where things need to move—but creatives remain uneasy about the implications for their work and are awaiting firm action.
Ministers may wish to tackle this issue with future action, and I understand that it might not be dealt with today, but our amendments would enable that to happen. They also have an opportunity today: nothing would send a stronger signal than Government support and support from Members from across the House for my hon. Friend’s amendments, and I implore all Members to back them.
I rise to speak to new clauses 4, 16 and 17, but first let me say that this is a very ambitious and weighty piece of legislation. Most of us can agree on sections or huge chunks of it, but there is anxiety in the creative industries and in the media—particularly the local media, which have had a very torrid time over the last few years through Brexit and the pandemic. I thank UK Music, the News Media Association and Directors UK for engaging with me on this issue and the Minister for his generosity in affording time to Back Benchers to discuss it.
AI offers massive opportunities to make public services and businesses more effective and efficient, and this will improve people’s lives. However, there is a fundamental difference between using AI to manage stock in retail or distribution, or for making scientific breakthroughs that will improve people’s health, and the generative AI that is used to produce literature, images or music. The latter affects the creative industries, which have consistently seen faster and more substantial growth than the overall economy. Between 2010 and 2022, the creative industries’ gross value added grew by over 50% in real terms, while the overall UK economy grew by around a fifth. That is why the Government are right to have identified the creative industries as a central plank of their industrial strategy, and it is right to deliver an economic assessment within 12 months, as outlined in Government new clauses 16 and 17. I welcome all that.
I know it is not the Government’s intention to deal with copyright and licensing as part of the Bill, but because of the anxiety in the sector the issues have become conflated. Scraping is already happening, without transparency, permission or remuneration, in the absence of an adequate framework. The pace of change in the sector, and the risk of tariffs from across the pond, mean it is imperative that we deal with the threat posed to the creative industries as soon as possible. We are now facing 100% tariffs on UK films going to the USA, which increases that imperative.
I welcome the Government’s commitment to engage with the creative industries and to implement a programme to protect them, following consultation. I would welcome an overview from the Minister in his summing up about progress in that regard. The more we delay, the worse the impact could be on our creative sector. I am also concerned that in the Government’s correct mission to deliver economic growth, they may inadvertently compromise the UK’s robust copyright laws. Instead, we should seek to introduce changes, so that creatives’ work cannot be scraped by big AI firms without providing transparency or remunerating the creatives behind it. Failure to protect copyright is not just bad for the sector as a whole, or the livelihoods of authors, photographers, musicians and others; it is bad for our self-expression, for how robust the sector can be, and for how it can bring communities together and invite us to ask the big questions about the human condition. Allowing creators to be uncredited and undercut, with their work stripped of attribution and their livelihoods diluted in a wave of synthetic imitation, will disrupt the creative market enormously. We are not talking about that enough.
It is tempting to lure the big US AI firms into the UK, giving the economy a sugar rush and attracting billions of pounds-worth of data centres, yet in the same breath we risk significantly draining economic value from our creative industries, which are one of the UK’s most storied pillars of our soft power. None of this is easy. The EU has grappled with creating a framework to deal with this issue for years without finding an equitable solution. I do not envy what the Government must navigate. However, I ask the Minister about the reports that emerged over the weekend, and whether the Government are moving away from an opt-out system for licensing, which creatives say will not work. Will that now be the Government’s position?
Harnessing the benefits of AI—economic, social and innovative—is not diametrically opposed to ensuring that the rights of creatives are protected. We must ensure transparency in AI, as covered in new clause 4, so that tech companies, some of which are in cahoots with some of the more troubling aspects of the US Administration, do not end up with the power to curate an understanding of the world that reflects their own peculiar priorities. Big AI says transparency will effectively reveal its trade secrets, but that need not necessarily be the case, as my hon. Friend the Member for South Derbyshire (Samantha Niblett) said. A simple mechanism to alert creators when their content is used is well within the abilities of these sophisticated companies. They just need the Government to prod them to do it.
The Government are working hard. I know that they care passionately about the sector, and the economic and social value it brings. I look forward to hearing how they will now move at pace to address the concerns I have outlined, even if they cannot do so through the Bill.
I will try a third time, because we tried earlier. The Conservatives have clearly briefed the press that they are angling for a ban on social media for under-16s—it has been reported in multiple places. Can the shadow Minister confirm whether that is the Conservatives’ position or not?
For the fourth time, and as I have said, new clause 19 would effectively create a de facto position whereby there are restrictions on the use of inappropriate social media services by children. It seeks to tackle the challenges of implementation, age verification and the scope of social media. It says that there needs to be work to make sure that we can actually do so and that, when we can, we should move in that direction, unless there is overwhelming evidence that it is not needed, such as with the shaking out of the Online Safety Act.
Finally, I return to new clause 21. Sadly, it has been widely misrepresented. The laws in this area are clear: the Equality Act puts in place obligations in relation to protected characteristics. The Supreme Court says that “sex” means biological sex, and that public authorities must collect data on protected characteristics to meet their duties under the Equality Act. The new clause would put that clear legal obligation into effect, and build in data minimisation principles to preserve privacy. There would be no outing of trans people through the new clause, but where public authorities collect and use sex data, it would need to be biological sex data.