Data (Use and Access) Bill [Lords] Debate
Iqbal Mohamed (Independent - Dewsbury and Batley)
Commons Chamber

On the point about transparency, the law is the law—it already exists. However, the law can be enforced, and people can be punished, only if actions that break our current laws come to light. Does the right hon. Gentleman agree that this is another reason that new clause 2 is essential?
I completely agree. The hon. Gentleman has stated the case: in order to enforce the law, we have to know who is breaking it.
There are all sorts of legal actions already under way, but this issue is about the extent to which scraping is going on. I agree with the right hon. Member for Hayes and Harlington (John McDonnell) on the importance of newspapers and the press. The press face the particular problem of retrieval-augmented generation—a phrase I did not think I would necessarily be introducing—which is the use of live data, rather than historic data; if historic data is used, it often produces the wrong results. The big tech companies therefore rely on retrieval-augmented generation, which means using current live data—that which is the livelihood of the press. It is absolutely essential for publishers that they should know when their material is being used and that they should have the ability to license it or not, as they choose.
Does the right hon. Gentleman agree that self-regulation just does not work in many industries? We can look at sewage reporting in the water industry, or at the AI and tech companies, which will use our data and not tell the regulators that they are doing so. There is a real need to strengthen the regulation.
The hon. Gentleman tempts me to broaden the debate, which I do not think you would encourage me to do at this late stage, Madam Deputy Speaker. However, he makes a very important point about self-regulation in this sector. The public, parents, and indeed children look to us to make sure we have their best interests at heart.
The Online Safety Act may only say that age minima should be enforced “consistently” rather than well, but I do not think the will of this Parliament was that it would be okay to enforce a minimum age limit consistently badly. What we meant was that if the law says right now that the age minimum is 13, or if it is 16 in the future—or whatever other age it might be—companies should take reasonable steps to enforce it. There is more checking than there used to be, but it is still very limited. The recent 5Rights report on Instagram’s teen accounts said that all its avatars were able to get into social media with only self-reported birth dates and no additional checks. That means that many thousands of children under the nominal age of 13 are on social media, and that there are many more thousands who are just over 13 but who the platform thinks are 15, 16 or 17, or perhaps 18 or 19. That, of course, affects the content that is served to them.
Either Ofcom or the ICO could tighten up the rules on the minimum age, but amendment 9 would require that to happen in order for companies to be compliant with the ICO regulation. The technology does exist, although it is harder to implement at age 13 than at age 18—of course, the recent Ofcom changes are all about those under the age of 18—but it is possible, and that technology will develop further. Ultimately, this is about backing parents who have a balance to strike: they want to make sure that their children are fully part of their friendship groups and can access all those opportunities, but also want to protect them from harm. Parents have a reasonable expectation that their children will be protected from wholly inappropriate content.
I rise to speak to new clauses 1 and 11, and briefly to new clause 2. The Liberal Democrats believe that the Government have missed a trick by not including in this Bill stronger provisions on children’s online safety. It is time for us to start treating the mental health issues arising from social media use and phone addiction as a public health crisis, and to act accordingly.
We know that children as young as nine and 10 are watching hardcore, violent pornography. By the time they are in their teens, it has become so normalised that they think violent sexual acts such as choking are normal—it certainly was not when we were teenagers. Girls are starving themselves to achieve an unrealistic body image because their reality is warped by airbrushed images, and kids who are struggling socially are sucked in by content promoting self-harm and even suicide. One constituent told me, “I set up a TikTok account as a 13-year-old to test the horrors, and half a day later had self-harm content dominating on the feed. I did not search for it; it found me. What kind of hell is this? It is time we gave our children back their childhood.”
New clause 1 would help to address the addictive nature of endless content that reels children in and keeps them hooked. It would raise the minimum age for social media data processing from 13 to 16 right now, meaning that social media companies would not be able to process children’s data for algorithmic purposes. They would still be able to access social media to connect with friends and access relevant services, which is important, but the new clause would retain exceptions for health and educational purposes, so that children who were seeking help could still find it.
We know that there is a correlation between greater social media use among young people since 2012 and worsening mental health outcomes. Teachers tell me regularly that children are struggling to concentrate and stay awake because of lack of sleep. Some are literally addicted to their phones, with 23% of 13-year-old girls in the UK displaying problematic social media use. The evidence is before us. It is time to act now—not in 18 months and not in a couple of years. The addictive nature of the algorithm is pernicious, and as legislators we can do something about it by agreeing to new clause 1.
It is time to go further. This Bill does not do it, but it is time that we devised legislation to save the next generation of teenagers from the horrors of online harm. Ofcom’s new children’s code provides hope that someone ticking a box to say they are an adult will no longer be enough to allow access to adult sites. That is a good place to start; let us hope it works. If it does not, we need to take quick and robust action to move further with legislation.
Given the nature of the harms that exist online, I also support new clause 11 and strongly urge the Government to support it. No parent should have to go through the agony experienced by Ellen Roome. Losing a child is horrific enough, but being refused access to her son’s social media data to find out why he died was a second unacceptable agony. That must be changed, and all ISPs should be compelled to comply. New clause 11 would make that happen. I heard what the Minister said about coroners, but I strongly believe that legislation is needed, with a requirement to release data or provide access to their children’s account for any parent or guardian of someone under 18 who has died. There is, as far as I can see, no reason not to support this new clause.
Briefly, I echo calls from across the House to support new clause 2 in support of our creatives. Creativity is a uniquely human endeavour. Like others, I have been contacted by many creators who do not want their output stolen by AI companies without consent or permission. It is vital that AI companies comply with copyright legislation, which clearly has to be updated to meet the requirements of the brave new world of tech that we now live in.
I rise to confirm my agreement with new clauses 1 and 12, and I associate myself with the speech of the hon. Member for South Devon (Caroline Voaden). I have had several emails on the protection of copyrighted information and revenue streams for artists, including from Yvonne, who contacted me recently. It is essential that the creative arts and intellectual property are protected and that artists are properly compensated if their output is used in AI.
On new clauses 1 and 12, the case for raising the age of consent for data processing from 13 to 16 has been well made across the House, so I will not repeat the points made, but I will say that it is essential that we give our children their childhoods back. They need to be protected from the toxic content to which they are being exposed by social media and online.
New clauses 3 to 6 and new clause 14 would place transparency requirements on AI companies to report on what information and data they have used, from where, and with what permission. That is essential to holding the AI companies to account and to ensuring that content holders and data owners are informed and have adequate channels of redress for misuse of their information.
I am sure that new clause 7 was spoken about while I was out of the Chamber, but let me say now that the right for our citizens to use non-digital verification is key. My mother—who is in her late 60s, bless her—would not have a clue what to do if she did not have family to help her with her benefits claims, doctors’ prescriptions, appointments and so on. We cannot exclude millions of our citizens who may choose not to have smartphones and not to be exposed to toxic content online, or who are simply not tech-literate. I urge the Government to ensure that we do not exclude millions of our citizens. I also strongly support new clause 11, but I will defer to earlier speakers in that regard.
As for new clause 18, many constituents have written to me or spoken to me, expressing concern about sharing their NHS and other private data with third parties such as Palantir. It is essential for this new Government to adopt a posture of supporting ethical, transparent business practices for all suppliers who provide services in our country. We have already heard about the background of Palantir. I do not know how true this is, but some of my constituents believed, or had read, that during the Prime Minister’s first visit to the US, after meeting Donald Trump he visited Palantir’s headquarters, or one of its offices. I urge the Government to protect—
Order. The hon. Gentleman’s time is up.