Data (Use and Access) Bill [Lords]

Damian Hinds Excerpts
Wednesday 7th May 2025


Commons Chamber
Damian Hinds (East Hampshire) (Con)

Our nominal minimum age for social media usage in this country comes from a well-meaning piece of American legislation originally passed in 1998. The age did not have to be 13. Back in 1998 it was going to be 16, but it was changed to 13. With the birth of GDPR, the age did not have to be 13: the default was 16. Various countries, including Germany, the Netherlands and Ireland, selected 16, but we selected 13. That means that at the age of 13 people can sign up to social media, have their behaviour tracked for the purpose of targeting content and ads, start their own channel, have multiple IDs and make decisions about what details of their private life they share.

Many people believe that, because of brain development, 13 is too young to make some of those decisions reliably, and that there are real downsides, risks and dangers from the combination of social media and the ready availability of a handheld electronic device. For children, there are addictive features, an effect on sleep, an ease of making unwanted contact, rabbit holes to fall down and corrosive content that plays on the insecurity of adolescence.

Objections to raising the age to 16 are normally centred around worries that pro-social applications will be hit and that there will be unintended consequences, such as children not being able to seek help if they are in an abusive family, or to find information about contraception or whatever else they may need to know. Indeed, those were some of the reasons why, back in 1998, the age of 16 became 13, and those reasons came up again here in the debates over GDPR. As such, I worded new clause 12 to demonstrate how we could do it without losing anything, by having very broad categories of exemption. However, even with those exemptions, the Government would still be able to say—I am sure they will, and will say some of the same things about new clause 1 shortly—that new clause 12 is technically inadequate, worded badly and contains the wrong exemptions, and that there would be unintended consequences. New clause 19, though, which was tabled by the official Opposition, is almost impossible to argue against, because it contains the default position that these exemptions will change; under its provisions, those changes would be subject to review, which would ensure that all those considerations were taken into account.

--- Later in debate ---
Iqbal Mohamed

Does the right hon. Gentleman agree that self-regulation just does not work in many industries? We can look at sewage reporting in the water industry, or at the AI and tech companies, which will use our data and not tell the regulators that they are doing so. There is a real need to strengthen the regulation.

Damian Hinds

The hon. Gentleman tempts me to broaden the debate, which I do not think you would encourage me to do at this late stage, Madam Deputy Speaker. However, he makes a very important point about self-regulation in this sector. The public, parents, and indeed children look to us to make sure we have their best interests at heart.

The Online Safety Act may only say that age minima should be enforced “consistently” rather than well, but I do not think the will of this Parliament was that it would be okay to enforce a minimum age limit consistently badly. What we meant was that if the law says right now that the age minimum is 13, or if it is 16 in the future—or whatever other age it might be—companies should take reasonable steps to enforce it. There is more checking than there used to be, but it is still very limited. The recent 5Rights report on Instagram’s teen accounts said that all its avatars were able to get into social media with only self-reported birth dates and no additional checks. That means that many thousands of children under the nominal age of 13 are on social media, and that there are many more thousands who are just over 13 but who the platform thinks are 15, 16 or 17, or perhaps 18 or 19. That, of course, affects the content that is served to them.

Either Ofcom or the ICO could tighten up the rules on the minimum age, but amendment 9 would require that to happen in order for companies to be compliant with the ICO regulation. The technology does exist, although it is harder to implement at age 13 than at 18—of course, the recent Ofcom changes are all about those under the age of 18—but it is possible, and that technology will develop further. Ultimately, this is about backing parents who have a balance to strike: they want to make sure that their children are fully part of their friendship groups and can access all those opportunities, but also want to protect them from harm. Parents have a reasonable expectation that their children will be protected from wholly inappropriate content.

Caroline Voaden (South Devon) (LD)

I rise to speak to new clauses 1 and 11, and briefly to new clause 2. The Liberal Democrats believe that the Government have missed a trick by not including in this Bill stronger provisions on children’s online safety. It is time for us to start treating the mental health issues arising from social media use and phone addiction as a public health crisis, and to act accordingly.

We know that children as young as nine and 10 are watching hardcore, violent pornography. By the time they are in their teens, it has become so normalised that they think violent sexual acts such as choking are normal—it certainly was not when we were teenagers. Girls are starving themselves to achieve an unrealistic body image because their reality is warped by airbrushed images, and kids who are struggling socially are sucked in by content promoting self-harm and even suicide. One constituent told me, “I set up a TikTok account as a 13-year-old to test the horrors, and half a day later had self-harm content dominating on the feed. I did not search for it; it found me. What kind of hell is this? It is time we gave our children back their childhood.”

New clause 1 would help to address the addictive nature of endless content that reels children in and keeps them hooked. It would raise the minimum age for social media data processing from 13 to 16 right now, meaning that social media companies would not be able to process children’s data for algorithmic purposes. They would still be able to access social media to connect with friends and access relevant services, which is important, but the new clause would retain exceptions for health and educational purposes, so that children who were seeking help could still find it.

We know that there is a correlation between the growth in social media use among young people since 2012 and worsening mental health outcomes. Teachers tell me regularly that children are struggling to concentrate and stay awake because of lack of sleep. Some are literally addicted to their phones, with 23% of 13-year-old girls in the UK displaying problematic social media use. The evidence is before us. It is time to act now—not in 18 months and not in a couple of years. The addictive nature of the algorithm is pernicious, and as legislators we can do something about it by agreeing to new clause 1.

It is time to go further. This Bill does not do it, but it is time that we devised legislation to save the next generation of teenagers from the horrors of online harm. Ofcom’s new children’s code provides hope that someone ticking a box to say they are an adult will no longer be enough to allow access to adult sites. That is a good place to start; let us hope it works. If it does not, we need to take quick and robust action to move further with legislation.

Given the nature of the harms that exist online, I also support new clause 11 and strongly urge the Government to support it. No parent should have to go through the agony experienced by Ellen Roome. Losing a child is horrific enough, but being refused access to her son’s social media data to find out why he died was a second unacceptable agony. That must be changed, and all ISPs should be compelled to comply. New clause 11 would make that happen. I heard what the Minister said about coroners, but I strongly believe that legislation is needed, with a requirement to release data or provide access to their children’s account for any parent or guardian of someone under 18 who has died. There is, as far as I can see, no reason not to support this new clause.

Briefly, I echo calls from across the House to support new clause 2 in support of our creatives. Creativity is a uniquely human endeavour. Like others, I have been contacted by many creators who do not want their output stolen by AI companies without consent or permission. It is vital that AI companies comply with copyright legislation, which clearly has to be updated to meet the requirements of the brave new world of tech that we now live in.

--- Later in debate ---
Chris Bryant

As ever, it is good to see you in the Chair, Madam Deputy Speaker. I thank all right hon. and hon. Members who have taken part in the debate. If I do not manage to get to any of the individual issues that have been raised, and to which people want answers, I am afraid that is because of a shortness of time, and I will seek to write to them. I thank the officials who helped to put the Bill together, particularly Simon Weakley—not least because he not only did this Bill, but all the previous versions in the previous Parliament. He deserves a long-service medal, if not something more important.

I will start with the issues around new clauses 1, 11, 12 and 13, and amendment 9. The Government completely share the concern about the vulnerability of young people online, which lots of Members have referred to. However, the age of 13 was set in the Data Protection Act 2018—I remember, because I was a Member at the time. It reflects what was considered at the time to be the right balance between enabling young people to participate online and ensuring that their data is protected. Some change to protecting children online is already in train. As of last month, Ofcom finalised the child safety codes, a key pillar of the Online Safety Act. Guidance published at the same time started a three-month period during which all in-scope services likely to be accessed by children will be required to assess the risk of harm their services pose to them.

From July, the Act will require platforms to implement measures to protect children from harm, and this is the point at which we expect child users to see a tangible, positive difference to their online experiences. I wish it had been possible for all this to happen earlier—I wish the Act had been in a different year—but it is the Act it is. The new provisions include highly effective age checks to prevent children from encountering the most harmful content, and the adjusting of algorithms to reduce their exposure to harmful content. Services will face tough enforcement from Ofcom if they fail to comply.

The Act very much sets the foundation for protecting children online. The Government continue to consider further options in pursuit of protecting children online, which is why the Department for Science, Innovation and Technology commissioned a feasibility study to understand how best to investigate the impact of smartphones and social media on children’s wellbeing. This will form an important part of our evidence base.

Damian Hinds

Will the Minister give way?

Chris Bryant

I am going to come to the right hon. Member’s amendment in a moment.

The study is being led by Dr Amy Orben of Cambridge University, and it is supported by scientists from nine of the UK’s premier universities, all with established expertise in this field. The study will report to the Government this month on the existing evidence base, ongoing research and recommendations for future research that will establish any causal links between smartphones, social media and children’s wellbeing. The Government will publish the report along with the planned next steps to improve the evidence base in this area to support policy making. Considering the extra work we are doing, I hope Members will not press their amendments.

--- Later in debate ---
Damian Hinds

Very quickly, I want the Minister to confirm that the Ofcom children's codes to which he has referred are all about the 18 age threshold. They are a very welcome move to filter out wholly inappropriate content that is designed for over-18s, and other very harmful content, but they do not do anything for the initial threshold—the age minimum—at age 13.

Chris Bryant

The right hon. Member makes a fair point.