Commons Chamber
Thank you, Madam Deputy Speaker. It is a pleasure to have heard the last few speeches, which made very important points, but even with five minutes, time is still short for me. I will speak briefly about a couple of aspects of social media and mobile phones.
On social media, let us get on with it. We have had this issue come back from the Lords multiple times, and we can do this. There is a glaring logical flaw at the heart of the Government’s argument for not taking action—we have also heard it from a bunch of Labour MPs today—which is, “We can’t do this one thing, because there are some other things we could do as well.” That just does not hold water. All those other things—around gaming, other types of applications, chatbots, addictive features and so on—could be additive to a ban on social media for children under the age of 16. They would still, by the way, be very relevant to child safety. I remind the House that our duty to children extends to those aged up to 18, as per the Children Act 1989 and our commitments to the United Nations.
There are issues to resolve about a ban—exactly where the lines should be drawn; exactly what is in and what is out—and yes, of course, the Government have to consult on those issues, but they do not need to consult further on the principle of whether the country and the House of Commons want a ban on young people under the age of 16 accessing social media, a conclusion that so many other countries are also coming to.
On mobile phones, throughout the progress of the Bill, I have found a remarkable contrast. The Government said for so long that they would not ban phones in schools because there should be some discretion for headteachers, but they are going to tell them precisely how many items of branded school uniform they are allowed to specify, and will tell them that in secondary schools that could include a tie, but in primary schools, for some bizarre reason, it cannot.
I am pleased that the Government have partly seen the light. The Minister, whom we all like and respect, said last week that the problem had already been solved—and presumably it has now been re-solved, as the Government have come back to the issue—but I have to say that that is not what children say. What children tell us, both informally and when they are answering surveys about the actual use of mobile phones in schools, is how often lessons get interrupted, teachers are filmed, and bullying and other stuff happens at break times and lunchtimes. We need to act. Of course, there can be individual exceptions for those using assistive and adaptive technology, for young carers, and for others, but the one exception that we must not have is on the type of ban.
The critical question is about having a policy of “not seen, not heard”. Every school in the country, pretty much, already has at least that, but I am afraid that it is not effective as a ban. If you have this thing in your pocket, or in your bag at your feet, it is still there, and you feel its presence. If it vibrates, you might actually feel it, physically; but even if you do not, you feel that compulsion towards it. The only way to make a school truly free of the scourge of mobile phones is to have them away from the child. The “not seen, not heard” approach does not work.
The main argument for saying that we have to allow “not seen, not heard” is about cost. I understand that. Pouches, which a couple of colleagues have mentioned, do have a cost, but we do not have to do pouches. There are other ways of doing this. I mentioned the Petersfield school in my constituency, which has a phones-away-from-children ban, and which uses a simple device—a plastic box that can be purchased in most large-format Swedish retailers. That is locked away in a cupboard, along with a number of other boxes, until the end of the day. The biggest cost has been the foam inserts, with numbered slots in which each child puts their phone, but the sum total cost is very reasonable.
I want to answer the hon. Member for Banbury (Sean Woodcock), who is no longer with us, so to speak. He asked why we had not taken this measure when we were in government. That is a perfectly reasonable question. There are two reasons: first, the issue has become more acute; and, secondly, the attitude of headteachers has changed. We have gone from headteachers and their representative bodies saying, “The best way for you to support me in this school is not to impose a national ban,” to them saying the exact opposite—that the best way to support schools and headteachers is to have a ban written into law.
I think that would try Madam Deputy Speaker’s patience. Today is the day that we can take action on those two points.
Commons Chamber
The hon. Gentleman tempts me to broaden the debate, which I do not think you would encourage me to do at this late stage, Madam Deputy Speaker. However, he makes a very important point about self-regulation in this sector. The public, parents, and indeed children look to us to make sure we have their best interests at heart.
The Online Safety Act may only say that age minima should be enforced “consistently” rather than well, but I do not think the will of this Parliament was that it would be okay to enforce a minimum age limit consistently badly. What we meant was that if the law says right now that the age minimum is 13, or if it is 16 in the future—or whatever other age it might be—companies should take reasonable steps to enforce it. There is more checking than there used to be, but it is still very limited. The recent 5Rights report on Instagram’s teen accounts said that all its avatars were able to get into social media with only self-reported birth dates and no additional checks. That means that many thousands of children under the nominal age of 13 are on social media, and that there are many more thousands who are just over 13 but who the platform thinks are 15, 16 or 17, or perhaps 18 or 19. That, of course, affects the content that is served to them.
Either Ofcom or the ICO could tighten up the rules on the minimum age, but amendment 9 would require that to happen in order for companies to be compliant with the ICO regulation. The technology does exist, although it is harder to implement at age 13 than at 18—of course, the recent Ofcom changes are all about those under the age of 18—but it is possible, and that technology will develop further. Ultimately, this is about backing parents who have a balance to strike: they want to make sure that their children are fully part of their friendship groups and can access all those opportunities, but also want to protect them from harm. Parents have a reasonable expectation that their children will be protected from wholly inappropriate content.
Caroline Voaden (South Devon) (LD)
I rise to speak to new clauses 1 and 11, and briefly to new clause 2. The Liberal Democrats believe that the Government have missed a trick by not including in this Bill stronger provisions on children’s online safety. It is time for us to start treating the mental health issues arising from social media use and phone addiction as a public health crisis, and to act accordingly.
We know that children as young as nine and 10 are watching hardcore, violent pornography. By the time they are in their teens, it has become so normalised that they think violent sexual acts such as choking are normal—it certainly was not when we were teenagers. Girls are starving themselves to achieve an unrealistic body image because their reality is warped by airbrushed images, and kids who are struggling socially are sucked in by content promoting self-harm and even suicide. One constituent told me, “I set up a TikTok account as a 13-year-old to test the horrors, and half a day later had self-harm content dominating on the feed. I did not search for it; it found me. What kind of hell is this? It is time we gave our children back their childhood.”
New clause 1 would help to address the addictive nature of endless content that reels children in and keeps them hooked. It would raise the minimum age for social media data processing from 13 to 16 right now, meaning that social media companies would not be able to process children’s data for algorithmic purposes. They would still be able to access social media to connect with friends and access relevant services, which is important, but the new clause would retain exceptions for health and educational purposes, so that children who were seeking help could still find it.
We know that there is a correlation between greater social media use among young people since 2012 and worsening mental health outcomes. Teachers tell me regularly that children are struggling to concentrate and stay awake because of lack of sleep. Some are literally addicted to their phones, with 23% of 13-year-old girls in the UK displaying problematic social media use. The evidence is before us. It is time to act now—not in 18 months and not in a couple of years. The addictive nature of the algorithm is pernicious, and as legislators we can do something about it by agreeing to this new clause 1.
It is time to go further. This Bill does not do it, but it is time that we devised legislation to save the next generation of teenagers from the horrors of online harm. Ofcom’s new children’s code provides hope that someone ticking a box to say they are an adult will no longer be enough to allow access to adult sites. That is a good place to start; let us hope it works. If it does not, we need to take quick and robust action to move further with legislation.
Given the nature of the harms that exist online, I also support new clause 11 and strongly urge the Government to support it. No parent should have to go through the agony experienced by Ellen Roome. Losing a child is horrific enough, but being refused access to her son’s social media data to find out why he died was a second unacceptable agony. That must be changed, and all ISPs should be compelled to comply. New clause 11 would make that happen. I heard what the Minister said about coroners, but I strongly believe that legislation is needed, with a requirement to release data or provide access to their children’s account for any parent or guardian of someone under 18 who has died. There is, as far as I can see, no reason not to support this new clause.
Briefly, I echo calls from across the House to support new clause 2 in support of our creatives. Creativity is a uniquely human endeavour. Like others, I have been contacted by many creators who do not want their output stolen by AI companies without consent or permission. It is vital that AI companies comply with copyright legislation, which clearly has to be updated to meet the requirements of the brave new world of tech that we now live in.