Online Harm: Child Protection Debate
Emily Darlington (Labour, Milton Keynes Central), debate with the Department for Science, Innovation & Technology
Commons Chamber
Emily Darlington (Milton Keynes Central) (Lab)
This week is Eating Disorders Awareness Week, and we must remember the acceleration of online harms. We have heard horrific accounts of ChatGPT giving young people diets of 600 calories per day, which is just appalling. We know the suffering and pain caused by seeing images tagged with the terms “ana”, “thinspiration” and other terms that should be removed. The promotion of such content is now a category 1 offence, and Ofcom should be weeding it out. The hon. Member for Winchester (Dr Chambers) is absolutely right to say that that measure should be extended to bots.
I thank the Chair of the Science, Innovation and Technology Committee, my hon. Friend the Member for Newcastle upon Tyne Central and West (Dame Chi Onwurah), for her fantastic speech. We have taken this matter seriously since the very beginning of the parliamentary Session, and we have done a lot of work on it. I echo her call for Ministers to look again at the recommendations in our Committee’s “Social media, misinformation and harmful algorithms” report, which goes well beyond misinformation and into how the damage is done.
Protecting our children and young people online is extremely important. The Online Safety Act was an important step forward, but it has not been fully implemented by Ofcom, it is not proactive enough, and it is too dependent on what social media companies themselves tell Ofcom. In the spirit of consultation—I know that we will get to that—I have done my own consultation with 500-plus 14 to 16-year-olds across my Milton Keynes Central constituency. Some 91% of them have a phone, and 80% have social media profiles. However, what will surprise the House is what young people consider social media profiles to be. We consider them to be Facebook or Instagram, while they consider them to be YouTube and Roblox—two platforms not covered by the Australian model. Additionally, 74% of those 14 to 16-year-olds spend two to seven hours online a day. Let me remind the House that, at that age, the brain development of young women is close to finished, while for young men, whose brain development does not finish until they are about 25, it is nowhere near complete. We know that from the science—just to be clear, that is not an opinion. Brain development in young women and girls happens differently, so should we have different rules for young women and men?
Fifty-nine per cent of the 14 to 16-year-olds have been contacted by strangers, and more than a third of those contacts were through Roblox, which is not covered by the Australian social media ban. Thirty-three per cent have been bullied, and a third of those cases were on Roblox. The Australian social media ban—which I assume is what the Liberal Democrats are talking about when they say they are in favour of a ban—does not cover YouTube or Roblox, and we have not even looked at whether it is effective. A ban is a blunt tool that essentially raises the flag of surrender to social media platforms and declares that there is no way of making social media safe. That is essentially what the Conservatives did when the Online Safety Act 2023 was passed: they said, “We cannot go far enough, so we are going to roll back. It is about free speech.” No, it is not about free speech. Freedom of speech was written into law in this country and spread around the world, so we understand how to protect it and limit its harm. The Online Safety Act was a missed opportunity. It also took seven years to get through this House, but we do not have seven years to wait.
There would also be unintended consequences to a ban. I had the pleasure of meeting Ian Russell the other night, and we had a really powerful discussion. My heart goes out to him, as one parent to another, given what his family have been through. He does not jump to the easy solution of a social media ban. The Molly Rose Foundation has done a brilliant briefing paper, which every MP should read, about why it does not support a ban: it wants the online world to be safe for children, but a ban does not make it so.
My hon. Friend is making an excellent speech. I commend her work in reaching out to young people; it sounds superb. The lesson may be that we should all do exactly that. I am running a survey myself. She mentioned the Molly Rose Foundation, and I have met some of its staff to discuss its work. A family in my constituency of Reading suffered a terrible incident—their son was murdered in a case involving online bullying—and they have a different view. Does my hon. Friend agree that it is important that we properly listen to the families and consider the different views in the consultation?
Emily Darlington
I absolutely do. My full sympathy goes to that family in my hon. Friend’s constituency—it is the worst thing in the world for a parent to lose a child. But we have to get this right, which is why it is right that we have a consultation. It does no child any good if we jump to a conclusion that does not actually protect children.
Although I maintain an open mind, I worry about a full ban. Some children rely on social media for connection, often including those who are exploring their sexuality—LGBTQ+ people—and those who are neurodivergent. The consequences for them could be devastating, so we need to consider their views. If young people get around the ban, as they do in Australia, they are less likely to report when they see harmful content or are being targeted on social media, because they worry that they will get in trouble for breaking the law.
A ban would create a cliff edge at 16. No matter the person’s maturity—I have already talked about the different brain development in young women and men—their skills or what they have been taught, there is a cut-off at 16. All of a sudden it does not matter, and they go into a world that is not safe. Younger children do not have their own social media profiles; they use their parents’ devices. Often, they start with a video of Peppa Pig, and then—who knows where it ends up? A ban would not address that. So, what is the solution? Doing nothing is not an option—I think the whole House can agree on that.
Monica Harding
I was interested in the hon. Member’s survey. I have done my own very unscientific survey of young people, and all of them seem to want some form of regulation. With that in mind, we must hurry up—does the hon. Member agree?
Emily Darlington
I absolutely agree. Young people, particularly those in the mid-teenage years, understand this issue in a way that sometimes we do not because, quite frankly, our online experience is completely different from theirs. If Members want to test that, they should open an app such as Pinterest and compare what is fed into their Pinterest boards with their child’s Pinterest boards. It is a completely different experience. If Members do not have children, they should ask a younger member of staff to open the same app on the different phones, and they will see a completely different world.
A local organisation in my constituency, CyberSafe Scotland, surveyed children about what they were being fed on TikTok. There is a road in my constituency called North Anderson Drive, and children on one side of North Anderson Drive were being fed different content to the children on the other side of it. It is not just an age thing; it is really specific, and we cannot understand what each individual person is seeing because it is different for everybody.
Emily Darlington
That is a very important point about how sophisticated the technology has become. When we ask companies to take action to stop harmful outcomes, the technology exists to do that. We are not asking them to reinvent the wheel or come up with new technology. It already exists, because they are already microtargeting two different sides of the same road.
Having discussed this with experts, parents and—most importantly—young people, what do I think we need to consider? First, we need to fully and properly implement the Online Safety Act 2023. That must be done at speed, and it requires nothing from the House. It has been a request of the Secretary of State and the Minister, and I recommend that Ofcom gets on and does that as quickly as possible. We must make safe spaces for children online. How do we do that? Part of the answer is ensuring that content carries ratings that we already understand as parents, such as those from the British Board of Film Classification. I have been asking YouTube what rating YouTube Kids has for about a year now. Is it rated U? Is it 12A? Is it 15? It cannot tell me because it does not do things on that basis.
As a parent I want to know the rating before allowing my children on an app, because parents have a role in this as well. All apps should be rated like video games. Roblox has a 5+ rating, which does not exist in video game ratings. We see app ratings such as 4+ or 9+, but those are made up. At the parents’ forum that I held after the survey, one parent said that she walked in on her nine-year-old playing “guns versus knives”—on an app that is rated 5+. The ratings on apps mean nothing, yet we have video game ratings that we as parents understand, so why are they not used? Should in-app purchases ever be allowed for young children? What is the age at which in-app purchases should be allowed in a game?
We must consider time limits for the different stages of brain development. We have guides on fruit and vegetables that recommend five a day to parents. We all know that. Schools and parents use the same language, yet we have nothing to support parents in deciding how long a child should be online at different stages of brain development. I hope that the evidence that the Science, Innovation and Technology Committee collects will help inform that.
We need to change addictive and radicalising platform algorithms. To protect children from child sexual abuse images, we need to talk to those behind iOS and Android to stop the creation of self-generated child sexual abuse images—some 70% to 80% of child sexual abuse images are self-generated—and we need to stop end-to-end encrypted services from being used to share them. We have technology that can do that. We should always keep the ability to ban in our pockets, but any ban should be for particular apps. We should not ban our children and young people from having an online experience that is good.