Children’s Wellbeing and Schools Bill Debate

Department: Department for Education


Lord Bethell Excerpts
Wednesday 21st January 2026


Lords Chamber
What Australia did was a political decision. It made a choice and is sticking to it. We could choose to make the OSA what it was supposed to be, and that is what I would like to see. Everybody in this House should suck it up and vote because we have to make it clear that this is not what we were promised and that this is not good enough.
Lord Bethell (Con)

My Lords, it is such a pleasure to follow the noble Baroness, Lady Kidron. She is absolutely right. We have a choice today on whether we will send an amendment to the Commons to put pressure on the Government to act or whether we are going to flunk this opportunity. I support her conclusion that this is a moment when the House of Lords does need to act.

I pay tribute to my noble friend Lady Penn for her Amendment 91, which is thoughtful, patiently put and important. I hope very much indeed that the Government pay attention to her notes on timetable and that, if necessary, she presses this point so that she gets what she needs.

I want to address Amendment 94A, tabled by my noble friend Lord Nash, and pay tribute to the noble Baronesses, Lady Cass, Lady Berger and Lady Benjamin, all of whom have made an enormous impact on this. Guardrails for our children are where we have landed. I say this with some regret, but it is important that we recognise this point.

The noble Baroness, Lady Cass, mentioned a meeting with the royal colleges. As a former Health Minister who has had many dealings with the herd of cats that are the royal colleges, I say that if they unify and say that there is a public health emergency, we should pay attention to that moment. We should not be brushed off by attempts to knock this into the long grass via public consultations. We should listen to our clinicians. Dr Rebecca Foljambe and the clinicians against smartphones have done conclusive research on the harms done by screen time, by predators, by fraudsters, by the filth on the internet and by the sheer quantity of screen time that our children are subjected to. It is an utterly persuasive argument. Further research is not needed.

In fact, a consultation is the tobacco industry playbook, applied to smartphones for adolescents. Delay, consult, lobby, weaken—we know this playbook very well. We do not need a “get out of jail” clause for the tech companies; we need implementation. This is our opportunity to do it.

Like many others in this Chamber, I worked really hard on the Online Safety Act to make it a success. It is a landmark piece of legislation. I am extremely proud of bits of it. But it assumed that we could work with the platforms to moderate their algorithms, to remove the filth, to prevent the predators, and to limit the screen time. It assumed that we were working in some kind of collaborative partnership with Facebook, Google, TikTok, Meta, Snapchat, Twitter and all the other social media companies to protect children and work towards some kind of better world.

That was a catastrophic misjudgment about the nature of these companies and of their leadership. The outcomes for our children, which have gone significantly backwards in the last two years, are testimony to that point. That damage done to our children is accelerating, with the tsunami of AI that is heading their way. The platforms have not reformed. They have not taken the bait. Instead, they have taken the mick. They are introducing artificial intelligence and totally inappropriate chatbots to our children.

The risk assessments that are an absolutely essential building block of the Ofcom regime and the Online Safety Act are an absolute insult to the intelligence of the regulators and of parents. How on earth did a risk assessment ever assess Grok’s new AI tools as being safe for children? It is a complete joke. The noble Lords, Lord Knight and Lord Clement-Jones, say that these platforms can be moderated, that they can be brought to heel, and that they will abide by the regulator’s will. But we have to remember that they will not change, because around 25% of their clicks—the page views—come from the children they are targeting, and they are far too reliant on mis-selling those children’s eyeballs as adult eyeballs to advertisers. You cannot regulate far-off tech titans who are reliant on that income. You can only create perimeters in which they can hunt their profits, and that is exactly what the amendment seeks to do.

I recognise that there are sensible, respected voices who take a completely different view. Noble Lords have rightly paid tribute to the Molly Rose Foundation. I know that Ian Russell, and the NSPCC and other charities, have argued that instead of guardrails we should strengthen the Online Safety Act. They say that we should mandate well-being by design requirements, and Ian Russell has said that we should require platforms to prioritise child well-being in algorithmic design, and that age verification creates

“a false sense of security”.

I just do not think that is right. This argument rests on a false premise that we can somehow design our way out of the problem while keeping children on platforms whose entire business model depends on their exploitation. That cannot happen. You cannot algorithmically mitigate something that is not a design problem but a business model problem. The algorithm is not broken; it is doing exactly what it was designed to do: maximise engagement, keep eyes on the screen, and amplify provocative content, because provocative content keeps people clicking, including our children. This is not a market failure; this is a market working as designed by the companies that have monetised our children’s childhood as a commodity.

There are other noble Peers, such as the noble Lord, Lord Clement-Jones, who are flying a kite on the possibility of some kind of film-style certification system. I share that dream. What a wonderful world it would be to live in. I lived in that world for many years. Before coming here, I was the strategy director of Capital Radio, in much-loved local radio, which, in the 1990s, was a warm and loving place to work and operate in. For every single local radio station, the Government had a licence which dictated exactly who they could broadcast to and what content they had on their radio station. If you breached that licence, they pulled it and gave it to someone else. It meant that local radio was extremely compliant with the licence details. Our broadcasting was a warm and lovely thing that was safe for children.

It is completely unrealistic that we are going to appoint something like a modern Radio Authority that will issue licences for every single website in the world and in some way oversee what our children access on those websites. How many bureaucrats would be needed to look at all those licences? How many armies of enforcement agents would be needed to issue the fines? How could we possibly read all that content?