Children’s Wellbeing and Schools Bill

Baroness Harding of Winscombe Excerpts
Wednesday 21st January 2026

Lords Chamber
Australia’s e-safety commissioner has appointed an advisory group of 11 distinguished experts—led by Stanford University’s Social Media Lab, and including from this country Professor Orben from Cambridge University and Professor Etchells from Bath Spa University—to provide a robust and transparent evaluation of the outcomes of the social media ban in Australia for under-16s. In her reply, will the Minister undertake to report back, within 12 months of the Children’s Wellbeing and Schools Bill being passed, with an expert analysis of the impact of the Australian ban on key mental health statistics? I do not believe that we should introduce a ban in this country without some impartial analysis of the data being collected in Australia, even if it is preliminary. Amendment 94A makes no provision for an analysis, within 12 months, of the most relevant evidence that we will have, and I therefore cannot support it.
Baroness Harding of Winscombe (Con)

My Lords, I will also endeavour to be brief. Like many who have spoken already, I spent a very large amount of time on the Online Safety Act. I agree entirely with the comments of the noble Lords, Lord Knight and Lord Russell, and the noble Baroness, Lady Kidron. This is a cry of pain and anger from this House that I hope the Minister is hearing, but I do not think that banning social media for under-16s is the right thing to do. I will add two reasons that have not been discussed so far.

First, I worry that absolutely nothing will change by implementing a ban. We already have a minimum age of 13; go into any primary school and you will find how effective that is. I urge the Minister to tell us how she is going to implement the minimum age we already have. How is she going to stiffen Ofcom’s backbone to hold tech companies to account? Otherwise, we can legislate all we like, but it will not make any difference.

Secondly, I have huge respect for the eloquence with which my noble friend Lord Nash set out the horrors and harms that social media is undoubtedly doing, but there is one flaw in his argument. He quoted a lot of research that points to the harm that excessive use of social media does to children. A ban, however, means zero use, which is not what that research examined. We must be very careful about that. Social media is part of the modern world; it brings good as well as ill, and simply to ban it is to abdicate responsibility.

I worry hugely that we are letting the tech companies off the hook. We have to hold them to account to produce products that are age appropriate. We have done that with every other technology as it has grown up over the centuries, and we should not duck the issue now. That takes me to the right reverend Prelate’s point, which seems like quite a long time ago. I am in the same dilemma, because I am absolutely certain that change has to happen, that the Online Safety Act is not working as those of us who worked so hard on it envisaged, and that Ofcom is not delivering. I doubt that more consultation solves that problem. But I am worried about passing this baton back to our colleagues in the other place. I am worried because a ban on social media has a nice ring to it. I am worried when I hear Ian Russell say that we must not use our children as a political football. We must really work out what the right answer to this problem is.

I ask the Minister to listen to this emotional debate. Those of us who worked on the Online Safety Act can see that there are perhaps a hundred times as many people in this Chamber now as there were at any stage of that Act. That shows how much we all care about it now—not just that everyone is waiting for a vote. I ask the Minister please to hear the concern, the fury and the need to act. But, my goodness, if we send this back to the other place, I hope it will not translate into a blanket ban on social media for under-16s but into proper action to make the internet a better place for our children.

Children’s Wellbeing and Schools Bill

Baroness Harding of Winscombe Excerpts
Thursday 18th September 2025

Lords Chamber
Lord Tarassenko (CB)

My Lords, I support Amendments 493, 494, 502K and 502YI, as someone with an interest in the use of educational technologies, including AI, both in schools and universities. I declare my interest as chair of the Maths Horizons project, funded by XTX Markets, which earlier this year reviewed the maths curriculum in England from five to 18, and briefly investigated the use of edtech to support the teaching of the subject.

I speak as a supporter of the deployment of educational technology in the classroom as I believe it can and should have a positive impact on the education of children, and not just in maths. But this must be done within a framework which protects children from its misuse. We must balance innovation in education through edtech with appropriate regulation. The regulations listed in subsection (2) of the proposed new clause in Amendment 493 would support the adoption of edtech in our schools rather than hinder it.

In this context, what has happened with chatbots based on large language models is a salutary example of the early release of AI products without proper safeguards, especially with respect to their use by children. Tragically, this week the parents of the American teenager who recently took his own life after repeatedly sharing his intentions with ChatGPT told a Senate judiciary sub-committee investigating chatbot dangers:

“What began as a homework helper gradually turned itself into a confidant and then a suicide coach”.


Ironically, we are now told that OpenAI is building a ChatGPT for teenagers and plans to use age-prediction technology to help bar children under 18 from the standard version. Sam Altman, the CEO of OpenAI, wrote in a blog this week, just before the Senate hearings and before coming to this country, that AI chatbots are

“a new and powerful technology, and we believe minors need significant protection”.

The risks associated with the use of edtech may not be on the same scale, but they are nevertheless real. In many cases, edtech products used in schools rely extensively on the collection of children’s data, allowing it to be used for commercial and profiling purposes. The recent report from the 5Rights Foundation and the LSE, which has already been mentioned, highlights that some popular classroom AI apps track users with cookies from adult websites and may provide inaccurate and unhelpful information. Most worryingly, a popular app used for educational purposes in the UK generates emulated empathy through sentiment analysis and so increases the likelihood of children forming an emotional attachment to the app. I therefore support Amendments 493, 494 and 502K, which together would ensure that edtech products provide children with the higher standard of protection afforded by the ICO’s age-appropriate design code.

In addition to the safeguards introduced by these amendments, there is a need for research to establish whether educational technologies deliver better educational outcomes for children. Most edtech products lack independent evidence that they lead to improved outcomes. Indeed, some studies have shown that edtech products can promote repetitive or distracting experiences with minimal, if any, learning value. By contrast, there is a growing body of evidence on the positive side that edtech can effectively support vocabulary acquisition, grammar learning, and the development of reading and writing skills for students for whom English is a second language, particularly when these tools are used to complement a teacher’s instruction.

To establish a causal relationship between the use of an edtech tool and a specific learning outcome, we need to design randomised controlled trials—the gold standard for demonstrating the efficacy of interventions in the social or medical sciences. Longitudinal data will then be needed to track student usage, time on task and completion rates. Crucially, the trial must have enough participants to detect a meaningful effect if one exists. This is unlikely to be possible using the data from a single school, so data from several schools will need to be anonymised and then aggregated to obtain a statistically meaningful result.
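As a rough illustration of why a single school is unlikely to suffice, the minimal sketch below runs a conventional two-arm sample-size calculation in Python using the statsmodels library; the effect size and thresholds are illustrative assumptions, not figures proposed in the debate.

```python
# Illustrative sketch only: sample size for a hypothetical two-arm edtech trial,
# using conventional assumptions rather than any figures from the debate.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Assume a small standardised effect (Cohen's d = 0.2), 5% significance
# and 80% power -- common defaults in education research.
pupils_per_arm = analysis.solve_power(effect_size=0.2, alpha=0.05, power=0.8)

print(f"Pupils needed per arm: {pupils_per_arm:.0f}")      # roughly 394
print(f"Total pupils needed:   {2 * pupils_per_arm:.0f}")  # roughly 788
```

On those assumptions, nearly 800 pupils would be needed in total, which is why data from several schools would have to be anonymised and aggregated.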

I am satisfied that Amendments 502K and 502YI would allow this methodological approach to be followed. Indeed, subsection (4)(c) of the proposed new clause in Amendment 502K would ensure that the code of practice enabled the development of standards to certify evidence-based edtech products and support the testing of novel products. This would provide UK-based companies with the opportunity to innovate in edtech within an appropriate regulatory environment.

As English is the lingua franca of the digital world, there is the opportunity for the UK to become a leader in edtech innovation and certification, for the benefit of children not only in the UK but in many other countries. These amendments should be seen by the Department for Education not as an attempt to overregulate the edtech sector but instead as a mechanism for the promotion of existing evidence-based apps and the development of a new generation of products, some of which may be AI-facilitated, using—no pun intended—best-in-class trial methodology.

Baroness Harding of Winscombe (Con)

My Lords, I too support Amendments 493 and 494 in the name of my noble friend Lord Holmes, and Amendments 502K and 502YI in the name of the noble Baroness, Lady Kidron. I am not an educationalist and this is my first contribution on the Bill. I spend my time in this House focused mainly on digital issues, hence my interest in these amendments.

Like others today, I will start by being really clear that I am not anti-technology in education—quite the opposite. I see the huge potential that digital technology can bring to all sectors of our lives. That is particularly clear today, as our Prime Minister signs the tech prosperity deal. We should be open-eyed that technology brings the opportunity for prosperity; I am not against it at all. But it is also really clear that technology, and not just digital technology but every technology throughout history, needs guardrails, and those guardrails cannot be self-imposed.

Among those of us who have worked on child safety online for the past 10 or 15 years, many on this side of the House began by firmly believing that self-regulation was the answer. I am afraid that we have been proven absolutely wrong. There is no doubt that self-regulation in social media has been a disaster, and I fear that we are doing exactly the same with digital technology in education. Companies operating in this space need guardrails in order to develop the products that really will make a positive difference and to help us all mitigate the downsides that these technologies inherently have.

I am not a lover of adding regulation, so in each example of adding regulation in the digital space I ask myself a simple question: is this additional regulation an example of the Red Flag Act of 1865? For those who do not know, that was the wonderful piece of legislation that required a man—it had to be a man—to walk with a red flag in front of every non-horse-drawn vehicle. This was clearly a very bad piece of legislation, and it was repealed—it took 30 years, but it was repealed. So question number one is: is this piece of additional digital regulation a Red Flag Act that will prevent us gaining the benefit of the technology, or is it a seat belt?

The seat belt was patented in 1885, but it became mandatory to wear one in the back seat of a car, where children tend to sit, only in 1991. So, during that century, was the world better off, and was car development so much faster, because there were no mandatory seat belts on the back seat, or was it just that more children died? We have to ask ourselves, with every piece of regulation in the digital world: are we in danger of creating a red flag that slows down the development of the technology, or are we in danger of believing that regulation will slow down economic growth while in fact being culpable of allowing harm for decades or even centuries?

The problem with this Bill, and with these amendments, is that many of us have debated the issue many times before. The age-appropriate design code was created by the Data Protection Act 2018 and came into force in 2020. It expressly excludes technology in schools. I find it incomprehensible that, five years later, we are having to argue that it is wrong for children’s data to be less protected in school than it is at home. The Minister has noted previously that many of us have spoken on this topic before and have a track record in this area. The Government, when they were not the Government, very clearly supported expanding regulation into edtech. I hope that the Minister will hear the cross-party support for these amendments and work with us to put in statute the appropriate protection for children’s data and their use of technology when they are in education.

--- Later in debate ---
Baroness Smith of Malvern (Lab)

I hope that the noble Baroness will carefully read what I said. I was certainly not saying that. In my response, I have gone further in explaining the work that the department is doing to meet many of the concerns that she outlined than we have done previously. I am most certainly not saying that it will be done to the 2030 timetable. I understand her concern around regulation and accountability, and I have given some considerable steers, at the very least, about the direction in which that work is going—it is not to a 2030 timetable. Turning to—

Baroness Harding of Winscombe (Con)

Before the Minister moves on, I have a follow-up question. It is very encouraging to hear about the work that seems to be ongoing in the ICO. What is the Minister’s view on why it would not be appropriate to put the requirement for a code of conduct for education on the statute book, in the same way that the age-appropriate design code covers all other children’s data? Just to be clear, I value the fact that the Minister has been so open about the ongoing work, but those of us who have worked in this space for so long worry that things can change and that, without legal underpinning, codes can then disappear.

Baroness Smith of Malvern (Lab)

I understand that concern. Perhaps we can first make progress on the code, as I have outlined that we are doing. I will write to the noble Baroness about this. I understand that this place is about putting things into legislation, but that does not mean that activity is not happening. The proof of the pudding may well be in the production of the code.