My Lords, I support many of the amendments in this group, but I also want to express my concerns about Amendment 94A in the name of the noble Lord, Lord Nash. I have listened carefully to his arguments and those of other noble Lords who support the amendment. I too am appalled by the many stories that we have heard. I too want to stop children being exposed to harms online. I hope my record in the debates on the Online Safety Act and other digital legislation shows my support for measures to increase safety for children in the digital space. I, like all noble Lords, recognise that there are many harms online.
However, I do not think that an outright ban on social media for the under-16s will be effective in protecting children. I hesitate to disagree with my noble friend Lady Kidron, with whom I almost always agree, but we should put all the pressure we possibly can directly on Ofcom to make sure we realise the hopes and dreams of the Online Safety Act.
Early this morning, I had an interesting conversation with Jason Trethowan, who runs headspace, Australia’s national youth mental health charity, which last year was accessed by 170,000 young people aged between 12 and 25 in 170 locations across the country. His organisation is at the sharp end of the social media ban in Australia. His main message was that we all want to stop online harms to children, but he called on the Government and legislators to listen to children as well as parents.
The noble Lord, Lord Nash, has quite rightly highlighted the harms that exist for young people on social media. However, noble Lords also need to be aware of the crucial role that social media plays for young people in communicating with each other, getting information about the world and, very importantly, getting help and advice from like-minded people.
Jason said that we all need to understand that young people see the online world as their world. It is a central part of their existence, and no amount of bans will remove them from the online space. headspace told me that the ban in Australia, which started on 10 December 2025, was a massive shock for many young people. They had been warned of its arrival for months but still were not prepared for the severing of their contacts on social media. Most did not have the phone numbers to continue communicating with their contacts and suddenly found themselves isolated from their peer groups. Many noble Lords will dismiss these severances as youthful folly, but the charity told me that, of the 3,000 young people it has seen since the ban was introduced, 10% cited the social media ban among the reasons for their deteriorating mental health.
One young person on an isolated farm in rural Australia had used an LGBT group on social media to find like-minded young people. He lived in a household he regarded as homophobic, and was geographically far away from many of his online contacts. Suddenly, he found his support network taken away from him. The schools in Australia are on their summer break until the end of this month, so the full extent of the disruption to the lives of young people is not known.
The young LGBT person will not be able to renew his social media contacts, but rest assured he will find advice somewhere else on the internet. Young people who are banned from social media will find other ways online to assuage their appetites for communication, information and problem-solving.
In Australia, headspace is already seeing this happening. Young people who can no longer use the 10 major sites, which include Snapchat, X, YouTube, Instagram and Kik, are now migrating to AI sites. Noble Lords have already debated concerns about AI as a means of gathering information. Many will be aware of what the West Coast techies call “hallucinations”—the rest of us call them “lies”—appearing in AI research.
Young people are using AI to resolve their problems. On 27 November last year, this House had a debate about banning AI companions, which many young people use for advice. They can be dangerous—my noble friend Lady Kidron described how one young man was driven to take his own life on the advice of an AI companion. Surely, noble Lords do not want to encourage young people to use these AI replacements for social media.
The tech companies will feed that appetite. I know that Amendment 94A builds in flexibility over which apps are covered. However, in Australia they have found that new platforms open all the time. The Australian Government’s original Act banned 10 social media platforms, but they have already had to draw up a further list of platforms to ban. This is a game of whack-a-mole, just as the noble Lord, Lord Clement-Jones, said. It will not be solved by a ban on social media platforms. The media will always outpace the legislation.
There are so many harms online, on social media and other platforms. We all agree on that. I have spoken to the charities that have been mentioned many times by noble Lords—the Molly Rose Foundation, Internet Matters, NSPCC and the Online Safety Act Network. They have all championed the development of online safety for children, as noble Lords have already mentioned, and all are against a blanket ban on social media for under-16s in this country. They suggest that instead of banning social media, the Online Safety Act should be amended. I know that my noble friend Lady Kidron has said that that is not possible to do.
I am sorry but the noble Viscount is misreading what I said. I said exactly that.
I apologise. They suggest that the Act should be amended to ensure safety by design for all users, particularly young users.
There is a need to strengthen Ofcom’s response to tech platforms that breach their risk assessments. Ofcom needs to put the onus on the platforms to mitigate the risks, instead of defining the mitigation measures itself and taking action only when there is evidence that those measures actually work. This needs to be combined with a clear definition of “safety by design”.
I partially support Amendment 108 in the name of the noble Lord, Lord Storey. Children’s safety charities have long been calling for age-appropriate content requirements to be introduced on social media and across the internet. However, age-appropriate design should be introduced not just for 18-year-olds but for 16-year-olds and even 13-year-olds.
I completely support Amendment 109. I am glad the Government are having a consultation on this issue. I sincerely hope that noble Lords are wrong in saying that this is an attempt to kick the issue down the road. Addiction is a real problem: it is driven by the economics of engagement, and it needs to be dealt with.
I support the call for Ofcom to revisit its interpretation of the Online Safety Act so that it includes addictive design among the harms that platforms must mitigate. I understand the powerful instinct of noble Lords and many parents to ban social media for under-16s, but I ask them to consider that young people will not be torn away from life online. It will not be possible to force them to leave the digital world, however much a majority of adults want that to happen.
My Lords, I thank my noble friend Lady Kidron and the noble Viscount, Lord Camrose, for adding their signatures to my Amendment 14. I withdrew this amendment in Committee, but I am now asking the Minister to consider once again the definition of “scientific research” in the Bill. If he cannot satisfy me in his speech this evening, I will seek the opinion of the House.
I have been worried about the safeguards for defining scientific research since the Bill was published. This amendment will require that the research be in “the public interest”, which I am sure most noble Lords will agree is a laudable aim and an important safeguard. This amendment has been looked at in the context of the Government’s recent announcements on turning this country into an AI superpower. I am very much a supporter of this endeavour, but across the country there are many people who are worried about the need to set up safeguards for their data. They fear data safety is threatened by this explosion of AI and its inexorable development by the big tech companies. This amendment will go some way to building public trust in the AI revolution.
The vision of Donald Trump surrounded at his inauguration yesterday by tech billionaires, most of whom have until recently been Democrats, puts the fear of God into me. I fear their companies are coming for our data. We have some of the best data in the world, and it needs to be safeguarded. The AI companies are spending billions of dollars developing their foundation models, and they are beholden to their shareholders to minimise the cost of developing these models.
Clause 67 gives a huge fillip to the scientific research community. It exempts research that falls within the Bill’s definition of scientific research from having to gain new consent from data subjects to reuse millions of data points.
It costs time and money for the tech companies to get renewed consent from data holders before reusing their data. This is an issue we will discuss further when we debate amendments on the scraping of data from creatives without copyright licensing. It is clear from our debates in Committee that many noble Lords fear that AI companies will do what they can to avoid either getting consent for or licensing the data they scrape. Defining their research as scientific will allow them to escape these constraints. I could not be a greater supporter of the wonderful scientific research that is carried out in this country, but I want the Bill to ensure that it really is scientific research and not AI development camouflaged as scientific research.
The line between product development and scientific research is often blurred. Many developers present efforts to increase model capabilities or efficiency, or indeed to study their risks, as scientific research. A balance has to be struck between allowing this country to become an AI superpower and exploiting its data subjects. I contend that this amendment will go far to allay public fears about the use and abuse of their data to further the profits and goals of huge AI companies, most of which are based in the United States.
To understand the levels of concern, noble Lords have only to look at the outrage last year at Meta’s use of Instagram users’ data, without their consent, to train the datasets for its new Llama AI model. There were complaints to regulators, and the ICO posted that Meta
“responded to our request to pause and review plans to use Facebook and Instagram user data to train generative AI”.
However, so far, there has been no official change to Meta’s privacy policy that would legally bind it to stop processing data without consent for the development of its AI technologies, and the ICO has not issued a binding order to stop Meta’s plans to scrape users’ data to train its AI systems. Meanwhile, Meta has resumed reusing subjects’ data without their consent.
I thank the Minister for meeting me and talking through Amendment 14. I understand his concern that a public interest threshold in the definition of scientific research would create a heavy burden on researchers, but I think it is worth the risk in the name of safety. Some noble Lords are concerned about the difficulty of defining “public interest”. However, the ICO has very clear guidelines about what the public interest consists of. It states that
“you should broadly interpret public interest in the research context to include any clear and positive public benefit likely to arise from that research”.
It continues:
“The public interest covers a wide range of values and principles about the public good, or what is in society’s best interests. In making the case that your research is in the public interest, it is not enough to point to your own private interests”.
The guidance even includes further examples of research in the public interest, such as
“the advancement of academic knowledge in a given field … the preservation of art, culture and knowledge for the enrichment of society … or … the provision of more efficient or more effective products and services for the public”.
This guidance is already being applied in the Bill to sensitive data and public health data. I contend that if these carefully thought-through guidelines are good enough for health data, they should be good enough for all scientific data.
This view is supported in the EU, where
“the special data protection regime for scientific research is understood to apply where … the research is carried out with the aim of growing society’s collective knowledge and wellbeing, as opposed to serving primarily one or several private interests.”
The Minister will tell the House that data exempted for use in scientific research is well protected—that it must pass both the lawfulness test, as set out in the UK GDPR, and a reasonableness test. I am concerned that the reasonableness test in this Bill refers to
“processing for the purposes of any research that can reasonably be described as scientific, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity”.
Normally, a reasonableness test requires an expert in the context of that research to decide whether it is reasonable to consider it scientific. However, in this Bill, “reasonable” just means that an ordinary person in the street can decide whether it is reasonable for the research to be considered scientific. This must broaden the threshold of the definition.
It seems “reasonable” in the current climate to ask the Government to include a public interest test before giving the AI companies extensive scope to reuse our data, without getting renewed consent, on the pretext that the work is for scientific research. In the light of possible deregulation of the sector by the new regime in America, it is incumbent on this country to ensure that our scientific research is dynamic but safe. If the Government can provide this reassurance, they will increase trust in Britain’s AI revolution for millions of people in this country. I beg to move.
My Lords, I support my noble friend Lord Colville. He has made an excellent argument, and I ask noble Lords on the Government Benches to think about it very carefully. If it is good enough for health data, it is good enough for the rest of science. In the interest of time, I will give an example of one of the issues, rather than repeat the excellent argument made by my noble friend.
In Committee, I asked the Government three times whether the cover of scientific research could be used, for example, to market-test ways to hack human responses to dopamine in order to keep children online. In the Minister’s letter, written during Committee, she could not say that the A/B testing of millions of children to make services more sticky—that is, more addictive—would not be considered scientific, but rather that the regulator, the ICO, could decide on a case-by-case basis. That is not good enough.
There is no greater argument for my noble friend Lord Colville’s amendment than the fact that the Government are unable to say whether hacking children’s attention for commercial gain is scientific. We will come to children and child protection in the Bill in the next group, but it is alarming that the Government feel able to put in writing that this is an open question. That is not what Labour believed in opposition, and it is beyond disappointing that, now in government, Labour has forgotten what it then believed. I will be following my noble friend through the Lobby.