Online Safety Act 2023: Repeal Debate
Tom Hayes (Labour - Bournemouth East) — debate with the Department for Digital, Culture, Media & Sport
Westminster Hall
Tom Collins
I thank my hon. Friend for his remark. He is entirely right. In my own experience of engineering safety-critical products, it was incumbent upon us to be fully open about everything we had done with those regulating and certifying our products for approval. We had numerous patents on our technology, which was new and emerging and had immense potential and value, yet we were utterly open with those notified bodies to ensure that our products were safe.
Similarly, I was fortunate enough to be able to convene industry to share the key safety insights that we were discovering early on to make sure that no mistake was ever repeated, and that the whole industry was able to innovate and develop in a safe way. I thank my hon. Friend the Member for Rugby (John Slinger) for his comments, and I strongly agree that there is no excuse for a lack of openness when it comes to safety.
How do we move forward? The first step is to start breaking down the problem. I have found it helpful to describe it in four broad categories. First, there are hazards that apply to the individual simply through exposure: content such as pornography, violence and images of or about abuse. Then there are hazards that apply to an individual by virtue of interaction, such as addictive user interfaces or personified GPTs. We cannot begin to comprehend the potential psychological harms that could come to human beings when we start to promote attachment with machines. There is no way we can have evidence to inform how safe or harmful that would be, but I suggest that all the knowledge that exists in the psychology and psychiatric communities would probably point to it being extremely risky and dangerous.
We have discussed recommendation algorithms at length. There are also societal harms that affect us collectively by exposure. These harms could be misinformation or echo chambers, for example. The echo chambers of opinion have now expanded to become echo chambers of reality in which people’s worldviews are increasingly being informed by what they see in those spaces, which are highly customised to their existing biases.
Tom Hayes (Bournemouth East) (Lab)
I have met constituents to understand their concerns and ambitions in relation to online safety legislation. There is a clear need to balance the protection of vulnerable users against serious online harms with the need to protect lawful speech as we pragmatically review and implement the Act.
My hon. Friend talks about equipping our younger people, in particular, with the skills to scrutinise what is real or fake. Does he agree that, although we have online safety within the national curriculum, we need to support our teachers to provide consistent teaching in schools across our country so that our children have the skills to think critically about online safety, in the same way as they do about road safety, relationships or consent? [Interruption.]
Before we continue, could I ask that everybody has their phone on silent, please?
Emily Darlington
I agree; my hon. Friend makes a very important point about the slander that happens online, the lack of basis in reality and the lack of ability to address it. If somebody posts something about someone else that is untrue, platforms will not take it down; they will say, “It doesn’t breach our terms and conditions.” Somebody could post that I am actually purple and have pink eyes. I would say, “I don’t want you to say that,” and the platform would say, “But there’s nothing offensive about it.” I would say, “But it’s not me.” The thing is that this is happening in much more offensive ways.
My hon. Friend the Member for Oldham West, Chadderton and Royton (Jim McMahon) made the point that what happens online is then repeated offline. We have even seen deaths when children try to replicate the challenges that they see being set online. With AI-generated material, those challenges often are not real. It is the equivalent of somebody trying to repeat magic tricks and dying as a result, which is quite worrying.
The Online Safety Act is not perfect; it needs to go further. The petitioner has made a really important point. The distinction between small but non-harmful sites and small but harmful sites is poorly defined, and it is really important that the Act provides some clarity on that.
We do not have enough protections for democracy. The Science, Innovation and Technology Committee, of which I am a member, produced a really important report on misinformation and how it led to the riots two summers ago. Misinformation was used as a rallying cry to create unrest across our country of a sort that we had not seen in a very long time. The response from the social media companies was variable; it went from kind of "meh" to really awful. The platforms say, "We don't police our content. We're just a platform." That is naive in the extreme. Quite frankly, they are happy to make money off us, so they should also know that they have to protect us—their customers—just as any other company does, as my hon. Friend the Member for Oldham West, Chadderton and Royton said.
The radicalisation that is happening online is actually shifting the Overton window; we are seeing a more divided country. There is a fantastic book called “Man Up”—it is very academic, but it shows the rise of misogyny leading to the rise of every other form of extremism and how that links back to the online world. If this was all about Islam, this House would be outraged, but because it starts with misogyny, it goes down with a fizzle, and too often people in this House say, “This is all about free speech.” We know that misogyny is the first step on a ladder of radicalisation that leads people to violence—whether into violence against women or further into antisemitism, anti-Islam, anti-anybody who is not the same colour, or anti-anybody who is perceived not to be English from Norman times.
The algorithms promote violent and shocking content, but they also shadow-ban really important content, such as information on women's health. Platforms are happy to shadow-ban terms such as "endometriosis" and "tampon"—and God forbid that a tampon commercial should feature red liquid, rather than blue liquid. That content gets shadow-banned and is regularly taken down and taken out of the algorithms, yet the platforms say they can do nothing about people threatening to rape and harm. That is not true; they can, and they choose not to. The public agree that algorithms must be part of the solution; 78% of British parents want to see action on algorithms. My hon. Friends are right that the Online Safety Act and Ofcom could do that, yet they have not done so—they have yet to create transparency in algorithms, which was the Select Committee's No. 1 recommendation.
[Sir John Hayes in the Chair]
Finally, I want to talk about a few other areas in which we need to move very quickly: deepfakes and AI nudifying apps. We have already seen an example of how deepfakes are being used in British democracy: a deepfake was made of the hon. Member for Mid Norfolk (George Freeman) saying that he is moving from the Conservatives to Reform. It is a very convincing three-minute video. Facebook still refuses to take it down because it does not breach its terms. This should be a warning to us all about how individuals, state actors and non-state actors can impact our local democracy by creating deepfakes of any one of us that we cannot get taken down.
Tom Hayes
We heard today from the MI6 chief, who talked about how Russia is seeking to “export chaos” into western democracies and said that the UK is one of the most targeted. Does my hon. Friend agree that we need online safety, because it is our national security too, and that as we face the rising threat from Putin and the Kremlin, we need as a country to be secure in the air, at sea, on land and in the digital space?
Emily Darlington
I absolutely agree with my hon. Friend. They seek to promote chaos and the destruction of British values, and we need to fight that and protect those values.
AI nudifying apps, which did not even exist when the Online Safety Act came in, need a very fast response. We know that deepfakes and AI nudifying apps are being used overwhelmingly against classmates and colleagues. Think about how it destroys a 13-year-old girl to have a fake nude photo of her passed around. The abuse that we politicians and many others receive from fake and anonymous accounts needs to be addressed. Seventy-one per cent of British people consider this to be a problem, and we need to take action. AI chatbots are another thing that was not foreseen in the development of the Online Safety Act, and therefore it is far behind on them, too.
The Online Safety Act is in no way perfect, but it is a good step forward. We must learn the lessons of its implementation to go further and faster, and listen to British parents across the country who want the Government’s help to protect our children online—and we as a Government must also protect our democracy online.