Charlotte Nichols (Warrington North) (Lab)

It is a pleasure to serve under your chairship, Dame Angela. I did not make a note of the specific word I was on when we adjourned, so I hope Hansard colleagues will forgive me if the flow between what I said previously and what I say now is somewhat stilted.

I will keep this brief, because I was—purposefully—testing the patience of the Minister with some of my contributions. However, I did so to hammer home the fact that the removal of clauses 12 and 13 from the Bill is a fatal error. If the recommittal is not to undermine fundamentally what the Bill set out to do five years or so ago, their removal should urgently be reconsidered; we have spent those five years debating the Bill to get it to this point.

As I said, there are forms of harm that are not illegal, but they are none the less harmful, and they should be legislated for. They should be in the Bill, as should specific protections for adults, not just children. I therefore urge the Minister to keep clauses 12 and 13 in the Bill so that we do not undermine what it set out to do and all the work that has been done up to this point. Inexplicably, the Government are trying to undo that work at this late stage before the Bill becomes law.

Sarah Owen (Luton North) (Lab)

It is a pleasure to see you in the Chair, Dame Angela—I wish it was a toastier room. Let me add to the points that the shadow Minister, my hon. Friend the Member for Pontypridd, made so powerfully about vulnerable people. There is no cliff edge when such a person becomes 18. What thought have the Minister and the Department given to vulnerable young adults with learning disabilities or spectrum disorders? Frankly, the idea that, as soon as a person turns 18, they are magically no longer vulnerable is for the birds—particularly when it comes to eating disorders, suicide and self-harm.

Adults do not live in isolation, and they do not just live online. We have a duty of care to people. The perfect example is disinformation, particularly its harmful impact on public health. We saw that with the pandemic and vaccine misinformation; we saw it in the harm done to children by the anti-vaccine movement’s myths about vaccines, children and babies. Such disinformation causes far greater harm than just a conversation online.

People do not stay in one lane. Once people start being sucked into conspiracy myths—much as we discussed earlier with the algorithms used to keep people online—the content has to keep ramping up. Social media and tech companies do that very well; they know how to do it. That is why I might start looking for something to do with ramen recipes and, all of a sudden, I am on to a cat that has decided to make noodles. It always ramps up. That is the fun end of it, but at the serious end somebody will start to have doubts about certain public health messages the Government are sending out, and that then tips into other conspiracy theories that have really harmful, damaging consequences.

I saw that personally. My hon. Friend the Member for Warrington North eloquently put forward some really powerful examples of what she has been subjected to. With covid, some of the anti-vaccinators and anti-mask-wearers who targeted me quickly slipped into Sinophobia and racism. I was sent videos of people eating live animals, and I was blamed for a global pandemic.

The people who have been targeted do not stay in one lane. The idea that adults are not vulnerable, and susceptible, to such targeting and do not need protection from it is frankly for the birds. We see that particularly with extremism, misogyny and the incel culture. I take the point from our earlier discussion about who determines what crosses the legal threshold, but why do we have to wait until somebody is physically hurt before the Government act?

That is really regrettable. So, too, is the fact that this is such a huge U-turn in policy, with 15% of the Bill coming back to Committee. As we have heard, that is unprecedented, and yet, on the most pivotal point, we were unable to hear expert advice, particularly from the National Society for the Prevention of Cruelty to Children, Barnardo’s and the Antisemitism Policy Trust. I was struggling to understand why we would not hear expert advice on such a drastic change to an important piece of legislation—until I heard the hon. Member for Don Valley talk about offence. This is not about offence; it is about harm.

The hon. Member’s comments highlighted perfectly the real reason we are all here in a freezing cold Bill Committee, rehashing work that has already been done. The Bill was not perfect by any stretch of the imagination, but it was better than what we have today. The real reason we are here is the fight within the Conservative party.

Nick Fletcher (Don Valley) (Con)

No such fight has taken place. These are my personal views, and I genuinely believe that people have a right to say what they would like to say. That is free speech. There have been no fights whatever.

Sarah Owen

In that case, I must have been mistaken in thinking that the hon. Member—who has probably said quite a lot of things, which is why his voice is as hoarse as it is—was criticising the former Minister for measures that were agreed in previous Committee sittings.

For me, the current proposals are a really disappointing, retrograde step. They will not protect the most vulnerable people, including offline—this harm is not confined to the internet, but stretches out across all our communities. What happens online does not take place, and stay, in an isolated space; people are influenced by it and take their cues from it. They do not just take their cues from what is said in Parliament; they see misogynists online and think that they can treat people like that. They see horrific abuses of power and extreme pornography and, as we heard from the hon. Member for Aberdeen North, take their cues from that. What happens online does not stay online.

Alex Davies-Jones (Pontypridd) (Lab)

My hon. Friend makes an important point about what happens online and its influence on the outside world. We saw that most recently with Kanye West being reinstated to Twitter and allowed to spew his bile and abhorrent views about Jews. That had a real-world impact in the rise of antisemitic incidents on the streets, particularly in the US. The direct result of his being allowed to say those things online was Jews being harmed in the real world. That is exactly what is happening.

Sarah Owen

I thank the shadow Minister for that intervention. She is absolutely right. We have had a discussion about terms of reference and terms of service. Not only do most people not actually read or fully understand them, but they are subject to change. The moment Elon Musk took over Twitter, everything changed. Not only have we got Donald Trump back, but Elon Musk has given the keys to a mainstream social media platform to Kanye West. We have seen what happened as a result.

That is the situation on which the Government will now not shut the door. That is regrettable. For all the reasons we have heard today, it is really damaging. It is really disappointing that we are not taking the opportunity to lead in this area.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Paul Scully)

It is a pleasure to serve under your chairmanship, Dame Angela.

A lot of the discussion has replayed the debate from day two on Report about the removal of the “legal but harmful” measures. Some of the discussion this morning and this afternoon has covered really important issues such as self-harm, on which, as we said on the Floor of the House, we will introduce measures at a later stage. I will not talk about those measures now, but we have already said that if we agree that the promotion of things such as self-harm is wrong, it should be made illegal. Let us be very straight about how we deal with the promotion of self-harm.

The Bill will bring huge improvements for adult safety online. In addition to their duty to tackle illegal content, companies will have to provide adult users with tools to keep themselves safer. When we come to some of the other clauses, we will talk about the triple shield that was mentioned earlier. If content is illegal, it will still be illegal. If content does not adhere to the companies’ terms of service—that includes many of the issues that we have been debating for the last hour—it will have to be removed. We will come to user empowerment issues when we reach the later clauses.

--- Later in debate ---
Paul Scully

No, not about whether climate change is happening—we are talking about a much wider range of content. “Provides false information”: how do the companies determine what is false? I am not talking about the binary question of whether climate change is happening; climate change is a wide-ranging debate. “Provides false information” means that someone has to determine what is false and what is not, and the amendment basically outsources that determination to the social media platforms. That is not appropriate.

Sarah Owen

Would that not also apply to vaccine efficacy? If we are saying that everything is up for debate and nothing is a hard fact, we are entering a slightly strange world in which we undo a huge amount of progress, in particular on health.

Paul Scully

The amendment does not talk about vaccine efficacy; it talks about content that is harmful to health. That is a wide-ranging category.

--- Later in debate ---
Alex Davies-Jones

As we know, the clause makes provision in relation to the making of regulations designating primary and priority content that is harmful to children, and priority content that is harmful to adults. The Secretary of State may specify a description of content in regulations only if they consider that there is a material risk of significant harm to an appreciable number of children or adults in the United Kingdom presented by user-generated or search content of that description, and must consult Ofcom before making such regulations.

In the last Bill Committee, Labour raised concerns that there were no duties that required the Secretary of State to consult others, including expert stakeholders, ahead of making these regulations. That decision cannot be for one person alone. When it comes to managing harmful content, unlike illegal content, we can all agree that it is about implementing systems that prevent people from encountering it, rather than removing it entirely.

Sarah Owen

The fact that we are here again to discuss what one Secretary of State wanted to put into law, and what another is now seeking to remove before the Bill has even become law, suggests that my hon. Friend’s point about protection—about making sure that there are adequate parameters within which the Secretary of State must operate—is absolutely valid.

Alex Davies-Jones

I completely agree: we are now on our third Secretary of State, our third Minister and our third Prime Minister since we began considering this iteration of the Bill. It is vital that this does not come down to one person’s ideological beliefs. We have spoken at length about this issue; the hon. Member for Don Valley has spoken about his concerns that Parliament should be sovereign, and should make these decisions. It should not be for one individual or one stakeholder to make these determinations.

We also have issues with the Government’s chosen toggle approach, which we see as problematic. We have debated it at length, but our concerns regarding clause 56 are about the lack of consultation that the Secretary of State of the day, whoever that may be and whatever political party they belong to, will be required to undertake before making widespread changes to the regime. I am afraid that those concerns still exist, and they are held not just by us but by stakeholders and by Members of all political persuasions across the House. However, since our proposed amendment was voted down in the previous Bill Committee, nothing has changed. I will spare colleagues from once again hearing my pleas about the importance of consultation when it comes to determining all things related to online safety, but while Labour Members do not formally oppose the clause, we hope that the Minister will address our widespread concerns about the powers of the Secretary of State in his remarks.

--- Later in debate ---
Paul Scully

The different platforms, approaches and conditions will necessitate different numbers; it would be hard to pin a number down. The wording is vague and wide-ranging because it is trying to capture any number of scenarios, many as yet unknown. However, the regulations designating priority harms will be made under the draft affirmative resolution procedure.

Sarah Owen

On that point, which we discussed earlier—my hon. Friend the Member for Warrington North discussed it—I am struggling to understand what is an acceptable level of harm, and what is the acceptable number of people to be harmed, before a platform has to act.

Paul Scully

It totally depends on the scenario. It is very difficult for me to stand here now and give a wide range of examples, but the Secretary of State will be reacting to a given situation, rather than trying to predict them all.

--- Later in debate ---
Alex Davies-Jones

My hon. Friend is absolutely right. We want the Bill to work. We have always wanted the Bill to work. We want it to achieve its aim of keeping children, adults and everyone who uses the internet safe from the harms that are perpetuated there. If there is no transparency, how will we know whether the platforms are covertly breaking the rules, hiding content and getting round the rules? That is what they do; we know it, because we have heard it from whistleblowers, but we cannot rely on whistleblowers alone to highlight exactly what happens behind the closed doors of the platforms.

We need the transparency and the reports to be made public, so that we can see whether the legislation is working. If that does not happen, although we have waited five years, we will need another piece of legislation to fix it. We know that the Bill is not perfect, and the Minister knows that—he has said so himself—but, ultimately, we need to know that it works. If it does not, we have a responsibility as legislators to put something in place that does. Transparency is the only way in which we will figure that out.

Sarah Owen

I want to add to the brilliant points made by my hon. Friend the shadow Minister, in particular on the continually changing nature of market forces, which the Minister himself referenced. We want innovation. We want the tech companies to innovate—preferably ones in the UK—but we do not want to be playing catch-up as we are now, legislating retrospectively to right wrongs that have taken place because our legislative process has been too slow to deal with technological change: changes in social media and in apps, and in how we access data and communicate with one another online. The bare minimum is a twice-yearly report.

Within six months, if a new piece of technology comes up, it does not simply stay with one app or platform; that technology will be leapfrogged by others. Such technological advances can take place at a very rapid pace. The transparency aspect is important, because people should have a right to know what they are using and whether it is safe. We as policy makers should have a right to know clearly whether the legislation that we have introduced, or the legislation that we want to amend or update, is effective.

If we look at any other approach that we take to protect the health and safety of the people in our country—the people we all represent in our constituencies —we always say that prevention is better than cure. At the moment, without transparency and without researchers being able to update the information we need to see, we will constantly be playing catch-up with digital tech.

Kirsty Blackman

This may be the only place in the Bill where I do not necessarily agree wholeheartedly with the Labour Front Benchers. I agree with the vast majority of what was said, but I have some concerns about making it mandatory for transparency reports to be public in all circumstances, because there are circumstances in which that would simply highlight loopholes and allow people to exploit them in ways that we do not want.

Specifically on the regularity of reporting and the level of transparency: given that the Minister is keen on the commercial imperative and on ensuring that people are safe, we need a higher level of transparency than we currently see from the platforms. There is a very good case for some of the transparency reporting to be made public—in particular, for the very largest platforms to be required to publish it, or sections of it.

I want to talk about the speed of change to the terms of service and about proportionality. If Ofcom could request transparency reporting only annually, imagine that it received transparency information three days before Elon Musk took over Twitter. Twitter would be a completely different place three days later, and Ofcom would be unable to ask for more transparency information for a whole year, by which point a significant amount of damage could have been done. We have seen that the terms of service can change quickly. Ofcom would not have the flexibility to ask for an updated transparency report, even if drastic changes were made to the services.

Another thing slightly concerns me about doing this annually and not allowing a bit more flexibility. Let us say that a small platform that none of us has ever heard of, such as Mastodon, shoots to prominence overnight. Let us also say that Mastodon, as a small platform, was previously regulated, and that Ofcom had requested transparency information from it shortly before Elon Musk took over Twitter and people migrated to Mastodon. Mastodon would now be dealing with very different issues, with its significant number of users, from those it had when its user base was small. It would have changed dramatically, yet Ofcom would not have the flexibility to seek updated information. We know that platforms in the online world can have sudden, stellar increases in popularity overnight; some have been bubbling along for ages with hardly anybody using them, and not all of them are brand-new platforms that suddenly shoot to prominence. The lack of flexibility is a problem.

Lastly, I agree about researchers being able to access the transparency information provided. It is really important that we recognise that Ofcom is not the only expert. Ofcom has a huge amount of expertise, and it is massively increasing its staff numbers to cope with these issues, but the reality is that those staff are not academic researchers. They will not be able to look at every issue, and they are not necessarily the most prominent experts in fields such as child protection. That is not to take away from the expertise in Ofcom, but we could allow it to ask a regulated group of researchers to look at the information and point out any issues that may not have been spotted, particularly given the volume of transparency reports that there are likely to be.