Wednesday 7th October 2020


Westminster Hall


Holly Lynch

My hon. Friend, who has vast experience in this area, references some of the most extreme and harrowing online experiences, to which our children are now regularly exposed. We absolutely must re-resource this area to get a grip of it and prevent children from becoming victims, as they do every day that we fail to tighten up the rules and regulations surrounding the use of the internet.

I also ask the Minister whether the legislation will include—it should—regulation of, or rather the removal of, misinformation and disinformation online. Will it seek to regulate much more of what is harmful and hateful but is not necessarily criminal, from a public health perspective if nothing else? Will the proposed duty of care be properly underpinned by a statutory framework? Just how significant will the consequences be for those who do not adhere to it?

The Government announced the suspension of the implementation of an age-verification regime for commercial pornography sites on 16 October 2019, despite the fact that it only needed a commencement date. It is not at all clear why that was or when it will be reintroduced. I hope that the Minister can enlighten us about when the regime will come into effect.

The Local Government Association has raised important concerns. Local authorities have statutory safeguarding responsibilities on issues such as child exploitation, as we have just heard, suicide prevention and tackling addiction, all of which become incredibly difficult when a child or young person—or an adult, for that matter—goes online. It had to produce the “Councillors’ guide to handling intimidation”, which recognises the growing need among councillors for support related to predominantly online intimidation. That is another damning indication of just how bad things have become.

I have worked with these groups on this issue and have been overwhelmed with suggestions for what more could be done. First, no one should be able to set up an entirely anonymous profile on social media platforms. The rise in bots and people hiding behind anonymous profiles to push hate and abuse should simply no longer be allowed. People would not necessarily have to put all their information in the public domain, but they would need to provide accurate information in order to set up an account or a profile. That approach is explicitly called for in two of the public petitions attached to the debate, demonstrating that there is public support for it. It would allow us to hold both the platform and the individuals responsible to account for any breaches of conduct.

Imagine if being held to account for posting something defined as abusive under the online harms Bill, such as hateful antisemitic content, meant that an appropriate agency—be it Ofcom, the police or the enforcement arm of a new regulator—could effectively issue on-the-spot fines to the perpetrator. If we can identify the perpetrator, we can also work with the police to determine whether a hate crime has occurred and bring charges wherever possible. The increased resources necessary for such an approach would be covered by the revenue generated by those fines. That type of approach would be transformative. Can the Minister respond to that point—not necessarily to me, but to all those who have signed the petitions before us, which ask for that kind of thinking?

Fearing that the Government lack the will to adopt the radical approach that is required, the working group that I spoke about will look to get more and more advertisers on board that are prepared to pull their advertising from social media platforms if the sorts of transformations that we are calling for are not forthcoming. I put everyone on notice that that work is well under way.

On securing the debate, I was approached by colleagues from all parties, and I am pleased that so many are able to take part. Given just how broad this topic is, I have not said anything about extremist and radical content online, gang violence, cyber-bullying, self-harm, explicit and extreme content, sexual content, grooming, gaming and gambling, and the promotion of eating disorders. I am sure others will say more about such things, but I fear the Government will say that there is so much to regulate that they are struggling to see the way forward. There is so much there that it is a dereliction of duty every day that we fail to regulate this space and keep damaging content from our young people and adults alike.

We know that this is an international issue, and Plan International has just released the results of its largest ever global survey on online violence after speaking to 14,000 girls aged 15 to 25 across 22 countries. The data reveal that nearly 60% have been harassed or abused online, and that one in five girls have left a social media platform or significantly reduced their use of it after being harassed. This plea goes to the social media companies as well: if they want to have users in the future who can enjoy what they provide, they must create a safe space. Currently, they simply do not. It is an international issue, but we are the mother of Parliaments, are we not?

The Government seem so overwhelmed by the prospect of doing everything that they are not doing anything. I urge the Minister to start that process. Take those first steps, because each one will make some difference in bringing about the change that we have a moral obligation to deliver.

Sir Edward Leigh (in the Chair)

I remind Members that the convention still applies: if you wish to speak, you should be present at the beginning. There are quite a large number of people on the call list, so please restrict your comments to about four minutes; otherwise, I will have to impose a time limit. I will call Members in the order of the call list, starting with Andrew Percy.

--- Later in debate ---
Andrew Percy

It is absolutely shocking. It should not take legislation to deal with it; it is obvious that the content should not be there. We need the Government to legislate, as I shall come on to in a moment, but it takes no brain surgeon to figure this stuff out. Sadly, too many platforms do not do enough.

Then of course there was the shocking Wiley incident, when he was tweeting on average every 87 seconds, which is incredible. There were, on a conservative estimate, 600 tweets of vile antisemitic abuse, which were seen online by more than 47 million people. Let us just consider some examples of it. He tweeted:

“If you work for a company owned by 2 Jewish men and you challenge the Jewish community in anyway of course you will get fired.”

Another one was:

“Infact there are 2 sets of people who nobody has really wanted to challenge #Jewish & #KKK but being in business for 20 years you start to undestand why:”

Then—something completely disgusting:

“Jewish people you think you are too important I am sick of you”

and

“Jewish people you make me sick and I will not budge”.

It took days. As I said, it took, at a conservative estimate, 600 tweets before anything was done about it. Instagram videos were posted. When one platform closed it down it ended up elsewhere. That is despite all the terms and conditions in place.

Enforcement is, sadly, all too invisible, as the hon. Member for Cardiff South and Penarth (Stephen Doughty) has highlighted, with regard to Radio Aryan. I was pleased that Wiley was stripped of his honour, but he should never have been able to get into the position of being able to spout that bile for so long. The best we have been able to do is strip him of an honour. It is completely and utterly unacceptable.

There is a similar problem with other platforms. I want to talk briefly about BitChute. It is an alternative platform, but we see the same old tropes there. Videos get millions of views there. It is a nastier version of YouTube—let us be honest—with videos in the name of the proscribed group National Action, a channel, for example, with the name “Good Night Jewish Parasite”, livestreaming of terrorist content, racist videos about Black Lives Matter protesters and much more; but it is a UK-based platform with UK directors, and while action is taken against individual videos there is, sadly, not enough recourse. Given the time limits, I shall quickly ask two questions and make two comments on legislation and where we are heading.

The online harms White Paper suggested a number of codes of practice, and that seems to have been rowed back from somewhat in recent weeks and months, so that there will be reliance, instead, on the terms and conditions. I do not think that that is enough. I hope that the Minister will confirm that enforceable codes of practice will flow. I also hope that, if she has some time, she will meet me, the Antisemitism Policy Trust and other partners to discuss the matter in more detail.

Will the Minister consider introducing senior management liability for social media companies? The German model for fines is often talked about, but it has not worked. The maximum fine so far issued in Germany is, I think, two million dollars or pounds, which is nothing for Facebook. It can afford to build that into its programme.

There is plenty more I could have said—I am conscious of the time—but I hope the Minister will commit to meet with us and respond to those two points.

Sir Edward Leigh (in the Chair)

I remind Members that unless we keep to four minutes, we will not get everybody in.

Chris Elmore (Ogmore) (Lab)

Thank you, Sir Edward; the pressure is on. I congratulate my hon. Friend the Member for Halifax (Holly Lynch), as I have already said. I remember a debate on online harms some four years ago, when I first entered the House, when only three Members were in this room. Clearly our numbers are restricted today, but it is great to see a full Westminster Hall, as more and more Members come to realise the huge problems that the online platforms create.

Being aware of the time, I want to stick to two areas where I think the platforms are failing. First, I have raised anti-vax issues throughout the summer and since the pandemic started. In the last year an additional 7.8 million people have visited anti-vax Facebook pages or followed the Twitter, Instagram or YouTube accounts of organisations that are trying to make a quick buck out of people’s fears by selling false vaccines and treatments and telling people not to consult a doctor if they have any symptoms—“Don’t get tests because you can cure these things with different types of herbal tea”.

Across all the platforms—none is particularly worse than the others in my view, because they all have a responsibility—the argument that comes back is: “It’s a point of view: a position they could take, if you could form an argument, about this possibly being the way forward on covid.” Sadly, I have heard Members of this House suggest that covid is no worse than flu, despite all clinical professionals saying that is not the case. This gets picked up on anti-vax platforms, which quote Members saying, “You don’t have to wear a mask, you don’t have to get a vaccine and you don’t have to worry about it, because it’s no worse than flu”. Even if the Member has not said that, the sites twist their words into that position. How the platforms manage that is a huge concern.

I welcomed Facebook’s intervention yesterday to take down President Trump’s comments about covid. It is nice to see an intervention at that level, confirming that he indeed spouts fake news. It is about time Facebook did a lot more of that to address what is happening in this pandemic.

My second point is about the protection of children and young people. I have a huge concern about cyber-bullying and the targeting of young people, and specifically the growing number of young people being coerced, via gaming or the platforms or livestreaming, into committing sexual acts of harm against themselves, material that then moves on to the dark web. The Internet Watch Foundation says that Europe is the grooming capital of the world—it is mainly in the Netherlands, but it is on the increase in this country. I have already mentioned the concern of the IWF and the Met about the need for the Government to put more resources into getting these URLs taken down. There is a real fear among the tech community that young people are being taught how to abuse themselves by people who are grooming them. I know the Minister cares about this—we have spoken about it before. It needs to be rectified.

My two asks, in the half a minute left to me, are that we introduce the Bill as quickly as possible and that it is robust and clear, and takes on the platforms. I agree with the hon. Member for Brigg and Goole (Andrew Percy) that it cannot be about the platforms setting their own regulations and then Ofcom deciding what should or should not be controlled and fines being issued. There should be culpability.

My final ask to the Minister is to create a social media levy fund that allows research into this issue. Make the platforms pay for the fact that they are not willing to do half of this work themselves, for the sake of young people, politicians, people in public life and people in the street who believe the fake news or the anti-vax information, because they are fearful of something. If they do not take responsibility, they should be fined for the dishonour of not dealing with these problems.

Sir Edward Leigh (in the Chair)

Well done—four and a half minutes.

--- Later in debate ---
Darren Jones (Bristol North West) (Lab)

It is a pleasure to serve under your chairmanship again, Sir Edward, and I congratulate my hon. Friend the Member for Halifax (Holly Lynch) on securing this important and timely debate. I also thank House officials for ensuring that Westminster Hall is open once again, so that we can have these debates. Before I begin my remarks, I will note my declarations of interest: my chairmanship of the Parliamentary Internet, Communications and Technology Forum all-party parliamentary group and of the APPG on technology and national security; my chairmanship of Labour Digital and the Institute of Artificial Intelligence; and my previous professional work on these issues as a technology lawyer, as noted in the Register of Members’ Financial Interests.

The online harms Bill will be a big and important piece of legislation, covering a range of difficult issues, from defining content that is harmful but not illegal and how we protect children, through to ensuring an effective regulatory framework that delivers a meaningful duty of care. Given the time, I will not rehearse the many important arguments for getting this right; I will keep my remarks short, both to give the Minister enough time to give substantive and full answers and so that other colleagues have a chance to contribute. The Secretary of State confirmed to the House in early September that the full response to the White Paper would be published this year—that is, 2020—and that legislation would be introduced early next year, which is 2021. On that basis, I have three sets of questions.

First, can the Minister confirm whether the publication of the full response to the White Paper is currently allocated to her Department’s forward grid, and if so, when it is pencilled in for publication? My understanding is that it will be published between now and December. Could she also tell us whether the Department has secured a legislative slot with the Leader of the House for First Reading, and if so, give us a rough idea of when that might be? Does the Department envisage a period of prelegislative scrutiny before Second Reading? If it does, what role will the House of Lords play in that?

Secondly, can the Minister reassure us that the initial scope of the duty of care and the enforcement powers being made available to the regulator have not been watered down, and that she agrees with me that, while it is difficult to define what is harmful but not illegal, Parliament is the body best placed to do so, not private companies? Will she also reassure us that the passage of this Bill will not be linked to negotiations with the United States on the UK-US trade deal, given that we know that the United States has placed liability loopholes for platforms in trade deals with other countries?

Finally, will the Minister confirm that the answer I received from the Security Minister on the Floor of the House—that the online harms Bill will include provisions for enhancing sovereign defensive and offensive digital capabilities—is correct? If so, will she tell us whether the progression of the Bill is linked to the ongoing integrated review?

Sir Edward Leigh (in the Chair)

Textbook timekeeping.

--- Later in debate ---
Fleur Anderson (Putney) (Lab)

It is an honour to serve under your chairmanship, Sir Edward. I congratulate and thank my hon. Friend the Member for Halifax (Holly Lynch) for securing this important and, I hope, influential debate.

Online harms are one of the biggest worries faced by parents across my constituency and across the country. As a parent, I am very worried about what is happening in the safety of my own home, which I cannot control. Speaking to other parents, I know that it is a shared concern. In our own homes, children can have free and unfettered access to pornography and to people inciting young people to violent hate and extremist views. Women can be threatened with the sharing of intimate images, which can cause long-lasting damage. Our online world must be a safe and positive place for us all to explore, including our children, but at present it is not. Providers are not taking action. Parents just cannot keep up. Self-regulation is definitely not cutting it, and online harm in our society is spiralling out of control.

The 2015 Conservative manifesto pledged that

“we will stop children’s exposure to harmful sexualised content online, by requiring age verification for access to all sites containing pornographic material.”

Well, it is time to come good on that commitment. If the Government had acted sooner, large numbers of children would not have been harmed by avoidable online experiences during lockdown. The consequences of ongoing inaction are severe and widespread. Our children can never unsee images they have stumbled across in all innocence in their own home. There are more children online for more time with more anxiety, yet there is less regulation, less action taken by providers and more sex offenders online.

I want to highlight three key issues. The first is pornography. According to the NSPCC, in the first three months of this year, more than 100 sex crimes against children were recorded every day. Lockdown led to a spike in online child abuse, meaning that the figure is now much higher. The second issue is youth violence. The Mayor of London and deputy mayor for policing and crime have been vocal about the role of the internet in spreading violent messages and incitement to commit serious youth violence. That is around us every day. The third issue is threats to share intimate images. One in 14 adults and one in seven young women have experienced such threats. As a mother of two daughters, I am really concerned about that, and I know that parents across the country share that concern.

The sharing of intimate images was made a crime in 2015, but threatening to share them, which can be just as damaging, is not illegal in England and Wales, although it is in Scotland. The threats are used to control, damage and affect mental health, and one in 10 survivors said that the threats had made them feel suicidal. There is also a substantial body of evidence suggesting that exposure to pornography is harmful to children and young people and can have a damaging impact on young people’s views of sex and relationships, leading to a belief that women are sex objects. There are links to sexually coercive behaviour and higher rates of sexual harassment and forced sex. We simply cannot let this situation go unregulated any longer, so I have some questions for the Minister.

When will the First Reading of the online harms Bill be? Is there urgency to tackle online harms? Will the Minister commit to introducing legislation to outlaw threats to share intimate images as part of the Domestic Abuse Bill? Can she introduce a statutory instrument to designate the British Board of Film Classification as the regulator? That could be done very quickly and would enable age verification for pornographic websites. Will the online harms Bill contain strong and robust action, with a framework of comprehensive regulations and a new regulator that can adapt to issues that will come up in future that we do not even know about yet?

It is time for tough action. We have really strict limits against hate speech and pornography in other areas of life, but just where most children are most of the time is where the Government are failing in their duty of care.

Sir Edward Leigh (in the Chair)

We now come to the summing-up speeches. I call Gavin Newlands.