All 8 Nick Fletcher contributions to the Online Safety Act 2023


Tue 24th May 2022: Online Safety Bill (Second sitting), Public Bill Committees, Committee stage
Thu 9th Jun 2022: Online Safety Bill (Seventh sitting), Public Bill Committees, Committee stage
Tue 14th Jun 2022: Online Safety Bill (Tenth sitting), Public Bill Committees, Committee stage
Tue 21st Jun 2022: Online Safety Bill (Fourteenth sitting), Public Bill Committees, Committee stage
Tue 12th Jul 2022: Online Safety Bill, Commons Chamber, Report stage (day 1)
Tue 13th Dec 2022: Online Safety Bill (First sitting), Public Bill Committees, Committee stage (re-committed clauses and schedules): 1st sitting
Tue 13th Dec 2022: Online Safety Bill (Second sitting), Public Bill Committees, Committee stage (re-committed clauses and schedules): 2nd sitting
Thu 15th Dec 2022: Online Safety Bill (Third sitting), Public Bill Committees, Committee stage (re-committed clauses and schedules): 3rd sitting

Online Safety Bill (Second sitting)

Nick Fletcher Excerpts
Committee stage
Tuesday 24th May 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 24 May 2022
Barbara Keeley:

Q Thank you. Can I ask for a bit more detail on a question that you touched on earlier with my colleague Kirsty Blackman? It is to Professor McGlynn, really. I think you included in your written evidence to the Committee a point about using age and consent verification for pornography sites for people featured in the content of the site—not the age verification assurance checks on the sites, but for the content. Could I just draw out from you whether that is feasible, and would it be retrospective for all videos, or just new ones? How would that work?

Professor Clare McGlynn: Inevitably, it would have to work from any time that that requirement was put in place, in reality. That measure is being discussed in the Canadian Parliament at the moment—you might know that Pornhub’s parent company, MindGeek, is based in Canada, which is why they are doing a lot of work in that regard. The provision was also put forward by the European Parliament in its debates on the Digital Services Act. Of course, any of these measures are possible; we could put it into the Bill that that will be a requirement.

Another way of doing it, of course, would be for the regulator to say that one of the ways in which Pornhub, for example—or XVideos or xHamster—should ensure that they are fulfilling their safety duties is by ensuring the age and consent of those for whom videos are uploaded. The flipside of that is that we could also introduce an offence for uploading a video and falsely representing that the person in the video had given their consent to that. That would mirror offences in the Fraud Act 2006.

The idea is really about introducing some element of friction so that there is a break before images are uploaded. For example, with intimate image abuse, which we have already talked about, the revenge porn helpline reports that for over half of the cases of such abuse that it deals with, the images go on to porn websites. So those aspects are really important. It is not just about all porn videos; it is also about trying to reduce the distribution of non-consensual videos.

Nick Fletcher (Don Valley) (Con):

Q I think that it would have been better to hear from you three before we heard from the platforms this morning. Unfortunately, you have opened my eyes to a few things that I wish I did not have to know about—I think we all feel the same.

I am concerned about VPNs. Will the Bill stop anyone accessing through VPNs? Is there anything we can do about that? I googled “VPNs” to find out what they were, and apparently there is a genuine need for them when using public networks, because it is safer. Costa Coffee suggests that people do so, for example. I do not know how we could work that.

You have obviously educated me, and probably some of my colleagues, about some of the sites that are available. I do not mix in circles where I would be exposed to that, but obviously children and young people do and there is no filter. If I did know about those things, I would probably not speak to my colleagues about it, because that would probably not be a good thing to do, but younger people might think it is quite funny to talk about. Do you think there is an education piece there for schools and parents? Should these platforms be saying to them, “Look, this is out there, even though you might not have heard of it—some MPs have not heard of it”? We ought to be doing something to protect children by telling parents what to look out for. Could there be something in the Bill to force them to do that? Do you think that would be a good idea? There is an awful lot there to answer—sorry.

Professor Clare McGlynn: On VPNs, I guess it is like so much technology: obviously it can be used for good, but it can also be used to evade regulations. My understanding is that individuals will be able to use a VPN to avoid age verification. On that point, I emphasise that in recent years Pornhub, at the same time as it was talking to the Government about developing age verification, was developing its own VPN app. At the same time it was saying, “Of course we will comply with your age verification rules.”

Don’t get me wrong: the age assurance provisions are important, because they will stop people stumbling across material, which is particularly important for the very youngest. In reality, 75% know about VPNs now, but once it becomes more widely known that this is how to evade it, I expect that all younger people will know how to do so. I do not think there is anything else you can do in the Bill, because you are not going to outlaw VPNs, for the reasons you identified—they are actually really important in some ways.

That is why the focus needs to be on content, because that is what we are actually concerned about. When you talk about media literacy and understanding, you are absolutely right, because we need to do more to educate all people, including young people—it does not just stop at age 18—about the nature of the pornography and the impact it can have. I guess that goes to the point about media literacy as well. It does also go to the point about fully and expertly resourcing sex and relationships education in school. Pornhub has its own sex education arm, but it is not the sex education arm that I think many of us would want to be encouraging. We need to be doing more in that regard.

Nick Fletcher:

Q This might sound like a silly question. Can we not just put age verification on VPN sites, so that you can only have VPN access if you have gone through age verification? Do you understand what I am saying?

Professor Clare McGlynn: I do. We are beginning to reach the limits of my technical knowledge.

Nick Fletcher:

You have gone beyond mine anyway.

Professor Clare McGlynn: You might be able to do that through regulations on your phone. If you have a phone that is age-protected, you might not be able to download a particular VPN app, perhaps. Maybe you could do that, but people would find ways to evade that requirement as well. We have to tackle the content. That is why you need to tackle Google and Twitter as well as the likes of Pornhub.

Nick Fletcher:

Can we have them back in, Sir Roger?

The Chair:

Minister?

Online Safety Bill (Seventh sitting)

Nick Fletcher Excerpts
Committee stage
Thursday 9th June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 9 June 2022
The Chair:

With your indulgence, Minister, Nick Fletcher would like to speak.

Nick Fletcher (Don Valley) (Con):

I have been contacted by a number of people about this clause, and they have serious concerns about the “have regard” statement. The Christian Institute said that it was

“promised ‘considerably stronger protections for free speech’, but the Bill does not deliver. Internet companies will be under ‘a duty to have regard to the importance of’ protecting free speech,”

but a “have regard” duty

“has no weight behind it. It is perfectly possible to…have regard to something…and then ignore it in practice.”

The “have regard” duty is not strong enough, and it is a real concern for a lot of people out there. Protecting children is absolutely imperative, but there are serious concerns when it comes to freedom of speech. Can the Minister address them for me?

Online Safety Bill (Tenth sitting)

Nick Fletcher Excerpts
Committee stage
Tuesday 14th June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 14 June 2022
Alex Davies-Jones:

Again, Labour supports moves to ensure that there is some clarity about specific content that is deemed to be harmful to adults, but of course the Opposition have concerns about the overall aim of defining harm.

The Government’s chosen approach to regulating the online space has left too much up to secondary legislation. We are also concerned that health misinformation and disinformation—a key harm, as we have all learned from the coronavirus pandemic—is missing from the Bill. That is why we too support amendment 83. The impact of health misinformation and disinformation is very real. Estimates suggest that the number of social media accounts posting misinformation about vaccines, and the number of users following those accounts, increased during the pandemic. Research by the Centre for Countering Digital Hate, published in November 2020, suggested that the number of followers of the largest anti-vaccination social media accounts had increased by 25% since 2019. At the height of the pandemic, it was also estimated that there were 5.4 million UK-based followers of anti-vaccine Twitter accounts.

Interestingly, an Ofcom survey of around 200 respondents carried out between 12 and 14 March 2021 found that 28% of respondents had come across information about covid-19 that could be considered false or misleading. Of those who had encountered such information, respondents from minority ethnic backgrounds were twice as likely as white respondents to say that the claims made them think twice about the issue. The survey found that of those people who were getting news and information about the coronavirus within the preceding week, 15% of respondents had come across claims that the coronavirus vaccines would alter human DNA; 18% had encountered claims that the coronavirus vaccines were a cover for the implant of trackable microchips; and 10% had encountered claims that the vaccines contained animal products.

Public health authorities, the UK Government, social media companies and other organisations all attempted to address the spread of vaccine misinformation through various strategies, including moderation of vaccine misinformation on social media platforms, ensuring the public had access to accurate and reliable information and providing education and guidance to people on how to address misinformation when they came across it.

Although studies do not show strong links between susceptibility to misinformation and ethnicity in the UK, some practitioners and other groups have raised concerns about the spread and impact of covid-19 vaccine misinformation among certain minority ethnic groups. Those concerns stem from research that shows historically lower levels of vaccine confidence and uptake among those groups. Some recent evidence from the UK’s vaccine roll-out suggests that that trend has continued for the covid-19 vaccine.

Data from the OpenSAFELY platform, which includes data from 40% of GP practices in England, covering more than 24 million patients, found that up to 7 April 2021, 96% of white people aged over 60 had received a vaccination compared with only 77% of people from a Pakistani background, 76% from a Chinese background and 69% of black people within the same age group. A 2021 survey of more than 172,000 adults in England on attitudes to the vaccine also found that confidence in covid-19 vaccines was highest in those of white ethnicity, with some 92.6% saying that they had accepted or would accept the vaccine. The lowest confidence was found in those of black ethnicity, at 72.5%. Some of the initiatives to tackle vaccine misinformation and encourage vaccine take-up were aimed at specific minority ethnic groups, and experts have emphasised the importance of ensuring that factual information about covid-19 vaccines is available in multiple different languages.

Social media companies have taken various steps to tackle misinformation on their platforms during the covid-19 pandemic, including removing or demoting misinformation, directing users to information from official sources and banning certain adverts. So, they can do it when they want to—they just need to be compelled to do it by a Bill. However, we need to go further. Some of the broad approaches to content moderation that digital platforms have taken to address misinformation during the pandemic are discussed in the Parliamentary Office of Science and Technology’s previous rapid response on covid-19 and misinformation.

More recently, some social media companies have taken specific action to counter vaccine misinformation. In February 2021, as part of its wider policies on coronavirus misinformation, Facebook announced that it would expand its efforts to remove false information about covid-19 vaccines, and other vaccines more broadly. The company said it would label posts that discuss covid-19 vaccines with additional information from the World Health Organisation. It also said it would signpost its users to information on where and when they could get vaccinated. Facebook is now applying similar measures to Instagram.

In March 2021, Twitter began applying labels to tweets that could contain misinformation about covid-19 vaccines. It also introduced a strike policy, under which users that violate its covid-19 misinformation policy five or more times would have their account permanently suspended.

YouTube announced a specific ban on covid-19 anti-vaccination videos in October 2020. It committed to removing any videos that contradict official information about the vaccine from the World Health Organisation. In March, the company said it had removed more than 30,000 misleading videos about the covid-19 vaccine since the ban was introduced. However, as with most issues, until the legislation changes, service providers will not feel truly compelled to do the right thing, which is why we must legislate and push forward with amendment 83.

Nick Fletcher (Don Valley) (Con):

I would like to speak to the clause rather than the amendment, Sir Roger. Is now the right time to do so, or are we only allowed to speak to the amendment?

The Chair:

It can be, in the sense that I am minded not to have a clause stand part debate.

Nick Fletcher:

Thank you, Sir Roger. I think that the Minister would agree that this is probably one of the most contentious parts of the Bill. It concerns legal but harmful content, which is causing an awful lot of concern out there. The clause says that the Secretary of State may in regulations define as

“priority content that is harmful to adults”

content that he or she considers to present

“a material risk of significant harm to an appreciable number of adults”.

We have discussed this issue in other places before, but I am deeply concerned about freedom of speech and people being able to say what they think. What is harmful to me may not be harmful to any other colleagues in this place. We would be leaving it to the Secretary of State to make that decision. I would like to hear the Minister’s thoughts on that.

Chris Philp:

I am very happy to reply to the various queries that have been made. I will start with the points on vaccine disinformation raised by the hon. Members for Ochil and South Perthshire and for Pontypridd. The Government strongly agree with the points they made about the damaging effects of vaccine misinformation and the fact that many of our fellow citizens have probably died as a result of being misled into refusing the vaccine when it is, of course, perfectly safe. We strongly share the concerns they have articulated.

Over the past two years, the Department for Digital, Culture, Media and Sport has worked together with other Departments to develop a strong operational response to this issue. We have established a counter-disinformation unit within DCMS whose remit is to identify misinformation and work with social media firms to get it taken down. The principal focus of that unit during the pandemic was, of course, covid. In the past three months, it has focused more on the Russia-Ukraine conflict, for obvious reasons.

In some cases, Ministers have engaged directly with social media firms to encourage them to remove content that is clearly inappropriate. For example, in the Russia-Ukraine context, I have had conversations with social media companies that have left up clearly flagrant Russian disinformation. This is, therefore, an area that the Government are concerned about and have been acting on operationally already.

Obviously, we agree with the intention behind the amendment. However, the way to handle it is not to randomly drop an item into the Bill and leave the rest to a statutory instrument. Important and worthy though it may be to deal with disinformation, and specifically harmful health-related disinformation, there are plenty of other important things that one might add that are legal but harmful to adults, so we will not accept the amendment. Instead, we will proceed as planned by designating the list via a statutory instrument. I know that a number of Members of Parliament, probably including members of this Committee, would find it helpful to see a draft list of what those items might be, not least to get assurance that health-related misinformation and disinformation is on that list. That is something that we are considering very carefully, and more news might be forthcoming as the Bill proceeds through Parliament.

--- Later in debate ---
Thirdly, and finally, let us think about how big platforms such as Facebook and Twitter confront such issues. The truth is that they behave in an arbitrary manner; they are not consistent in how they apply their own terms and conditions. They sometimes apply biases—a matter on which my right hon. Friend the Secretary of State commented recently. No requirement is placed on them to be consistent or to have regard to freedom of speech. So they do things such as cancel Donald Trump—people have their own views on that—while allowing Vladimir Putin’s propaganda to be spread. That is obviously inconsistent. They have taken down a video of my hon. Friend the Member for Christchurch (Sir Christopher Chope) speaking in the House of Commons Chamber. That would be difficult once the Bill is passed because clause 15 introduces protection for content of democratic importance. So I do not think that the legal but harmful duties infringe free speech. To the contrary, once the Bill is passed, as I hope it will be, it will improve freedom of speech on the internet. It will not make it perfect, and I do not pretend that it will, but it will make some modest improvements.
Nick Fletcher:

The argument has been made that the social media companies are doing this anyway, but two wrongs don’t make a right. We need to stop them doing it. I understand what we are trying to do here. We can see straight away that the Opposition want to be tighter on this. At a later date, if the Bill goes through as it is, freedom of speech will be gradually suppressed, and I am really concerned about that. My hon. Friend said that it would come back to Parliament, which I am pleased about. Are the priorities going to be written into the Bill? Will we be able to vote on them? If the scope is extended at any point in time, will we be able to vote on that, or will the Secretary of State just say, “We can’t have that so we’re just going to ban it”?

Chris Philp:

I will answer the questions in reverse order. The list of harms will not be in the Bill. The amendment seeks to put one of the harms in the Bill but not the others. So no, it will not be in the Bill. The harms—either the initial list or any addition to or subtraction from the list—will be listed in an affirmative statutory instrument, which means that the House will be able to look at it and, if it wants, to vote on it. So Parliament will get a chance to look at the initial list, when it is published in an SI. If anything is to be added in one, two or three years’ time, the same will apply.

Nick Fletcher:

So will we be able to vote on any extension of the scope of the Bill at any time? Will that go out to public consultation as well?

Chris Philp:

Yes. There is an obligation on the Secretary of State to consult—[Interruption.] Did I hear someone laugh?—before proposing a statutory instrument to add things. There is a consultation first and then, if extra things are going to be added—in my hon. Friend’s language, if the scope is increased—that would be votable by Parliament because it is an affirmative SI. So the answer is yes to both questions. Yes there will be consultation in advance, and yes, if this Government or a future Government wanted to add anything, Parliament could vote on it if it wanted to because it will be an affirmative SI. That is a really important point.

Online Safety Bill (Fourteenth sitting)

Nick Fletcher Excerpts
Committee stage
Tuesday 21st June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 21 June 2022
Chris Philp:

The hon. Member for Pontypridd says from a sedentary position that they have given consent. The consent is not built into the website’s terms and conditions; it is an assumed social norm for people on those websites. We need to tread carefully and be thoughtful, to ensure that by doing more to protect one group we do not inadvertently criminalise another.

There is a case for looking at the issue again. My right hon. Friend has made the point thoughtfully and powerfully, and in a way that suggests we can stay within the confines of the Law Commission’s advice, while being more thoughtful. I will certainly undertake to go away and do that, in consultation with my right hon. Friend and others.

Nick Fletcher (Don Valley) (Con):

I am pleased the Minister will go away and look at this. I am sure there are laws already in place that cover these things, but I know that this issue is very specific. An awful lot of the time, we put laws in place, but we could help an awful lot of people through education, although the last thing we want to do is victim blame. The Government could work with companies that provide devices and have those issued with the airdrop in contacts-only mode, as opposed to being open to everybody. That would stop an awful lot of people getting messages that they should not be receiving in the first place.

Chris Philp:

My hon. Friend makes a very powerful and important point. Hopefully, people listening to our proceedings will hear that, as well as those working on media literacy—principally, Ofcom and the Government, through their media literacy strategy. We have had a couple of specific tips that have come out of today’s debate. My right hon. Friend the Member for Basingstoke and my hon. Friend the Member for Don Valley mentioned disabling a device’s airdrop, or making it contacts-only. A point was also made about inadvertently sharing geolocations, whether through Snapchat or Strava. Those are two different but important points that the general public should be more aware of than they are.

Online Safety Bill

Nick Fletcher Excerpts
I regret that the Government do not feel able to support our proposition, but I think its time will come. A lot of the stuff that we are doing in this Bill is innovative, and we are not sure where everything will land. We are likely to get some things wrong and others right. I say to all Members, from across this House, that if we really want to reduce the amount of harmful abuse online, tackling anonymous abuse, rather than anonymity, must be central to our concerns. I urge my Front-Bench team and the Government to think carefully about this.
Nick Fletcher (Don Valley) (Con):

I rise to speak on amendments 50, 51 and 55, and I share the free speech concerns that I think lie behind amendment 151. As I said in Committee to the previous Minister, my hon. Friend the Member for Croydon South (Chris Philp), who knew this Bill inside out—it was amazing to watch him do it—I have deep concerns about how the duty on “legal but harmful” content will affect freedom of speech. I do not want people to be prevented from saying what they think. I am known for saying what I think, and I believe others should be allowed the same freedom, offline and online. What is harmful can be a subjective question, and many of us in this House might have different answers. When we start talking about restricting content that is perfectly legal, we should be very careful.

This Bill is very complex and detailed, as I know full well, having been on the Committee. I support the Bill—it is needed—but when it comes to legal but harmful content, we need to make sure that free speech is given enough protection. We have to get the right balance, but clause 19 does not do that. It says only that social media companies have

“a duty to have regard to the importance of protecting users’ right to freedom of expression within the law.”

There is no duty to do anything about freedom of speech; it just says, “You have to think about the importance of it”. That is not enough.

I know that the Bill does not state that social media companies have to restrict content—I understand that—but in the real world that is what will happen. If the Government define certain content as harmful, no social media company will want to be associated with it. The likes of Meta will want to be seen to get tough on legally defined harmful content, so of course it will be taken down or restricted. We have to counterbalance that instinct by putting stronger free speech duties in the Bill if we insist on it covering legal but harmful.

The Government have said that we cannot have stronger free speech obligations on private companies, and, in general, I agree with that. However, this Bill puts all sorts of other obligations on Facebook, Twitter and Instagram, because they are not like other private companies. These companies and their chief executive officers are household names all around the world, and their power and influence are incredible. In 2021, Facebook’s revenue was $117 billion, which is higher than the GDP—

Andrew Percy:

Is that not exactly why there has to be action on legal but harmful content? The cross-boundary, cross-national powers of these organisations mean that we have to insist that they take action against harm, whether lawful or unlawful. We are simply asking those organisations to risk assess and ensure that appropriate warnings are provided, just as they are in respect of lots of harms in society; the Government require corporations and individuals to risk assess those harms and warn about them. The fact that these organisations are so transnational and huge is absolutely why we must require them to risk assess legal but harmful content.

--- Later in debate ---
Nick Fletcher:

I understand what my hon. Friend is saying, but the list of what is legal but harmful will be set by the Secretary of State, not by Parliament. All we ask is for that to be discussed on the Floor of the House before we place those duties on the companies. That is all I am asking us to do.

Facebook has about 3 billion active users globally. That is more than double the population of China, the world’s most populous nation, and it is well over half the number of internet users in the entire world. These companies are unlike any others we have seen in history. For hundreds of millions of people around the world, they are the public square, which is how the companies have described themselves: Twitter founder Jack Dorsey said in 2018:

“We believe many people use Twitter as a digital public square. They gather from all around the world to see what’s happening, and have a conversation about what they see.”

In 2019, Mark Zuckerberg said:

“Facebook and Instagram have helped people connect with friends, communities, and interests in the digital equivalent of a town square.”

Someone who is blocked from these platforms is blocked from the public square, as we saw when the former President of the United States was blocked. Whatever we might think about Donald Trump, it cannot be right that he was banned from Twitter. We have to have stronger protection for free speech in the digital public square than clause 19 gives. The Bill gives the Secretary of State the power to define what is legal but harmful by regulations. As I have said, this is an area where free speech could easily be affected—

Adam Afriyie (Windsor) (Con):

I commend my hon. Friend for the powerful speech he is making. It seems to many of us here that if anyone is going to be setting the law or a regulation, it should really be done in the Chamber of this House. I would be very happy if we had annual debates on what may be harmful but is currently lawful, in order to make it illegal. I very much concur with what he is saying.

Nick Fletcher:

I thank my hon. Friend for his contribution, which deals with what I was going to finish with. It is not enough for the Secretary of State to have to consult Ofcom; there should be public consultation too. I support amendment 55, which my hon. Friend has tabled.

Anna McMorrin (Cardiff North) (Lab):

Not too long ago, the tech industry was widely looked up to and the internet was regarded as the way forward for democracy and freedoms. Today that is not the case. Every day we read headlines about data leaks, racist algorithms, online abuse, and social media platforms promoting, and becoming swamped in, misinformation, misogyny and hate. These problems are not simply the fault of those platforms and tech companies; they are the result of a failure to govern technology properly. That has resulted from years of muddled thinking and a failure to bring forward this Bill, and now, a failure to ensure that the Bill is robust enough.

Ministers have talked up the Bill, and I welcome the improvements that were made in Committee. Nevertheless, Ministers had over a decade in which to bring forward proposals, and in that time online crime exploded. Child sexual abuse online has become rife; the dark web provides a location for criminals to run rampant and scams are widespread.

Delay has also allowed disinformation to spread, including state-sponsored propaganda and disinformation, such as from Russia’s current regime. False claims and fake fact checks are going viral. That encourages other groups to adopt such tactics, in an attempt to undermine democracy, from covid deniers to climate change deniers—it is rampant.

Today I shall speak in support of new clause 3, to put violence against women and girls on the face of the Bill. As a female MP, I, along with my colleagues, have faced a torrent of abuse online, attacking me personally and professionally. I have been sent images such as that of a person with a noose around their neck, as well as numerous messages containing antisemitic and misogynistic abuse directed towards both me and my children. It is deeply disturbing, but also unsurprising, that one in five women across the country have been subjected to abuse; I would guess that that figure is actually much higher.

Online Safety Bill (First sitting)

Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 13 December 2022
Kirsty Blackman

Throughout the consideration of the Bill, I have been clear that I do not want it to end up simply being the keep MPs safe on Twitter Bill. That is not what it should be about. I did not mean that we should therefore take out everything that protects adults; what I meant was that we need to have a big focus on protecting children in the Bill, which thankfully we still do. For all our concerns about the issues and inadequacies of the Bill, it will go some way to providing better protections for children online. But saying that it should not be the keep MPs safe on Twitter Bill does not mean that it should not keep MPs safe on Twitter.

I understand how we have got to this situation. What I cannot understand is the Minister’s being willing to stand up there and say, “We can’t have these clauses because they are a risk to freedom of speech.” Why are they in the Bill in the first place if they are such a big risk to freedom of speech? If the Government’s No. 1 priority is making sure that we do not have these clauses, why did they put them in it? Why did it go through pre-legislative scrutiny? Why were they in the draft Bill? Why were they in the Bill? Why did they agree with them in Committee? Why did they agree with them on Report? Why have we ended up in a situation where, suddenly, there is a massive epiphany that they are a threat to freedom of speech and therefore we cannot possibly have them?

What is it that people want to say that they will be banned from saying as a result of this Bill? What is it that freedom of speech campaigners are so desperate to say online? Do they want to promote self-harm on platforms? Is that what people want to do? Is that what freedom of speech campaigners are out for? They are now allowed to do that as a result of the Bill.

Nick Fletcher (Don Valley) (Con)

I believe that the triple shield is being put in place of “legal but harmful”. That will enable users to put a layer of protection in so they can actually take control. But illegal content still has to be taken down: anything that promotes self-harm is illegal content and would still have to be removed. The problem with the way it was before is that we had a Secretary of State telling us what could be said out there and what could not. What may offend the hon. Lady may not offend me, and vice versa. We have to be very careful of that. It is so important that we protect free speech. We are now giving control to each individual who uses the internet.

Kirsty Blackman

The promotion of self-harm is not illegal content; people are now able to do that online—congratulations, great! The promotion of incel culture is not illegal content, so this Bill will now allow people to do that online. It will allow terms of service that do not require people to be banned for promoting incel culture, self-harm, not wearing masks and not getting a covid vaccine. It will allow the platforms to allow people to say these things. That is what has been achieved by campaigners.

The Bill is making people less safe online. We will continue to have the same problems that we have with people being driven to suicide and radicalised online as a result of the changes being made in this Bill. I know the Government have been leaned on heavily by the free speech lobby. I still do not know what people want to say that they cannot say as a result of the Bill as it stands. I do not know. I cannot imagine that anybody is not offended by content online that drives people to hurt themselves. I cannot imagine anybody being okay and happy with that. Certainly, I imagine that nobody in this room is okay and happy with that.

These people have won this war on the attack on free speech. They have won a situation where they are able to promote misogynistic, incel culture and health disinformation, where they are able to say that the covid vaccine is entirely about putting microchips in people. People are allowed to say that now—great! That is what has been achieved, and it is a societal issue. We have a generational issue where people online are being exposed to harmful content. That will now continue.

It is not just a generational societal thing—it is not just an issue for society as a whole that these conspiracy theories are pervading. Some of the conspiracy theories around antisemitism are unbelievably horrific, but they do not step over into illegality. Under the previous iteration of the Bill, David Icke would not have been able to stand up and suggest that the world is run by lizard people—who happen to be Jewish. He would not have been allowed to say that because it would have been considered harmful content. But now he is. That is fine. He is allowed to say that because this Bill is refusing to take action on it.

--- Later in debate ---
Kirsty Blackman

That is the thing: this Bill is supposed to be the Online Safety Bill. It is supposed to be about protecting people from the harm that can be done to them by others. It is also supposed to be about protecting people from the radicalisation and the harm that they can end up in. It is supposed to make a difference. It is supposed to be a game changer and a world leader.

Although, absolutely, I recognise the importance of the child-safety duties in the clauses and the difference that they will make, when people turn 18 they do not suddenly become different humans. They do not wake up on their 18th birthday as a different person from the one that they were before. They should not have to go from that level of protection, prior to 18, to being immediately exposed to comments and content encouraging them to self-harm, and to all of the negative things that we know are present online.

Nick Fletcher

I understand some of the arguments the hon. Lady is making, but that is a poor argument given that the day people turn 17 they can learn to drive or the day they turn 16 they can do something else. There are lots of these things, but we have to draw a line in the sand somewhere. Eighteen is when people become adults. If we do not like that, we can change the age, but there has to be a line in the sand. I agree with much of what the hon. Lady is saying, but that is a poor argument. I am sorry, but it is.

Kirsty Blackman

I do not disagree that overnight changes are involved, but the problem is that we are going from a certain level of protection to nothing; there will be a drastic, dramatic shift. We will end up with any vulnerable person who is over 18 being potentially subject to all this content online.

I still do not understand what people think they will have won as a result of having the provisions removed from the Bill. I do not understand how people can say, “This is now a substantially better Bill, and we are much freer and better off as a result of the changes.” That is not the case; removing the provisions will mean the internet continuing to be unsafe—much more unsafe than it would have been under the previous iteration of the Bill. It will ensure that more people are harmed as a result of online content. It will absolutely—

Nick Fletcher

Will the hon. Lady give way?

Kirsty Blackman

No, I will not give way again. The change will ensure that people can absolutely say what they like online, but the damage and harm that it will cause are not balanced by the freedoms that have been won.

Online Safety Bill (Second sitting)

Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 13 December 2022
Sarah Owen (Luton North) (Lab)

It is a pleasure to see you in the Chair, Dame Angela—I wish it was a toastier room. Let me add to the points that the shadow Minister, my hon. Friend the Member for Pontypridd, made so powerfully about vulnerable people. There is no cliff edge when such a person becomes 18. What thought have the Minister and the Department given to vulnerable young adults with learning disabilities or spectrum disorders? Frankly, the idea that, as soon as a person turns 18, they are magically no longer vulnerable is for the birds—particularly when it comes to eating disorders, suicide and self-harm.

Adults do not live in isolation, and they do not just live online. We have a duty of care to people. The perfect example is disinformation, particularly when it comes to its harmful impact on public health. We saw that with the pandemic and vaccine misinformation. We saw it with the harm done to children by the anti-vaccine movement’s myths about vaccines, children and babies. It causes greater harm than just having a conversation online.

People do not stay in one lane. Once people start being sucked into conspiracy myths, much as we discussed earlier around the algorithms that are used to keep people online, it has to keep ramping up. Social media and tech companies do that very well. They know how to do it. That is why I might start looking for something to do with ramen recipes and all of a sudden I am on to a cat that has decided to make noodles. It always ramps up. That is the fun end of it, but on the serious end somebody will start to have doubts about certain public health messages the Government are sending out. That then tips into other conspiracy theories that have really harmful, damaging consequences.

I saw that personally. My hon. Friend the Member for Warrington North eloquently put forward some really powerful examples of what she has been subjected to. With covid, some of the anti-vaccinators and anti-mask-wearers who targeted me quickly slipped into Sinophobia and racism. I was sent videos of people eating live animals, and being blamed for a global pandemic.

The people who have been targeted do not stay in one lane. The idea that adults are not vulnerable, and susceptible, to such targeting and do not need protection from it is frankly for the birds. We see that particularly with extremism, misogyny and the incel culture. I take the point from our earlier discussion about who determines what crosses the legal threshold, but why do we have to wait until somebody is physically hurt before the Government act?

That is really regrettable. So, too, is the fact that this is such a huge U-turn in policy, with 15% of the Bill coming back to Committee. As we have heard, that is unprecedented, and yet, on the most pivotal point, we were unable to hear expert advice, particularly from the National Society for the Prevention of Cruelty to Children, Barnardo’s and the Antisemitism Policy Trust. I was struggling to understand why we would not hear expert advice on such a drastic change to an important piece of legislation—until I heard the hon. Member for Don Valley talk about offence. This is not about offence; it is about harm.

The hon. Member’s comments highlighted perfectly the real reason we are all here in a freezing cold Bill Committee, rehashing work that has already been solved. The Bill was not perfect by any stretch of the imagination, but it was better than what we have today. The real reason we are here is the fight within the Conservative party.

Nick Fletcher (Don Valley) (Con)

No such fight has taken place. These are my personal views, and I genuinely believe that people have a right to say what they would like to say. That is free speech. There have been no fights whatever.

Sarah Owen

In that case, I must have been mistaken in thinking that the hon. Member—who has probably said quite a lot of things, which is why his voice is as hoarse as it is—was criticising the former Minister for measures that were agreed in previous Committee sittings.

For me, the current proposals are a really disappointing, retrograde step. They will not protect the most vulnerable people in our communities, including offline—this harm is not just online, but stretches out across all our communities. What happens online does not take place, and stay, in an isolated space; people are influenced by it and take their cues from it. They do not just take their cues from what is said in Parliament; they see misogynists online and think that they can treat people like that. They see horrific abuses of power and extreme pornography and, as we heard from the hon. Member for Aberdeen North, take their cues from that. What happens online does not stay online.

Online Safety Bill (Third sitting)

Committee stage (re-committed clauses and schedules)
Thursday 15th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 15 December 2022
Brought up, and read the First time.
Nick Fletcher (Don Valley) (Con)

I beg to move, That the clause be read a Second time.

It is a pleasure to serve under your chairmanship, Dame Angela. If you will allow, I want to apologise for comments made on the promotion of suicide and self-harm to adults. I believed that to be illegal, but apparently it is not. I am a free speech champion, but I do not agree with the promotion of this sort of information. I hope that the three shields will do much to stop those topics being shared.

I turn to new clause 9. I have done much while in this position to try to protect children, and that is why I followed the Bill as much as I could all the way through. Harmful content online is having tragic consequences for children. Cases such as that of Molly Russell demonstrate the incredible power of harmful material and dangerous algorithms. We know that the proliferation of online pornography is rewiring children’s brains and leading to horrendous consequences, such as child-on-child sexual abuse. This issue is of immense importance for the safety and protection of children, and for the future of our whole society.

Under the Bill, senior managers will not be personally liable for breaching the safety duties, and instead are liable only where they fail to comply with information requests or willingly seek to mislead the regulator. The Government must hardwire the safety duties to deliver a culture of compliance in regulated firms. The Bill must be strengthened to actively promote cultural change in companies and embed compliance with online safety regulations at board level.

We need a robust corporate and senior management liability scheme that imposes personal liability on directors whose actions consistently and significantly put children at risk. The Bill must learn lessons from other regulated sectors, principally financial services, where regulation imposes specific duties on the directors and senior managers of financial institutions, and those responsible individuals face regulatory enforcement if they act in breach of such duties.

The Joint Committee on the draft Online Safety Bill, which conducted pre-legislative scrutiny, recommended that a senior manager at or reporting to board level

“should be designated the ‘Safety Controller’ and made liable for a new offence: the failure to comply with their obligations as regulated service providers when there is clear evidence of repeated and systemic failings that result in a significant risk of serious harm to users.”

Some 82% of UK adults would support the appointment of a senior manager to be held liable for children’s safety on social media sites, and I believe that the measure is also backed by the NSPCC.

There is no direct relationship in the Bill between senior management liability and the discharge by a platform of its safety duties. The Government have repeatedly argued against the designation of a specific individual as a safety controller for some understandable reasons: an offence could be committed by the company without the knowledge of the named individual, and the arrangement would allow many senior managers and directors to face no consequences. However, new clause 9 would take a different approach by deeming any senior employee or manager at the company to be a director for the purposes of the Bill.

The concept of consent or connivance is already used in other Acts of Parliament, such as the Theft Act 1968 and the Health and Safety at Work etc. Act 1974. In other words, if a tech platform is found to be in breach of the Online Safety Bill—once it has become an Act—with regard to its duties to children, and it can be proven that this breach occurred with the knowledge or consent of a senior person, that person could be held criminally liable for the breach.

I have been a director in the construction industry for many years. There is a phrase in the industry that the company can pay the fine, but it cannot do the time. I genuinely believe that holding directors criminally liable will ensure that the Bill, which is good legislation, will really be taken seriously. I hope the Minister will agree to meet me to discuss this further.

--- Later in debate ---
For the reasons that I have given, I strongly believe that the Bill’s approach to enforcement will be effective. It will protect users without introducing incentives for managers to remove swathes of content out of fear of prosecution. I want to make sure that the legislation gets on the books and is proportionate, and that we do not start gold-plating it with these sorts of measures now, because we risk disrupting the balance that I think we have achieved in the Bill as amended.
Nick Fletcher

I appreciate the Minister’s comments, but from what my hon. Friends the Members for Folkestone and Hythe, for Eastbourne, and for Redditch said this morning about TikTok—these sorts of images get to children within two and a half minutes—it seems that there is a cultural issue, which the hon. Member for Pontypridd mentioned. Including new clause 9 in the Bill would really ram home the message that we are taking this seriously, that the culture needs to change, and that we need to do all that we can. I hope that the Minister will speak to his colleagues in the Ministry of Justice to see what, if anything, can be done.

Paul Scully

I forgot to respond to my hon. Friend’s question about whether I would meet him. I will happily meet him.

Nick Fletcher

I appreciate that. We will come back to this issue on Report, but I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

Question proposed, That the Chair do report the Bill, as amended, to the House.

The Chair

It is usual at this juncture for there to be a few thanks and niceties, if people wish to give them.