Online Safety Bill (Tenth sitting) Debate

Alex Davies-Jones

Again, Labour supports moves to ensure that there is some clarity about specific content that is deemed to be harmful to adults, but of course the Opposition have concerns about the overall aim of defining harm.

The Government’s chosen approach to regulating the online space has left too much up to secondary legislation. We are also concerned that health misinformation and disinformation—a key harm, as we have all learned from the coronavirus pandemic—is missing from the Bill. That is why we too support amendment 83. The impact of health misinformation and disinformation is very real. Estimates suggest that the number of social media accounts posting misinformation about vaccines, and the number of users following those accounts, increased during the pandemic. Research by the Centre for Countering Digital Hate, published in November 2020, suggested that the number of followers of the largest anti-vaccination social media accounts had increased by 25% since 2019. At the height of the pandemic, it was also estimated that there were 5.4 million UK-based followers of anti-vaccine Twitter accounts.

Interestingly, an Ofcom survey of around 200 respondents carried out between 12 and 14 March 2021 found that 28% of respondents had come across information about covid-19 that could be considered false or misleading. Of those who had encountered such information, respondents from minority ethnic backgrounds were twice as likely as white respondents to say that the claims they encountered made them think twice about the issue. The survey found that of those people who were getting news and information about the coronavirus within the preceding week, 15% of respondents had come across claims that the coronavirus vaccines would alter human DNA; 18% had encountered claims that the coronavirus vaccines were a cover for the implant of trackable microchips; and 10% had encountered claims that the vaccines contained animal products.

Public health authorities, the UK Government, social media companies and other organisations all attempted to address the spread of vaccine misinformation through various strategies, including moderating vaccine misinformation on social media platforms, ensuring that the public had access to accurate and reliable information, and providing education and guidance on how to address misinformation when people came across it.

Although studies do not show strong links between susceptibility to misinformation and ethnicity in the UK, some practitioners and other groups have raised concerns about the spread and impact of covid-19 vaccine misinformation among certain minority ethnic groups. Those concerns stem from research that shows historically lower levels of vaccine confidence and uptake among those groups. Some recent evidence from the UK’s vaccine roll-out suggests that that trend has continued for the covid-19 vaccine.

Data from the OpenSAFELY platform, which includes data from 40% of GP practices in England, covering more than 24 million patients, found that up to 7 April 2021, 96% of white people aged over 60 had received a vaccination compared with only 77% of people from a Pakistani background, 76% from a Chinese background and 69% of black people within the same age group. A 2021 survey of more than 172,000 adults in England on attitudes to the vaccine also found that confidence in covid-19 vaccines was highest in those of white ethnicity, with some 92.6% saying that they had accepted or would accept the vaccine. The lowest confidence was found in those of black ethnicity, at 72.5%. Some of the initiatives to tackle vaccine misinformation and encourage vaccine take-up were aimed at specific minority ethnic groups, and experts have emphasised the importance of ensuring that factual information about covid-19 vaccines is available in multiple languages.

Social media companies have taken various steps to tackle misinformation on their platforms during the covid-19 pandemic, including removing or demoting misinformation, directing users to information from official sources and banning certain adverts. So, they can do it when they want to—they just need to be compelled to do it by a Bill. However, we need to go further. Some of the broad approaches to content moderation that digital platforms have taken to address misinformation during the pandemic are discussed in the Parliamentary Office of Science and Technology’s previous rapid response on covid-19 and misinformation.

More recently, some social media companies have taken specific action to counter vaccine misinformation. In February 2021, as part of its wider policies on coronavirus misinformation, Facebook announced that it would expand its efforts to remove false information about covid-19 vaccines, and other vaccines more broadly. The company said it would label posts that discuss covid-19 vaccines with additional information from the World Health Organisation. It also said it would signpost its users to information on where and when they could get vaccinated. Facebook is now applying similar measures to Instagram.

In March 2021, Twitter began applying labels to tweets that could contain misinformation about covid-19 vaccines. It also introduced a strike policy, under which users who violate its covid-19 misinformation policy five or more times have their accounts permanently suspended.

YouTube announced a specific ban on covid-19 anti-vaccination videos in October 2020. It committed to removing any videos that contradict official information about the vaccine from the World Health Organisation. In March, the company said it had removed more than 30,000 misleading videos about the covid-19 vaccine since the ban was introduced. However, as with most issues, until the legislation changes, service providers will not feel truly compelled to do the right thing, which is why we must legislate and push forward with amendment 83.

Nick Fletcher (Don Valley) (Con)

I would like to speak to the clause rather than the amendment, Sir Roger. Is now the right time to do so, or are we only allowed to speak to the amendment?

The Chair

It can be, in the sense that I am minded not to have a clause stand part debate.

Nick Fletcher

Thank you, Sir Roger. I think that the Minister would agree that this is probably one of the most contentious parts of the Bill. It concerns legal but harmful content, which is causing an awful lot of concern out there. The clause says that the Secretary of State may in regulations define as

“priority content that is harmful to adults”

content that he or she considers to present

“a material risk of significant harm to an appreciable number of adults”.

We have discussed this issue in other places before, but I am deeply concerned about freedom of speech and people being able to say what they think. What is harmful to me may not be harmful to any other colleagues in this place. We would be leaving it to the Secretary of State to make that decision. I would like to hear the Minister’s thoughts on that.

Chris Philp

I am very happy to reply to the various queries that have been made. I will start with the points on vaccine disinformation raised by the hon. Members for Ochil and South Perthshire and for Pontypridd. The Government strongly agree with the points they made about the damaging effects of vaccine misinformation and the fact that many of our fellow citizens have probably died as a result of being misled into refusing the vaccine when it is, of course, perfectly safe. We strongly share the concerns they have articulated.

Over the past two years, the Department for Digital, Culture, Media and Sport has worked together with other Departments to develop a strong operational response to this issue. We have established a counter-disinformation unit within DCMS whose remit is to identify misinformation and work with social media firms to get it taken down. The principal focus of that unit during the pandemic was, of course, covid. In the past three months, it has focused more on the Russia-Ukraine conflict, for obvious reasons.

In some cases, Ministers have engaged directly with social media firms to encourage them to remove content that is clearly inappropriate. For example, in the Russia-Ukraine context, I have had conversations with social media companies that have left up clearly flagrant Russian disinformation. This is, therefore, an area that the Government are concerned about and have been acting on operationally already.

Obviously, we agree with the intention behind the amendment. However, the way to handle it is not to randomly drop an item into the Bill and leave the rest to a statutory instrument. Important and worthy though it may be to deal with disinformation, and specifically harmful health-related disinformation, there are plenty of other important things that one might add that are legal but harmful to adults, so we will not accept the amendment. Instead, we will proceed as planned by designating the list via a statutory instrument. I know that a number of Members of Parliament, probably including members of this Committee, would find it helpful to see a draft list of what those items might be, not least to get assurance that health-related misinformation and disinformation is on that list. That is something that we are considering very carefully, and more news might be forthcoming as the Bill proceeds through Parliament.

--- Later in debate ---
Thirdly, and finally, let us think about how big platforms such as Facebook and Twitter confront such issues. The truth is that they behave in an arbitrary manner; they are not consistent in how they apply their own terms and conditions. They sometimes apply biases—a matter on which my right hon. Friend the Secretary of State commented recently. No requirement is placed on them to be consistent or to have regard to freedom of speech. So they do things such as cancel Donald Trump—people have their own views on that—while allowing Vladimir Putin’s propaganda to be spread. That is obviously inconsistent. They have taken down a video of my hon. Friend the Member for Christchurch (Sir Christopher Chope) speaking in the House of Commons Chamber. That would be difficult once the Bill is passed because clause 15 introduces protection for content of democratic importance. So I do not think that the legal but harmful duties infringe free speech. To the contrary, once the Bill is passed, as I hope it will be, it will improve freedom of speech on the internet. It will not make it perfect, and I do not pretend that it will, but it will make some modest improvements.
Nick Fletcher

The argument has been made that the social media companies are doing this anyway, but two wrongs don't make a right. We need to stop them doing it. I understand what we are trying to do here. We can see straight away that the Opposition want to be tighter on this. At a later date, if the Bill goes through as it is, freedom of speech will be gradually suppressed, and I am really concerned about that. My hon. Friend said that it would come back to Parliament, which I am pleased about. Are the priorities going to be written into the Bill? Will we be able to vote on them? If the scope is extended at any point, will we be able to vote on that, or will the Secretary of State just say, "We can't have that, so we're just going to ban it"?

Chris Philp

I will answer the questions in reverse order. The list of harms will not be in the Bill. The amendment seeks to put one of the harms in the Bill, but not the others, so no, it will not be in the Bill. The harms—either the initial list or any addition to or subtraction from that list—will be set out in an affirmative statutory instrument, which means that the House will be able to look at it and, if it wants, to vote on it. So Parliament will get a chance to look at the initial list when it is published in an SI, and if anything is to be added in one, two or three years' time, the same will apply.

Nick Fletcher

So will we be able to vote on any extension of the scope of the Bill at any time? Will that go out to public consultation as well?

Chris Philp

Yes. There is an obligation on the Secretary of State to consult—[Interruption.] Did I hear someone laugh?—before proposing a statutory instrument to add things. There is a consultation first, and then, if extra things are going to be added—in my hon. Friend's language, if the scope is increased—that would be votable by Parliament, because it is an affirmative SI. So the answer is yes to both questions. Yes, there will be consultation in advance, and yes, if this Government or a future Government wanted to add anything, Parliament could vote on it if it wanted to, because it will be an affirmative SI. That is a really important point.