Committee stage & Committee Debate - 3rd sitting
Thursday 26th May 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 26 May 2022
The Chair

Before we hear oral evidence, I invite Members to declare any interests in connection with the Bill.

Alex Davies-Jones (Pontypridd) (Lab)

I need to declare an interest, Ms Rees. Danny Stone from the Antisemitism Policy Trust provides informal secretariat in a personal capacity to the all-party parliamentary group on wrestling, which I co-chair.

The Chair

That is noted. Thank you.

Examination of Witnesses

Mat Ilic, William Moy, Professor Lorna Woods MBE and William Perrin OBE gave evidence.

The Chair

We will now hear oral evidence from Mat Ilic, chief development officer at Catch22; William Moy, chief executive at Full Fact; and Professor Lorna Woods and William Perrin of the Carnegie UK Trust. Before calling the first Member, I remind all Members that questions should be limited to matters within the scope of the Bill and that we must stick to the timings in the programme order that the Committee agreed. For this session, we have until 12.15 pm. I call Alex Davies-Jones to begin the questioning.

Alex Davies-Jones

Q187 Good morning to our witnesses. Thank you for joining us today. One of the main criticisms of the Bill is that the vast majority of the detail will not be available until after the legislation is enacted, under secondary legislation and so on. Part of the problem is that we are having difficulty in differentiating the “legal but harmful” content. What impact does that have?

William Perrin: At Carnegie, we saw this problem coming some time ago, and we worked in the other place with Lord McNally on a private Member’s Bill—the Online Harms Reduction Regulator (Report) Bill—that, had it carried, would have required Ofcom to make a report on a wide range of risks and harms, to inform and fill in the gaps that you have described.

Mrs Maria Miller (Basingstoke) (Con)

On a point of order, Ms Rees. There is a gentleman taking photographs in the Gallery.

The Chair

There is no photography allowed here.

William Perrin: Unfortunately, that Bill did not pass and the Government did not quite take the hint that it might be good to do some prep work with Ofcom to provide some early analysis to fill in holes in a framework Bill. The Government have also chosen in the framework not to bring forward draft statutory instruments or to give indications of their thinking in a number of key areas of the Bill, particularly priority harms to adults and the two different types of harms to children. That creates uncertainty for companies and for victims, and it makes the Bill rather hard to scrutinise.

I thought it was promising that the Government brought forward a list of priority offences in schedule 7—I think that is where it is; I get these things mixed up, despite spending hours reading the thing. That was helpful to some extent, but the burden is on the Government to reduce complexity by filling in some of the blanks. It may well be better to table an amendment to bring some of these things into new schedules, as we at Carnegie have suggested—a schedule 7A for priority harms to adults, perhaps, and a 7B and 7C for children and so on—and then start to fill in some of the blanks in the regime, particularly to reassure victims.

Alex Davies-Jones

Thank you. Does anybody else want to comment?

William Moy: There is also a point of principle about whether these decisions should be made by Government later or through open, democratic, transparent decision making in Parliament.

Alex Davies-Jones

Q That brings me on to my next point, William, relating to concerns about the powers that the Bill gives to the Secretary of State and about the independence of the regulator and the impact that could have. Do you have any comments on that?

William Moy: Sure. I should point out—we will need to get to this later—the fact that the Bill is not seriously trying to address misinformation and disinformation at this point, but in that context, we all know that there will be another information incident that will have a major effect on the public. We have lived through the pandemic, when information quality has been a matter of life and death; we are living through information warfare in the context of Ukraine, and more will come. The only response to that in the Bill is in clause 146, which gives the Secretary of State power to direct Ofcom to use relatively weak media literacy duties to respond.

We think that in an open society there should be an open mechanism for responding to information incidents—outbreaks of misinformation and disinformation that affect people’s lives. That should be set out in the roles of the regulator, the Government and internet companies, so that there is a framework that the public understand and that is open, democratic and transparent in declaring a misinformation and disinformation incident, creating proportionate responses to it, and monitoring the effects of those responses and how the incident is managed. At the moment, it largely happens behind closed doors and it involves a huge amount of restricting what people can see and share online. That is not a healthy approach in an open society.

William Perrin: I should add that as recently as April this year, the Government signed up to a recommendation of the Council of Ministers of the Council of Europe on principles for media and communication governance, which said that

“media and communication governance should be independent and impartial to avoid undue influence…discriminatory treatment and preferential treatment of powerful groups, including those with significant political or economic power.”

That is great. That is what the UK has done for 50 to 60 years in media regulation, where there are very few powers for the Secretary of State or even Parliament to get involved in the day-to-day working of communications regulators. Similarly, we have had independent regulation of cinema by the industry since 1913 and regulation of advertising independent of Government, and those systems have worked extremely well. However, this regime—which, I stress, Carnegie supports—goes a little too far in introducing a range of powers for the Secretary of State to interfere with Ofcom’s day-to-day doing of its business.

Clause 40 is particularly egregious, in that it gives the Secretary of State powers of direction over Ofcom’s codes of practice and, very strangely, introduces an almost infinite ability for the Government to keep rejecting Ofcom’s advice—presumably, until they are happy with the advice they get. That is a little odd, because Ofcom has a long track record as an independent, evidence-based regulator, and as Ofcom hinted in a terribly polite way when it gave evidence to this Committee, some of these powers may go a little too far. Similarly, in clause 147, the Secretary of State can give tactical guidance to Ofcom on its exercise of its powers. Ofcom may ignore that advice, but it is against convention that the Secretary of State can give that advice at all. The Secretary of State should be able to give strategic guidance to Ofcom roughly one or one and a half times per Parliament to indicate its priorities. That is absolutely fine, and is in accordance with convention in western Europe and most democracies, but the ability to give detailed guidance is rather odd.

Then, as Mr Moy has mentioned, clause 146, “Directions in special circumstances”, is a very unusual power. The Secretary of State can direct Ofcom to direct companies to make notices about things and can direct particular companies to do things without a particularly high threshold. There just have to be “reasonable grounds to believe”. There is no urgency threshold, nor is there a strong national security threshold in there, or anyone from whom the Secretary of State has to take advice in forming that judgment. That is something that we think can easily be amended down.

Alex Davies-Jones

Q Thank you. Mr Moy, you brought up the issue of misinformation and disinformation being removed from the scope of the Bill. Can you expand on your thoughts on that point?

William Moy: Absolutely. It is an extraordinary decision in a context where we are just coming through the pandemic, where information quality was such a universal concern, and we are in an information war, with the heightened risk of attempts to interfere in future elections and other misinformation and disinformation risks. It is also extraordinary because of the Minister’s excellent and thoughtful Times article, in which he pointed out that at the moment, tech companies censor legal social media posts at vast scale, and this Bill does nothing to stop that. In fact, the Government have actively asked internet companies to do that censorship—they have told them to do so. I see the Minister looking surprised, so let me quote from BBC News on 5 April 2020:

“The culture secretary is to order social media companies to be more aggressive in their response to conspiracy theories linking 5G networks to the coronavirus pandemic.”

In that meeting, essentially, the internet companies were asked to make sure they were taking down that kind of content from their services. Now, in the context of a Bill where, I think, the Minister and I completely agree about our goal—tackling misinformation in an open society—there is an opportunity for this Bill to be an example to the free world of how open societies respond to misinformation, and a beacon for the authoritarian world as well.

This is the way to do that. First, set out that the Bill must cover misinformation and disinformation. We cannot leave it to internet companies, with their political incentives, their commercial convenience and their censoring instincts, to do what they like. The Bill must cover misinformation and set out an open society response to it. Secondly, we must recognise that the open society response is about empowering people. The draft Bill had a recognition that we need to modernise the media literacy framework, but we do not have that in this Bill, which is really regrettable. It would be a relatively easy improvement to create a modern, harms and safety-based media literacy framework in this Bill, empowering users to make their own decisions with good information.

Then, the Bill would need to deal with three main threats to freedom of expression that threaten the good information in our landscape. Full Fact as a charity exists to promote informed and improved public debate, and in the long run we do that by protecting freedom of expression. Those three main threats are artificial intelligence, the internet companies and our own Government, and there are three responses to them. First, we must recognise that the artificial intelligence that internet companies use is highly error-prone, and it is a safety-critical technology. Content moderation affects what we can all see and share; it affects our democracy, it affects our health, and it is safety-critical. In every other safety-critical industry, that kind of technology would be subject to independent third-party open testing. Cars are crashed against walls, water samples are taken and tested, even sofas are sat on thousands of times to check they are safe, but internet companies are subject to no third-party independent open scrutiny. The Bill must change that, and the crash test dummy test is the one I would urge Members to apply.

The second big threat, as I said, is the internet companies themselves, which too often reach for content restrictions rather than free speech-based and information-based interventions. There are lots of things you can do to tackle misinformation in a content-neutral way—creating friction in sharing, asking people to read a post before they share it—or you can tackle misinformation by giving people information, rather than restricting what they can do; fact-checking is an example of that. The Bill should say that we prefer content-neutral and free speech-based interventions to tackle misinformation to content-restricting ones. At the moment the Bill does not touch that, and thus leaves the existing system of censorship, which the Minister has warned about, in place. That is a real risk to our open society.

The final risk to freedom of expression, and therefore to tackling misinformation, is the Government themselves. I have just read you an example of a Government bringing in internet companies to order them around by designating their terms and conditions and saying certain content is unacceptable. That content then starts to get automatically filtered out, and people are stopped from seeing it and sharing it online. That is a real risk. Apart from the fact that they press-released it, that is happening behind closed doors. Is that acceptable in an open democratic society, or do we think there should be a legal framework governing when Governments can seek to put pressure on internet companies to affect what we can all see and share? I think that should be governed by a clear legislative framework that sets out if those functions need to exist, what they are and what their parameters are. That is just what we would expect for any similarly sensitive function that Government carry out.

The Chair

Thank you. I am going to bring Maria Miller in now.

--- Later in debate ---
Alex Davies-Jones

Q Good morning, witnesses. Thank you for joining us today. Does the Bill give Ofcom discretion to regulate on the smaller but high-risk platforms?

Danny Stone: First, thank you for having me today. We have made various representations about the problems that we think there are with small, high-harm platforms. The Bill creates various categories, and the toughest risk mitigation is on the larger services. They are defined by their size and functionality. Of course, if I am determined to create a platform that will spread harm, I may look at the size threshold that is set and make a platform that falls just below it, in order to spread harm.

It is probably important to set out what this looks like. The Community Security Trust, which is an excellent organisation that researches antisemitism and produces incident figures, released a report called “Hate Fuel” in June 2020. It looked at the various small platforms and highlighted that, in the wake of the Pittsburgh antisemitic murders, there had been 26 threads, I think, with explicit calls for Jews to be killed. One month prior to that, in May 2020, a man called Payton Gendron found footage of the Christchurch attacks. Among this was legal but harmful content, which included the “great replacement” theory, GIFs and memes, and he went on a two-year journey of incitement. A week or so ago, he targeted and killed 10 people in Buffalo. One of the things that he posted was:

“Every Time I think maybe I shouldn’t commit to an attack I spend 5 min of /pol/”—

which is a thread on the small 4chan platform—

“then my motivation returns”.

That is the kind of material that we are seeing: legal but harmful material that is inspiring people to go out and create real-world harm. At the moment, the small platforms do not have that additional regulatory burden. These are public-facing message boards, and this is freely available content that is promoted to users. The risks of engaging with such content are highest. There is no real obligation, and there are no consequences. It is the most available extremism, and it is the least regulated in respect of the Bill. I know that Members have raised this issue and the Minister has indicated that the Government are looking at it, but I would urge that something is done to ensure that it is properly captured in the Bill, because the consequences are too high if it is not.

Alex Davies-Jones

Q Thanks, Danny. So, in your opinion, you would rather see a risk-based approach than one based on size and functionality.

Danny Stone: I think there are various options. Either you go for a risk-based approach—categorisation—or you could potentially amend it so that it is not just size and functionality. You would take into account other things—for example, characteristics are already defined in the Bill, and that might be an option for doing it.

Alex Davies-Jones

Q Does anybody else want to come in on small platforms? Liron?

Liron Velleman: From the perspective of HOPE not hate, most of our work targeting and looking at far-right groups is spent on some of those smaller platforms. I think that the original intention of the Bill, when it was first written, may have been a more sensible way of looking at the social media ecospace: larger platforms could host some of this content, while other platforms were just functionally not ready to host large, international far-right groups. That has changed radically, especially during the pandemic.

Now, there are so many smaller platforms—whether small means hundreds of thousands, tens of thousands or even smaller than that—that are almost as easy to use as some of the larger platforms we all know so well. Some of the content on those smaller platforms is definitely the most extreme. There are mechanisms utilised by the far-right—not just in the UK, but around the world—to move that content and move people from some of the larger platforms, where they can recruit, on to the smaller platforms. To have a situation in which that harmful content is not looked at as stringently as content on the larger platforms is a miscategorisation of the internet.

Alex Davies-Jones

Q One of our concerns with the Bill, which we raised with the regulator, Ofcom, in Tuesday’s evidence session, is what would happen in the interim if one of those smaller categorised platforms was to grow substantially and then need to be recategorised. Our concern is about what would happen in the interim, during the recategorisation process, while that platform was allowed to disseminate harmful content. What would you like to see happen as an interim measure during recategorisation, if that provision remained in the Bill?

Liron Velleman: We have seen this similarly with the proscription of far-right terrorist groups in other legislation. It was originally quite easy to say that, eventually, the Government would proscribe National Action as a far-right terror group. What has happened since is that aliases and very similar organisations are set up, and it then takes months or sometimes years for the Government to be able to proscribe those organisations. We have to spend our time making the case as to why those groups should be banned.

We can foresee a similar circumstance here. We turn around and say, “Here is BitChute” or hundreds of other platforms that should be banned. We spend six months saying to the Government that it needs to be banned. Eventually, it is, but then almost immediately an offshoot starts. We think that Ofcom should have delegated power to make sure that it is able to bring those platforms into category 1 almost immediately, if the categorisations stay as they are.

Danny Stone: It could serve a notice and ensure that platforms prepare for that. There will, understandably, be a number of small platforms that are wary and do not want to be brought into that category, but some of them will need to be brought in because of the risk of harm. Let us be clear: a lot of this content may well—probably will—stay on the platform, but, at the very least, they will be forced to risk assess for it. They will be forced to apply their terms and conditions consistently. It is a step better than what they will be doing without it. Serving a notice to try to bring them into that regime as quickly as possible and ensure that they are preparing measures to comply with category 1 obligations would be helpful.

Alex Davies-Jones

Q Thank you. The Antisemitism Policy Trust has made the case that search services should be eligible for inclusion as a high-risk category. Is that still your position? What is the danger, currently, of excluding them from that provision?

Danny Stone: Very much so. You heard earlier about the problems with advertising. I recognise that search services are not the same as user-to-user services, so there does need to be some different thinking. However, at present, they are not required to address legal harms, and the harms are there.

I appeared before the Joint Committee on the draft Bill and talked about Microsoft Bing, which, in its search bar, was prompting people with “Jews are” and then a rude word. You look at “Gays are”, today, and it is prompting people with “Gays are using windmills to waft homosexual mists into your home”. That is from the search bar. The first return is a harmful article. Do the same in Google, for what it’s worth, and you get “10 anti-gay myths debunked.” They have seen this stuff. I have talked to them about it. They are not doing the work to try to address it.

Last night, using Amazon Alexa, I searched “Is George Soros evil?”, and the response was, “Yes, he is. According to an Alexa Answers contributor, every corrupt political event.” “Are the White Helmets fake?” “Yes, they are set up by an ex-intelligence officer.” The problem with that is that the search prompts—the things that you are being directed to; the systems here—are problematic, because one person could give an answer to Amazon and that prompts the response. The second one, about the White Helmets, was a comment on a website that led Alexa to give that answer.

Search returns are not necessarily covered because, as I say, they are not the responsibility of the internet companies, but the systems that they design as to how those things are indexed and the systems to prevent them going to harmful sites by default are their responsibility, and at present the Bill does not address that. Something that forces those search companies to have appropriate risk assessments in place for the priority harms that Parliament sets, and to enforce those terms and conditions consistently, would be very wise.

Kim Leadbeater

Q Thank you to the witnesses for joining us today. The Bill contains duties to protect content of “democratic importance” and “journalistic content”. What is your view of these measures and their likely effectiveness?

Liron Velleman: These are both pretty dangerous clauses. We are very concerned about what I would probably be kind and call their unintended consequences. They are loopholes that could allow some of the most harmful and hateful actors to spread harm on social media. I will take “journalistic” first and then move on to “democratic”.

A number of companies mentioned in the previous evidence session are outlets that could be media publications just by adding a complaints system to their website. There is a far-right outlet called Urban Scoop that is run by Tommy Robinson. They just need to add a complaints system to their website and then they would be included as a journalist. There are a number of citizen journalists who specifically go to our borders to harass people who are seeking refuge in this country. They call themselves journalists; Tommy Robinson himself calls himself a journalist. These people have been specifically taken off platforms because they have repeatedly broken the terms of service of those platforms, and we see this as a potential avenue for them to make the case that they should return.

We also see mainstream publications falling foul of the terms of service of social media companies. If I take the example of the Christchurch massacre, social media companies spent a lot of time trying to take down both the livestream of the attack in New Zealand and the manifesto of the terrorist, but the manifesto was then put on the Daily Mail website—you could download the manifesto straight from the Daily Mail website—and the livestream was on the Daily Mirror and The Sun’s websites. We would be in a situation where social media companies could take that down from anyone else, but they would not be able to take it down from those news media organisations. I do not see why we should allow harmful content to exist on the platform just because it comes from a journalist.

On “democratic”, it is still pretty unclear what the definition of democratic speech is within the Bill. If we take it to be pretty narrow and just talk about elected officials and candidates, we know that far-right organisations that have been de-platformed from social media companies for repeatedly breaking the terms of service—groups such as Britain First and, again, Tommy Robinson—are registered with the Electoral Commission. Britain First ran candidates in the local elections in 2022 and they are running in the Wakefield by-election, so, by any measure, they are potentially of “democratic importance”, but I do not see why they should be allowed to break terms of service just because they happen to have candidates in elections.

If we take it on a wider scale and say that it is anything of “democratic importance”, anyone who is looking to cause harm could say, “A live political issue is hatred of the Muslim community.” Someone could argue that that or the political debate around the trans community in the UK is a live political debate, and that would allow anyone to go on the platform and say, “I’ve got 60 users and I’ve got something to say on this live political issue, and therefore I should be on the platform,” in order to cause that harm. To us, that is unacceptable and should be removed from the Bill. We do not want a two-tier internet where some people have the right to be racist online, so we think those two clauses should be removed.

Stephen Kinsella: At Clean up the Internet this is not our focus, although the proposals we have made, which we have been very pleased to see taken up in the Bill, will certainly introduce friction. We keep coming back to friction being one of the solutions. I am not wearing this hat today, but I am on the board of Hacked Off, and if Hacked Off were here, I think they would say that the solution—although not a perfect solution—might be to say that a journalist, or a journalistic outlet, will be one that has subjected itself to proper press regulation by a recognised press regulator. We could then possibly take quite a lot of this out of the scope of social media regulation and leave it where I think it might belong, with proper, responsible press regulation. That would, though, lead on to a different conversation about whether we have independent press regulation at the moment.

--- Later in debate ---
Chris Philp

Q Thank you again to the witnesses for joining us this morning. I will start with Stephen Kinsella. You have spoken already about some of the issues to do with anonymity. Can you share with the Committee your view on the amendments made to the Bill, when it was introduced a couple of months ago, to give users choices over self-verification and the content they see? Do you think they are useful and helpful updates to the Bill?

Stephen Kinsella: Yes. We think they are extremely helpful. We welcome what we see in clause 14 and clause 57. There is thus a very clear right to be verified, and an ability to screen out interactions with unverified accounts, which is precisely what we asked for. The Committee will be aware that we have put forward some further proposals. I would really hesitate to describe them as amendments; I see them as shading-in areas—we are not trying to add anything. We think that it would be helpful, for instance, when someone is entitled to be verified, that verification status should also be visible to other users. We think that should be implicit, because it is meant to act as a signal to others as to whether someone is verified. We hope that would be visible, and we have suggested the addition of just a few words into clause 14 on that.

We think that the Bill would benefit from a further definition of what it means by “user identity verification”. We have put forward a proposal on that. It is such an important term that I think it would be helpful to have it as a defined term in clause 189. Finally, we have suggested a little bit more precision on the things that Ofcom should take into account when dealing with platforms. I have been a regulatory lawyer for nearly 40 years, and I know that regulators often benefit from having that sort of clarity. There is going to be negotiation between Ofcom and the platforms. If Ofcom can refer to a more detailed list of the factors it is supposed to take into account, I think that will speed the process up.

One of the reasons we particularly welcomed the structure of the Bill is that there is no wait for detailed codes of conduct because these are duties that we will be executing immediately. I hope Ofcom is working on the guidance already, but the guidance could come out pretty quickly. Then there would be the process of—maybe negotiating is the wrong word—to-and-fro with the platforms. I would be very reluctant to take too much on trust. I do not mean on trust from the Government; I mean on trust from the platforms—I saw the Minister look up quickly then. We have confidence in Government; it is the platforms we are a little bit wary of. I heard the frustration expressed on Tuesday.

Alex Davies-Jones

indicated assent.

Stephen Kinsella: I think you said, “If platforms care about the users, why aren’t they already implementing this?” Another Member, who is not here today, said, “Why do they have to be brought kicking and screaming?” Yet, every time platforms were asked, we heard them say, “We will have to wait until we see the detail of—”, and then they would fill in whatever thing is likely to come last in the process. So we welcome the approach. Our suggestions are very modest and we are very happy to discuss them with you.

--- Later in debate ---
Chris Philp

Good. Thank you. I hope the Committee is reassured by those comments on the freedom of speech question.

Alex Davies-Jones

Q I will use the small amount of time we have left to ask one question. A number of other stakeholders and witnesses have expressed concerns regarding the removal of a digital media literacy strategy from the Bill. What role do you see a digital media literacy strategy playing in preventing the kind of abuse that you have been describing?

Danny Stone: I think that a media literacy strategy is really important. There is, for example, UCL data on the lack of knowledge of the word “antisemitism”: 68% of nearly 8,000 students were unfamiliar with the term’s meaning. Dr Tom Harrison has discussed cultivating cyber-phronesis—this was also in an article by Nicky Morgan in the “Red Box” column some time ago—which is a method of building practical knowledge over time to make the right decisions when presented with a moral challenge. We are not well geared up as a society—I am looking at my own kids—to educate young people about their interactions, about what it means when they are online in front of that box and about to type something, and about what might be received back. I have talked about some of the harms people might be directed to, even through Alexa, but some kind of wider strategy, which goes beyond what is already there from Ofcom—during the Joint Committee process, the Government said that Ofcom already has its media literacy requirements—and which, as you heard earlier, updates it to make it more fit for purpose for the modern age, would be very appropriate.

Stephen Kinsella: I echo that. We also think that that would be welcome. When we talk about media literacy, we often find ourselves with the platforms throwing all the obligation back on to the users. Frankly, that is one of the reasons why we put forward our proposal, because we think that verification is quite a strong signal. It can tell you quite a lot about how likely it is that what you are seeing or reading is going to be true if someone is willing to put their name to it. Seeing verification is just one contribution. We are really talking about trying to build or rebuild trust online, because that is what is seriously lacking. That is a system and design failure in the way that these platforms have been built and allowed to operate.

Chris Philp

Q The shadow Minister’s question is related to the removal of what was clause 103 in the old draft of the Bill. As she said, that related to media literacy. Does the panel draw any comfort from three facts? First, there is already a media literacy duty on Ofcom under section 11 of the Communications Act 2003—the now deleted clause 103 simply provided clarification on an existing duty. Secondly, last December, after the Joint Committee’s deliberations, but before the updated Bill was published, Ofcom published its own updated approach to online media literacy, which laid out the fact that it was going to expand its media literacy programme beyond what used to be in the former clause 103. Finally, the Government also have their own media literacy strategy, which is being funded and rolled out. Do those three things—including, critically, Ofcom’s own updated guidance last December—give the panel comfort and confidence that media literacy is being well addressed?

Liron Velleman: If the Bill is seeking to make the UK the safest place to be on the internet, it seems to be the obvious place to put in something about media literacy. I completely agree with what Danny said earlier: we would also want to specifically ensure—although I am sure this already exists in some other parts of Ofcom and Government business—that there is much greater media literacy for adults as well as children. There are lots of conversations about how children understand use of the internet, but what we have seen, especially during the pandemic, is the proliferation of things like community Facebook groups, which used to be about bins and a fair that is going on this weekend, becoming about the worst excesses of harmful content. People have seen conspiracy theories, and that is where we have seen some of the big changes to how the far-right and other hateful groups operate, in terms of being able to use some of those platforms. That is because of a lack of media literacy not just among children, but among the adult population. I definitely would encourage that being in the Bill, as well as anywhere else, so that we can remove some of those harms.

Danny Stone: I think it will need further funding, beyond what has already been announced. That might put a smile on the faces of some Department for Education officials, who looked so sad during some of the consultation process—trying to ensure that there is proper funding. If you are going to roll this out across the country and make it fit for purpose, it is going to cost a lot of money.