The Chair

We are now sitting in public again, and the proceedings are being broadcast. Before we start hearing from the witnesses, do any Members wish to make declarations of interest in connection with the Bill?

Alex Davies-Jones (Pontypridd) (Lab)

The witness on Thursday’s sitting, Danny Stone from the Antisemitism Policy Trust, is an informal secretariat in a personal capacity to the all-party parliamentary group on wrestling, which I co-chair.

The Chair

That is noted.

--- Later in debate ---
The Chair

I will open up to the floor for questions now. I call Alex Davies-Jones.

Alex Davies-Jones

Q Good morning, both, and welcome to the Committee. The Bill as it stands places responsibility on Ofcom to regulate the 25,000 tech companies and the tens—if not hundreds—of thousands of websites within the UK. How does that look in practice? What technical and administrative capacity do you have to carry that function out, realistically?

Kevin Bakhurst: We should say that we feel the Bill has given us a very good framework to regulate online safety. We have been working closely with the Department for Digital, Culture, Media and Sport to make sure that the Bill gives us a practical, deliverable framework. There is no doubt that it is a challenge. As you rightly say, there will be potentially 25,000 platforms in scope, but we feel that the Bill sets out a series of priorities really clearly in terms of categories.

It is also for us to set out—we will be saying more about this in the next couple of months—how we will approach this, and how we will prioritise certain platforms and types of risk. It is important to say that the only way of achieving online safety is through what the Bill sets out, which is to look at the systems in place at the platforms, and not the individual pieces of content on them, which would be unmanageable.

Alex Davies-Jones

Q Thank you, Kevin. You mentioned the categorisation of platforms. A number of stakeholders, including the platforms themselves and charities, have quite rightly raised some serious concerns around the categorisation of platforms. Would you, the regulator, prefer a risk-based approach, or the categorisation as it stands within the Bill?

Richard Wronka: We completely recognise the concerns that have been raised by stakeholders, and we have been speaking to many of them ourselves, so we have first-hand experience. I think my starting point is that the Bill captures those high-risk services, which is a really important feature of it. In particular, responsibilities around illegal content apply across all services in scope. That means that, in practice, when we are regulating, we will take a risk-based approach to whom we choose to engage with, and to where we focus our effort and attention.

We recognise that some of the debate has been about the categorisation process, which is intended to pick up high-risk and high-reach services. We understand the logic behind that. Indeed, I think we would have some concerns about the workability of an approach that was purely risk-based in its categorisation. We need an approach that we can put into operation. Currently, the Bill focuses on the reach of services and their functionality. We would have some concerns about a purely risk-based approach in terms of whether it was something that we could put into practice, given the number of services in scope.

Alex Davies-Jones

Q May I bring you back to putting this into practice, and to the recategorisation of platform and practice? If a category 2B platform as it stands in the Bill grows exponentially in size, and is spreading disinformation and incredibly harmful content quite quickly, how quickly would you be able to react as a regulator to recategorise that platform and bring it into scope as a category 1 platform? How long would that process take, and what would happen in the interim?

Richard Wronka: At the moment, the category 2B service would have transparency reporting requirements. That would be helpful, because it would be one way that the nature of harmful content on that platform could be brought to our attention, and to the public’s attention. We would also be looking at approaches that we could use to monitor the whole scope of the services, to ensure that we had a good grip of who was growing quickest and where the areas of risk were. Some of that is through engaging with the platforms themselves and a whole range of stakeholders, and some of it is through more advanced data and analytical techniques—“supervision technology”, as it is known in the regulatory jargon.

On the specifics of your question, if a company was growing very quickly, the Bill gives us the ability to look at that company again, to ask it for information to support a categorisation decision, and to recategorise it if that is the right approach and if it has met the thresholds set out by the Secretary of State. One of the thresholds regards the number of users, so if a company has moved over that threshold, we look to act as quickly as possible while running a robust regulatory process.

Alex Davies-Jones

Q So while that process is under way, there is no mechanism for you to take action against the platform.

Kevin Bakhurst: May I answer this? We have some experience of this already in the video-sharing platform regime, which is much more limited in scope, and we are already regulating a number of platforms, ranging from some very big ones such as Twitch, TikTok and Snap, down to some much smaller platforms that have caused us some concerns. We think we have the tools, but part of our approach will also be to focus on high-risk and high-impact content, even if it comes through small platforms. That is what we have already done with the video-sharing platform regime. We have to be agile enough to capture that and to move resources to it. We are doing that already with the video-sharing platform regime, even though we have only been regulating it for less than a year.

The Chair

Maria Miller has indicated that she would like to ask a question, so if I may, I will bring her in.

Mrs Maria Miller (Basingstoke) (Con)

Not immediately—go on, please.

Alex Davies-Jones

Q Thank you, Chair, and thank you, Maria.

I am just trying to get to the intricacies of this, and of what would happen during the time that it would take for you to recategorise. This platform, which is disseminating harm to both children and adults, would be allowed to carry on while the recategorisation process is under way. There is no mechanism in the Bill to stop that from happening.

Richard Wronka: A really important point here is that we will be regulating that platform from the outset for illegal content and, potentially, for how it protects children on its platform, irrespective of the categorisation approach. That is really important. We will be able to take action, and take action quickly, irrespective of how the platform is categorised. Categorisation really determines whether the adult “legal but harmful” provisions apply. That is the bit that really matters in this context.

It is worth reminding ourselves what those provisions mean: they are more a transparency and accountability measure. Those platforms categorised as category 1 will need to have clear terms and conditions applied to adult “legal but harmful” content, and they will need to implement those consistently. We would expect the really serious and egregious concerns to be picked up by the “illegal” part of the regime, and the protection-of-children part of the regime. The categorisation process may go on. It may take a little time, but we will have tools to act in those situations.

Alex Davies-Jones

Q May I bring you on to the powers of the Secretary of State and the question of the regulator’s independence? The Bill will see the Secretary of State, whoever that may be, have a huge amount of personal direction over Ofcom. Do you have any other experience of being directed by a Secretary of State in this way, and what are the consequences of such an approach?

Kevin Bakhurst: We do have some experience across the various sectors that we regulate, but being directed by the Secretary of State does not happen very often. Specifically on the Bill, our strong feeling is that we think it entirely appropriate, and that the Secretary of State should be able to direct us on matters of national security and terrorist content. However, we have some concerns about the wider direction powers of the Secretary of State, and particularly the grounds on which the Secretary of State can direct public policy, and we have expressed those concerns previously.

We feel it is important that the independence of a regulator can be seen to be there and is there in practice. Legally, we feel it important that there is accountability. We have some experience of being taken to judicial review, and there must be accountability for the codes of practice that we put in place. We must be able to show why and how we have created those codes of practice, so that we can be accountable and there is absolute clarity between regulator and Government.

Mrs Miller

Q Thank you very much to the witnesses who have taken the time to be with us today. We are really grateful. You have already alluded to the fact that you have quite extensive experience in regulation, even in social media spaces. I think the Committee would be really interested in your view, based on your experience, about what is not in the Bill that should be.

Kevin Bakhurst: Richard has been leading this process, so he can give more detail on it, but suffice to say, we have been engaging closely with DCMS over the last year or so, and we appreciate the fact that it has taken on board a number of our concerns. What we felt we needed from the Bill was clarity as far as possible, and a balance between clarity and flexibility for this regime, which is a very fast-moving field. We feel, by and large, that the Bill has achieved that.

We still have concerns about one or two areas, to pick up on your question. We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of “illegal content” is really clear for platforms, and particularly the area of intent of illegality, which at the moment might be quite tricky for the platforms to pick up on.

Richard Wronka: I completely agree with Kevin that the Bill as it stands gives us a good framework. I think the pre-legislative scrutiny process has been really helpful in getting us there, and I point out that it is already quite a broad and complex regime. We welcome the introduction of issues such as fraudulent advertising and the regulation of commercial pornographic providers, but I think there is a point about ensuring that the Bill does not expand too much further, because that might raise some practical and operational issues for us.

I completely agree with Kevin that clarity in the Bill regarding illegal content and what constitutes that is really important. An additional area that requires clarity is around some of the complex definitions in the Bill, such as journalistic content and democratically important content. Those are inherently tricky issues, but any extra clarity that Parliament can provide in those areas would be welcome.

--- Later in debate ---
Barbara Keeley

Just quickly, do coroners have sufficient powers? Should they have more powers to access digital data after the death of a child?

Andy Burrows: We can see what a protracted process it has been. There have been improvements to the process. It is currently a very lengthy process because of the mutual legal assistance treaty arrangements—MLAT, as they are known—by which injunctions have to be sought to get data from US companies. It has taken determination from some coroners to pursue cases, very often going up against challenges. It is an area where we think the arrangements could certainly be streamlined and simplified. The balance here should shift toward giving parents and families access to the data, so that the process can be gone through quickly and everything can be done to ease the heartbreak for families having to go through those incredibly traumatic situations.

Alex Davies-Jones

Q Very briefly, Dame Rachel, I will build on what you were just saying, based on your experience as a headteacher. When I make my school visits, the teachers overwhelmingly tell me how, on a daily basis, they have to deal with the fallout from an issue that has happened online or on social media. On that matter, the digital media literacy strategy is being removed from the Bill. What is your thinking on that? How important do you see a digital media literacy strategy being at the heart of whatever policy the Government try to make regarding online safety for children?

Dame Rachel de Souza: There is no silver bullet. This is now a huge societal issue and I think that some of the things that I would want to say would be about ensuring that we have in our educational arsenal, if you like, a curriculum that has a really strong digital media literacy element. To that end, the Secretary of State for Education has just asked me to review how online harms and digital literacy are taught in schools—reviewing not the curriculum, but how good the teaching is and what children think about how the subject has been taught, and obviously what parents think, too.

I would absolutely like to see the tech companies putting some significant funding into supporting education of this kind; it is exactly the kind of thing that they should be working together to provide. So we need to look at this issue from many aspects, not least education.

Obviously, in a dream world I would like really good and strong digital media literacy in the Bill, but actually it is all our responsibility. I know from my conversations with Nadhim Zahawi that he is very keen that this subject is taught through the national curriculum, and very strongly.

Kirsty Blackman

Q I have a quick question on parental digital literacy. You mentioned the panel that you put together of 16 to 21-year-olds. Do you think that today’s parents have the experience, understanding, skills and tools to keep their children properly safe online? Even if they are pretty hands-on and want to do that, do you think that they have all the tools they need to be able to do that?

Dame Rachel de Souza: It is a massive concern to parents. Parents talk to me all the time about their worries: “Do we know enough?” They have that anxiety, especially as their children turn nine or 10; they are thinking, “I don’t even know what this world out there is.” I think that our conversations with 16 to 21-year-olds were really reassuring, and we have produced a pamphlet for parents. It has had a massive number of downloads, because parents absolutely want to be educated in this subject.

What did young people tell us? They told us, “Use the age controls; talk to us about how much time we are spending online; keep communication open; and talk to us.” Talk to children when they’re young, particularly boys, who are likely to be shown pornography for the first time, even if there are parental controls, around the age of nine or 10. So have age-appropriate conversations. There was some very good advice about online experiences, such as, “Don’t worry; you’re not an expert but you can talk to us.” I mean, I did not grow up with the internet, but I managed parenting relatively well—my son is 27 now. I think this is a constant concern for parents.

I do think that the tech companies could be doing so much more to assist parents in digital media literacy, and in supporting them in how to keep their child safe. We are doing it as the Office of the Children’s Commissioner. I know that we are all trying to do it, but we want to see everyone step up on this, particularly the tech companies, to support parents on this issue.

--- Later in debate ---
The Chair

We will now hear from Ben Bradley, government relations and public policy manager at TikTok, and Katy Minshall, head of UK public policy at Twitter. We have until 11.25 am for this panel of witnesses. Could the witnesses please introduce themselves for the record?

Ben Bradley: I am Ben Bradley. I am a public policy manager at TikTok, leading on the Bill from TikTok.

Katy Minshall: I am Katy Minshall. I am head of UK public policy for Twitter.

Alex Davies-Jones

Q Good morning, both. Thank you for joining us today. We have recently had it confirmed by the Minister in a written parliamentary question that NFTs—non-fungible tokens—will be included in the scope of the Bill. Concerns have been raised about how that will work in practice, and also in relation to GIFs, memes and other image-based content that is used on your platforms, Twitter specifically. Katy, how do you see that working in practice? Is the Bill workable in its current form to encapsulate all of that?

Katy Minshall: Thank you for inviting me here to talk about the Online Safety Bill. On whether the Bill is workable in its current form, on the one hand, we have long been supportive of an approach that looks at overall systems and processes, which I think would capture some of the emerging technologies that you are talking about. However, we certainly have questions about how aspects of the Bill would work in practice. To give you an example, one of the late additions to the Bill was about user verification requirements, which as I understand it means that all category 1 platforms will need to offer users the opportunity to verify themselves and, in turn, those verified users have the ability to turn off interaction from unverified users. Now, while we share the Government’s policy objective of giving users more control, we certainly have some workability questions.

Just to give you one example, let’s say this existed today, and Boris Johnson turned on the feature. In practice, that would mean one of two things. Either the feature is only applicable to users in the UK, meaning that people around the world—in France, Australia, Germany or wherever it may be—are unable to interact with Boris Johnson, and only people who are verified in the UK can reply to him, tweet at him and so on, or it means the opposite and anyone anywhere can interact with Boris Johnson except those people who have chosen not to verify their identity, perhaps even in his own constituency, who are therefore at a disadvantage in being able to engage with the Prime Minister. That is just one illustration of the sorts of workability questions we have about the Bill at present.

Alex Davies-Jones

Q You brought up the Prime Minister, so we’ll carry on down that route. One of the concerns about the Bill is the issue of protecting democratic importance. If there is an exemption for content of democratic importance, would your platforms be able to take that down?

Katy Minshall: I am sorry, do you mean—

Alex Davies-Jones

Q Would you be able to remove the content?

Katy Minshall: At present, what would be expected of companies in that scenario is not entirely clear in the Bill. There are certainly examples of content we have removed over the years for abuse and hateful conduct where the account owner that we suspended would have grounds to say, “Actually, this is content of democratic importance.” At the very least, it is worth pointing out that, in practice, it is likely to slow down our systems because we would have to build in extra steps to understand if a tweet or an account could be considered content of democratic importance, and we would therefore treat it differently.

Alex Davies-Jones

Q That brings me to my next question. Because what would be classed as content of democratic importance is so ambiguous, would your platforms even be able to detect it?

Katy Minshall: That is a really important question. At present, the Bill envisages that we would treat journalistic content differently from other types of content. I think the definition in the Bill—correct me if I get this wrong—is content for the purposes of journalism that is UK linked. That could cover huge swathes of the conversation on Twitter—links to blog posts, citizen journalists posting, front pages of news articles. The Bill envisages our having a system to separate that content from other content, and then treating that content differently. I struggle to understand how that would work in practice, especially when you layer on top the fact that so much of our enforcement is assisted by technology and algorithms. Most of the abusive content we take down is detected using algorithms; we suspend millions of spam accounts every day using automated systems. When you propose to layer something so ambiguous and complicated on top of that, it is worth considering how that might impact on the speed of enforcement across all of our platform.

Alex Davies-Jones

Q Thank you. Given the media carve-out and the journalism exemption in the Bill, how could you detect state actors that are quoting disinformation, or even misinformation?

Katy Minshall: At present, we label a number of accounts as Government actors or state-affiliated media and we take action on those accounts. We take down their tweets and in some cases we do not amplify their content because we have seen in current situations that some Governments are sharing harmful content. Again, I question the ambiguity in the Bill and how it would interact with our existing systems that are designed to ensure safety on Twitter.

Alex Davies-Jones

Q Thank you. Just one final question for Twitter. A query we raised with the Children’s Commissioner and the NSPCC is about pornography and children accessing it. A person needs to be 13 years old to join Twitter—to host a profile on the site—but you do host pornographic content; it is used mainly by sex workers to promote their trade. How does the proposed provision affect your business model of allowing those aged 13 and above to access your platform?

Katy Minshall: Until we see the full extent of the definitions and requirements, it is difficult to say exactly what approach we would take under the Bill. Regarding adult content, Twitter is not a service targeting a youth audience, and as you illustrate, we endeavour to give people the ability to express themselves as they see fit. That has to be balanced with the objective of preventing young people from inadvertently stumbling on such content.

Alex Davies-Jones

Q So you are not predominantly aimed at children? If you are an adult service, why is it that people aged 13 or above can access your platform?

Katy Minshall: We find that, in practice, the overwhelming majority of our user base are over the age of 18; both internal and external data show that. Of course young people can access Twitter. I think we have to be very careful that the Bill does not inadvertently lock children out of services they are entitled to use. I am sure we can all think of examples of people under the age of 18 who have used Twitter to campaign, for activism and to organise; there are examples of under-18s using Twitter to that effect. But as I say, predominantly we are not a service targeting a youth audience.

Alex Davies-Jones

Okay. Thank you, Chair.