Kirsty Blackman

Q My last question is about future-proofing the Bill. Obviously, an awful lot of things will happen in the online world that do not currently happen there, and some of those we cannot foresee. Do you think the Bill is wide enough and flexible enough to allow changes to be made so that new and emerging platforms can be regulated?

Kevin Bakhurst: Overall, we feel that it is. By and large, the balance between certainty and flexibility in the Bill is probably about right and will allow some flexibility in future, but it is very hard to predict what other harms may emerge. We will remain as flexible as possible.

Richard Wronka: There are some really important updating tools in the Bill. The ability for the Secretary of State to introduce new priority harms or offences—with the approval of Parliament, of course—is really important.

Kim Leadbeater (Batley and Spen) (Lab)

Q Ofcom is required to produce certain codes, for example on terrorism, but others that were floated in the Green Paper are no longer in the Bill. Are you working on such codes, for example on hate crime and wider harm, and if not, what happens in the meantime? I guess that links to my concerns about the democratic importance and journalistic content provisions in the Bill, to which you have alluded. They are very vague protections and I am concerned that they could be exploited by extremists who suddenly want to identify as a journalist or a political candidate. Could you say a little about the codes and about those two particular clauses and what more you think we could do to help you with those?

Richard Wronka: I will cover the codes first. You are absolutely right that the Bill requires Ofcom to publish codes of practice, particularly on CSEA and on terror, as well as on fraudulent advertising and other areas. We are doing the work right now so that we are ready to progress with that process as soon as we get powers and duties, because it is really important that we are ready to move as quickly as possible. We will set out further detail on exactly how we plan to do that in a roadmap document that we are looking to publish before the summer break, so that will provide some of the detail.

A really important point here is that the Bill quite rightly covers a wide set of harms. We are mindful that, tempting as it might be, a code that covers every single harm could be counterproductive and confusing for platforms, even for those that want to comply and do the right thing. One of the balancing acts for us as we produce that code framework will be to get the right coverage of all the issues that everyone is rightly concerned about, but to do that in a way that is streamlined and efficient, so that services can apply the provisions of those codes.

Kevin Bakhurst: Shall I pick up on the second bit very quickly? I think you are right; this is one of our central concerns about the definitions. As far as possible, this should be a matter for Parliament. It is really important that Parliament has a view on this. Ultimately, the regulator will take a view based on what Parliament says. We have some experience in this area, but as Richard said, we recognise the challenge—it is extremely complex. We can see the policy intent of doing it, quite rightly, and the importance of enshrining freedom of expression as far as possible, but Parliament can help to add clarity and, as you rightly say, be aware of some of the potential loopholes. At the moment, someone could describe themselves as a citizen journalist; where does that leave us? I am not quite sure. Parliament could help to clarify that, and we would be grateful.

Dean Russell

Q Do the powers in the Bill cover enough to ensure that people will not be sent flashing images if they have photosensitive epilepsy?

Richard Wronka: This picks up the point we discussed earlier, which is that I understand that the Government are considering proposals from the Law Commission to criminalise the sending of those kinds of images. It would not be covered by the illegal content duties as things stand, but if the Government conclude that it is right to criminalise those issues, it would automatically be picked up by the Bill.

Even so, the regime is not, on the whole, going to be able to pick up every instance of harm. It is about making sure that platforms have the right systems and processes. Where there is clear harm to individuals, we would expect those processes to be robust. We know there is work going on in the industry on that particular issue to try and drive forward those processes.

--- Later in debate ---
The Chair

I will bring in Kim Leadbeater and then Maria Miller and Kirsty Blackman, but I will definitely bring in the Minister at 10.45 am.

Kim Leadbeater

Q Thank you, Ms Rees, and thank you to the witnesses. Many websites host pornography without necessarily being pornographic websites, meaning that children can easily stumble across it. Does the Bill do enough to tackle pornography when it is hosted on mainstream websites?

Dame Rachel de Souza: I have argued hard to get pornographic sites brought into the Bill. That is something very positive about the Bill, and I was really pleased to see that. Why? I have surveyed more than half a million children in my Big Ask survey and spoken recently to 2,000 children specifically about this issue. They are seeing pornography, mainly on social media sites—Twitter and other sites. We know the negative effects of that, and it is a major concern.

I am pleased to see that age assurance is in the Bill. We need to challenge the social media companies—I pull them together and meet them every six months—on getting this stuff off their sites and making sure that under-age children are not on their sites seeing some of these things. You cannot go hard enough in challenging the social media companies to get pornography off their sites and away from children.

Andy Burrows: Just to add to that, I would absolutely echo that we are delighted that part 5 of the Bill, with measures around commercial pornography, has been introduced. One of our outstanding areas of concern, which applies to pornography but also more broadly, is around clause 26, the children’s access assessment, where the child safety duties will apply not to all services but to services where there is a significant number of child users or children comprise a significant part of the user base. That would seem to open the door to some small and also problematic services being out of scope. We have expressed concerns previously about whether OnlyFans, for example, which is a very significant problem as a user-generated site with adult content, could be out of scope. Those are concerns that I know the Digital, Culture, Media and Sport Committee has recognised as well. We would very much like to see clause 26 removed from the Bill, which would ensure that we have a really comprehensive package in this legislation that tackles both commercial pornography and user-generated material.

The Chair

I think Lynn Perry is back. Are you with us, Lynn? [Interruption.] No—okay. We will move on to Maria Miller.

--- Later in debate ---
Kirsty Blackman

Q Just one more question. We know that women and minorities face more abuse online than men do. Is that something that you have found in your experience, particularly Twitter? What are you doing to ensure that the intersectionality of harms is considered in the work that you are doing to either remove or downgrade content?

Katy Minshall: That is absolutely the case and it has been documented by numerous organisations and research. Social media mirrors society and society has the problems you have just described. In terms of how we ensure intersectionality in our policies and approaches, we are guided by our trust and safety council, which is a network of dozens of organisations around the world, 10 of which are here in the UK, and which represents different communities and different online harms issues. Alongside our research and engagement, the council ensures that when it comes to specific policies, we are constantly considering a range of viewpoints as we develop our safety solutions.

Kim Leadbeater

Q Thank you, Chair, and thank you to the witnesses. I share your concerns about the lack of clarity regarding the journalistic content and democratic content exemptions. Do you think those exemptions should be removed entirely, or can you suggest what we might do to make them clearer in the Bill?

Katy Minshall: At the very least, there must be tighter definitions. I am especially concerned when it comes to the news publisher exemption. The Secretary of State has indicated an amendment that would mean that services like Twitter would have to leave such content up while an appeals process is ongoing. There is no timeline given. The definition in the Bill of a news publisher is, again, fairly vague. If Ben and I were to set up a news website, nominally have some standards and an email address where people could send complaints, that would enable it to be considered a news publisher under the Bill. If we think about some of the accounts that have been suspended from social media over the years, you can absolutely see them creating a news website and saying, “I have a case to come back on,” to Twitter or TikTok or wherever it may be.

Ben Bradley: We share those concerns. There are already duties to protect freedom of expression in clause 19. Those are welcome. It is the breadth of the definition of journalistic and democratic content that is a concern for us, particularly when it comes to things like the expedited and dedicated appeals mechanism, which those people would be able to claim if their content was removed. We have already seen people like Tommy Robinson on the far right present themselves as journalists or citizen journalists. Giving them access to a dedicated and expedited appeals mechanism is an area of concern.

There are different ways you could address that, such as greater clarity in those definitions and removing subjective elements. At the minute, it is whether or not a user considers their content to be journalistic; it is not an objective criterion but depends on their own belief about their content.

Also, if you look at something like the dedicated and expedited appeals mechanism, could you hold that in reserve so that if a platform were found to be failing in its duties to journalistic content or in its freedom of expression duties, Ofcom could say, as it can in other areas of the Bill, “Okay, we believe that you need to create this dedicated mechanism, because you have failed to protect those duties”? That would, I think, minimise the risk of exploitation of that mechanism.

Kim Leadbeater

That is really helpful, thank you. A quick question—

The Chair

I am sorry, I have to interrupt because of time. Maria Miller.