Alex Davies-Jones

Q187 Good morning to our witnesses. Thank you for joining us today. One of the main criticisms of the Bill is that the vast majority of the detail will not be available until after the legislation is enacted, under secondary legislation and so on. Part of the problem is that we are having difficulty in differentiating the “legal but harmful” content. What impact does that have?

William Perrin: At Carnegie, we saw this problem coming some time ago, and we worked in the other place with Lord McNally on a private Member’s Bill—the Online Harms Reduction Regulator (Report) Bill—that, had it carried, would have required Ofcom to make a report on a wide range of risks and harms, to inform and fill in the gaps that you have described.

Mrs Maria Miller (Basingstoke) (Con)

On a point of order, Ms Rees. There is a gentleman taking photographs in the Gallery.

The Chair

There is no photography allowed here.

William Perrin: Unfortunately, that Bill did not pass and the Government did not quite take the hint that it might be good to do some prep work with Ofcom to provide some early analysis to fill in holes in a framework Bill. The Government have also chosen in the framework not to bring forward draft statutory instruments or to give indications of their thinking in a number of key areas of the Bill, particularly priority harms to adults and the two different types of harms to children. That creates uncertainty for companies and for victims, and it makes the Bill rather hard to scrutinise.

I thought it was promising that the Government brought forward a list of priority offences in schedule 7—I think that is where it is; I get these things mixed up, despite spending hours reading the thing. That was helpful to some extent, but the burden is on the Government to reduce complexity by filling in some of the blanks. It may well be better to table an amendment to bring some of these things into new schedules, as we at Carnegie have suggested—a schedule 7A for priority harms to adults, perhaps, and a 7B and 7C for children and so on—and then start to fill in some of the blanks in the regime, particularly to reassure victims.

--- Later in debate ---
The Chair

Thank you. I am going to bring Maria Miller in now.

Mrs Miller

Q This evidence session is underlining to me how complicated these issues are. I am really grateful for your expertise, because we are navigating through a lot of issues. With slight trepidation I open the conversation up into another area—the issue of protection for children. One of the key objectives of the legislation is to ensure a higher level of protection for children than for adults. In your view, does the Bill achieve that? I am particularly interested in your views on whether the risks of harm to children should be set out on the face of the Bill, and if so, what harms should be included. Can I bring Mat in here?

Mat Ilic: Thank you so much. The impact of social media in children’s lives has been a feature of our work since 2015, if not earlier; we have certainly researched it from that period. We found that it was a catalyst to serious youth violence and other harms. Increasingly, we are seeing it as a primary issue in lots of the child exploitation and missing cases that we deal with—in fact, in half of the cases we have seen in some of the areas that we work in, it featured as the primary reason rather than as a coincidental reason. The online harm is the starting point rather than a conduit.

In relation to the legislation, all our public statements on this have been informed by user research. I would say that is one of the central principles to think through in the primary legislation—a safety-by-design focus. We have previously called this the toy car principle, which means any content or product that is designed with children in mind needs to be tested in a way that is explicitly for children, as Mr Moy talked about. It needs to have some age-specific frameworks built in, but we also need to go further than that by thinking about how we might raise the floor, rather than necessarily trying to tackle explicit harms. Our point is that we need to remain focused on online safety for children and the drivers of online harm and not the content.

The question is, how can that be done? One way is the legal design requirement for safety, and how that might play out, as opposed to having guiding principles that companies might adopt. Another way is greater transparency on how companies make particular decisions, and that includes creating or taking off content that pertains to children. I want to underline the point about empowerment for children who have been exposed to or experience harm online, or offline as a result of online harm. That includes some kind of recourse to be able to bring forward cases where complaints, or other issues, were not taken seriously by the platforms.

If you read the terms and conditions of any given technology platform, which lots of young people do not do on signing up—I am sure lots of adults do not do that either—you realise that even with the current non-legislative frameworks that the companies deploy to self-regulate, there is not enough enforcement in the process. For example, if I experience some kind of abuse and complain, it might never be properly addressed. We would really chime on the enforcement of the regulatory environment; we would try to raise the floor rather than chase specific threats and harms with the legislation.

Mrs Miller

Q Can I bring Lorna in here? We are talking about moving from content to the drivers of harm. Where would you suggest that should be achieved within the Bill?

Professor Lorna Woods: I think by an overarching risk assessment rather than one that is broken down into the different types of content, because that, in a way, assumes a certain knowledge of the type of content before you can do a risk assessment, so you are into a certain circular mode there. Rather than prejudging types of content, I think it would be more helpful to look at what is there and what the system is doing. Then we could look at what a proportionate response would be—looking, as people have said, at the design and the features. Rather than waiting for content to be created and then trying to deal with it, we could look at more friction at an earlier stage.

If I may add a technical point, I think there is a gap relating to search engines. The draft Bill excluded paid-for content advertising. It seems that, for user-to-user content, this is now in the Bill, bringing it more into line with the current standards for children under the video-sharing platform provisions. That does not apply to search. Search engines have duties only in relation to search content, and search content excludes advertising. That means, as I read it, that search engines would have absolutely no duties to children under their child safety duty in relation to advertising content. You could, for example, target a child with pornography and it would fall outside the regime. I think that is a bit of a gap.

Kim Leadbeater (Batley and Spen) (Lab)

Q Thank you, witnesses, for your time this morning. I am going to focus initially on journalistic content. Is it fair that the platforms themselves are having to try to define what journalistic content is and, by default, what a journalist is? Do you see a way around this?

William Moy: No, no, yes. First, no, it is not fair to put that all on the platforms, particularly because—I think this is a crucial thing for the Committee across the Bill as a whole—for anything to be done at internet scale, it has to be able to be done by dumb robots. Whatever the internet companies tell you about the abilities of their technology, it is not magic, and it is highly error-prone. For this duty to be meaningful, it has to be essentially exercised in machine learning. That is really important to bear in mind. Therefore, being clear about what it is going to tackle in a way that can be operationalised is important.

To your second point, it is really important in this day and age to question whether journalistic content and journalists equate to one another. I think this has come up in a previous session. Nowadays, journalism, or what we used to think of as journalism, is done by all kinds of people. That includes the same function of scrutiny and informing others and so on. It is that function that we care about—the passing of information between people in a democracy. We need to protect that public interest function. I think it is really important to get at that. I am sure there are better ways of protecting the public interest in this Bill by targeted protections or specifically protecting freedom of expression in specific ways, rather than these very broad, vague and general duties.