Debates between Lord Bishop of Guildford and Lord Allan of Hallam during the 2019 Parliament

Thu 27th Apr 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 2

Debate between Lord Bishop of Guildford and Lord Allan of Hallam
The Lord Bishop of Guildford

My Lords, I will speak to Amendments 128, 130 and 132, as well as Amendments 143 to 153 in this grouping. They were tabled in the name of my right reverend colleague the Bishop of Derby, who is sorry that she cannot be here today.

The Church of England is the biggest provider of youth work in our communities and educates around 1 million of our nation’s children. My colleague’s commitment to the principles behind these amendments also springs from her experience as vice chair of the Children’s Society. The amendments in this grouping are intended to strengthen legislation on online grooming for the purpose of child criminal exploitation, addressing existing gaps and ensuring that children are properly protected. They are also intended to make it easier for evidence of children being groomed online for criminal exploitation to be reported by online platforms to the police and the National Crime Agency.

Research from 2017 shows that one in four young people reported seeing illicit drugs advertised for sale on social media—a percentage that is likely to be considerably higher six years on. According to the Youth Endowment Fund in 2022, 20% of young people reported having seen online content promoting gang membership in the preceding 12 months, with 24% reporting content involving the carrying, use or promotion of weapons.

In relation to drugs, that later research noted that these platforms provide opportunities for dealers to build trust with potential customers, with young people reporting that they are more likely to see a groomer advertising drugs as a friend than as a dealer. This reduces the scruples or trepidation they might feel about buying drugs in the first place, leaving them vulnerable to exploitation. Meanwhile, it is also clear that social media is changing the operation of the county lines model. There is no longer the need to transport children from cities into the countryside to sell drugs, given that children who live in less populated areas can be groomed online as easily as in person. A range of digital platforms is therefore being used to target potential recruits among children and young people, with digital technologies also being deployed—for example, to monitor their whereabouts on a drugs run.

More research is being carried out by the Children’s Society, whose practitioners reported a notable increase in the number of perpetrators grooming children through social media and gaming sites during the first and second waves of the pandemic. Young people were being contacted with promotional material about lifestyles they could lead and the advantages of working within a gang, and were then asked to do jobs in exchange for money or status within this new group. It is true that some such offences could be prosecuted under the Modern Slavery Act 2015, but there remains a huge disparity between the scale of exploitation and the number of those being charged under the Act. Without a definition of child criminal exploitation, large numbers of children are being groomed online and paying the price for crimes committed by some of their most dangerous and unscrupulous elders.

It is vital that we protect our children from online content which facilitates that criminal exploitation, in the same way that we are looking to protect them from sexual exploitation. Platforms must be required to monitor for illegal content related to child criminal exploitation on their sites and to have mechanisms in place for users to flag it with those platforms so that it can be removed. This can be achieved by bringing modern slavery and trafficking, of which child criminal exploitation is a form, within the scope of illegal content in the Bill, which is what these amendments seek to do. It is also vital that the law sets out clear expectations on platforms to report evidence of child criminal exploitation to the National Crime Agency, in the same way as they are expected to report content involving child sexual exploitation and abuse, so that child victims can be identified and receive support. Such evidence may enable action against the perpetrators without the need for a disclosure from child victims. I therefore fully support and endorse the amendments standing in the name of the right reverend Prelate.

Lord Allan of Hallam (LD)

My Lords, this is again a very helpful set of amendments. I want to share some experience that shows that legality tests are really hard. Often from the outside there is an assumption that it is easy to understand what is legal and illegal in terms of speech, but in practice that is very rarely the case. There is almost never a bright line, except for a small class of child sexual abuse material, which is always illegal: as soon as you see the material, you know it is illegal and you can act on it. In pretty much every other case, you have to look at what is in front of you.

I will take a very specific example. Something we had to deal with was images of Abdullah Öcalan, the leader of the PKK in Turkey. If somebody shared a picture of Abdullah Öcalan, were they committing a very serious offence, which is the promotion of terrorism? Were they indicating support for the peace process that was taking place in Turkey? Were they showing that they support his socialist and feminist ideals? Were they supporting the YPG, a group in Syria to which we were sending arms, that venerates him? This is one example of many I could give where the content in front of you does not tell you very clearly whether or not the speech is illegal or speech that should be permitted. Indeed, we would take speech like that down and I would get complaints, including from Members of Parliament, saying, “Why have you removed that speech? I’m entitled to talk about Abdullah Öcalan”, and we would enter into an argument with them.

We would often ask lawyers in different countries whether they could tell us whether a piece of speech was legal or illegal. The answer would come back as probably illegal, likely illegal, maybe illegal and, occasionally, definitely not illegal, but it was nearly always on a spectrum. The amendments we are proposing today are to try to understand where the Government intend people to draw that line when they get that advice. Let us assume the company wants to do the right thing, follow the instructions of the Bill and remove illegal content. At what level does it say the test has sufficiently been met, given that in the vast majority of cases, apart from that small class of clearly illegal content, it is going to be given only a likelihood or a probability? As the noble Lord, Lord Moylan, pointed out, we are trying to insert this notion of sufficient evidence through Amendments 273, 275, 277, 280 and 281 in the names of my noble friend Lord Clement-Jones and the noble Viscount, Lord Colville, who is unable to be in his place today. I think the noble Baroness, Lady Kidron, may also have signed them. We are trying to flesh out the point at which that illegality standard should kick in.

Just to understand again how this often works when the law gets involved, I say that there is a law in Germany; the short version of its name is NetzDG. If there are any German speakers who can pronounce the compound noun that is its full title, there will be a prize. It is a long compound word that means “network enforcement Act”. It has been in place for a few years and it tells companies to do something similar—to remove content that is illegal in Germany. There would be cases where we would get a report from somebody saying, “This is illegal”, and we would take action; then it would go into the German system and, three months later, we would finally be told whether it was actually illegal, in a 12-page judgment that a German court had worked out. In the meantime, all we could do was work on our best guess while that process was going on. I think we need to be very clear that determining illegality is hard.

Cross-jurisdictional issues present us with another set of challenges. If both the speaker and the audience are in the United Kingdom, it is fairly clear. But in many cases, when we are talking about online platforms, one or other, or even both, of the speaker and the audience may be outside the United Kingdom. Again, when does the speech become illegal? It may be entirely legal speech between two people in the United States. I think—and I would appreciate clarification from the Minister—that the working assumption is that if the speech was reported by someone in the UK rather than the United States, the platform would be required to restrict access to it from the UK, even though the speech is entirely legal in the jurisdiction in which it took place. Because the person in the UK encountered it, there would be a duty to restrict it. Again, to be clear, there would certainly not be a duty to take the speech down, because it is entirely legal speech outside the UK. These cross-jurisdictional issues are interesting; I hope the Minister can clarify that.

The amendments also try to think about how this would work in practice. Amendment 287 talks about how guidance should be drawn up in consultation with UK lawyers. That is to avoid a situation where platforms are simply guessing at what UK law requires; they should at least have sought UK legal advice. That advice will then be fed into the guidance given to their human reviewers and their algorithms. That is the way, in practice, in which people will carry out the review. There is a really interesting practical question—which, again, comes up under NetzDG—about the extent to which platforms should be investing in legal review of content that is clearly against their terms of service.

There will be two kinds of platform. There will be some platforms that see themselves as champions of freedom of expression and say they will remove only what is illegal in the UK, and everything else can stay up. I think that is a minority of platforms—they tend to be on the fringes. As soon as a platform gets a mainstream audience, it has to go further. Most platforms will have terms of service that go way beyond UK law. In that case, they will be removing the hate speech, and they will be confident that they will remove UK-illegal hate speech within that. They will remove the terrorist content. They will be confident that they do not need to do a second test of legality in order to be able to remove that content. There is a practical question about the extent to which platforms should be required to do a second test of legality if something is already prohibited under their terms of service.

There will be, broadly speaking again, four buckets of content. There will be content that is clearly against a platform’s terms, which it will want to get rid of immediately. It will not want to test it again for legality; it will just get rid of it.

There will be a second bucket of content that is not apparently against a platform’s terms but is clearly illegal in the UK. That is a very small subset of content: in Germany, that is Holocaust denial content; in the United Kingdom, this Parliament has looked at Holocaust denial and chosen not to criminalise it, so that will not be there, but an equivalent for us would be migration advice. Migration advice will not be against the terms of service of most platforms, but the Government’s intention in the Illegal Migration Bill is to make it illegal, and the consequential effect will be that it will have to be removed under the terms of this Bill. So there will be that small set of content that is illegal in the UK but not against terms of service.

There will be a third bucket of content that is not apparently against the terms or the law, and that actually accounts for most of the complaints that a platform gets. I will choose my language delicately: complaining is easy, and people often complain simply to make a point, using complaint systems as a kind of dislike button. The reality is that one of the most common sets of complaints you get is when there is a football match and supporters of the two opposing teams report the content on each other’s pages as illegal. They will do that every time, and you get used to it; that is why you learn to discount mass-volume complaints. But again, we should be clear that a great many complaints are merely vexatious.

The final bucket is content that is unclear, where legal review will be needed. Our amendments are intended to deal with that. A platform will go out and get advice. It is trying to understand at what point something like migration advice tips over into being illegal, as opposed to being advice about going on holiday, and it has to make that judgment based on what it can immediately see. Once it has sought that advice, it will feed it back into the guidance to reviewers and the algorithms, to try to remove content more effectively, be compliant with the Bill as a whole and not get into trouble with Ofcom.

Some areas are harder than others. The noble Lord, Lord Moylan, has already highlighted one: public order offences, which are extremely hard. If somebody says something offensive or holds an offensive political view—I suspect the noble Baroness, Lady Fox, may have something to say on this—people may well get in touch and claim that it is in breach of public order law. On the face of it, they may have a reasonably arguable case, but again, as a platform, you are left to make a decision.