Lord Knight of Weymouth (Lab)

My Lords, as we have said many times, this is a complex Bill. As we reflect on the priorities for Report, we can be more relaxed about some of the specifics of how Ofcom may operate, thereby giving it more flexibility—the flexibility it needs to be agile in the online world—if we as a Parliament trust Ofcom. Building trust, I believe, is a matter of triangulation. First, there is independence from government—as discussed in respect of Secretary of State powers. Secondly, we need proper scrutiny by Parliament; earlier today I talked about my desire for proper post-legislative scrutiny and a permanent Joint Committee to carry it out. The third leg of the stool is the transparency to assist that scrutiny.

Clause 68 contains the provisions which would require category 1, 2A and 2B services to produce an annual transparency report containing information described by Ofcom in a notice given to the service. Under these provisions, Ofcom would be able to require these services to report on, among other things: information about the incidence of illegal content and content that is harmful to children; how many users are assumed to have encountered this content by means of the service; the steps and processes for users to report this content; and the steps and processes which a provider uses for dealing with this content.

We welcome the introduction of transparency reporting in relation to illegal content and content that is harmful to children. We agree with the Government that effective transparency reporting plays a crucial role in building Ofcom’s understanding of online harms and empowering users to make a more informed choice about the services they use.

However, despite the inclusion of transparency reporting in the Bill representing a step in the right direction, we consider that these requirements could and should be strengthened to do the trust building we think is important. First, the Bill should make clear that, subject to appropriate redactions, companies will be required to make their transparency reports publicly available—to make them transparent—hence Amendment 160A.

Although it is not clear from the Bill whether companies will be required to make these reports publicly available, we consider that, in most instances, such a requirement would be appropriate. As noted, one of the stated purposes of transparency reporting is that it would enable service users to make more informed choices about their own and their children’s internet use—but they can only do so if the reports are published. Moreover, in so far as transparency reporting would facilitate public accountability, it could also act as a powerful incentive for service providers to do more to protect their users.

We also recognise that requiring companies to publish the incidence of child sexual exploitation and abuse (CSEA) content on their platforms, for instance, may have the effect of encouraging individuals seeking such material towards platforms on which there are high incidences of that content—that must be avoided. I recognise that simply having a high incidence of CSEA content on a platform does not necessarily mean that that platform is problematic; it could just mean that it is better at reporting it. So, as ever with the Bill, there is a balance to be struck.

Therefore, we consider that the Bill should make it explicit that, once provided to Ofcom, transparency reports are to be made publicly available, subject to redactions. To support this, Ofcom should be required to produce guidance on the publication of transparency reports and the redactions that companies should make before making reports publicly accessible. Ofcom should also retain the power to stop a company from publishing a particular transparency report if it considers that the risk of directing individuals to illegal materials outweighs the benefit of making a report public—hence Amendments 160B and 181A.

Amendments 165 and 229 are in my noble friend Lord Stevenson’s name. Amendment 165 would broaden the transparency requirements around user-to-user services’ terms of service, ensuring that information can be sought on the scope of these terms, not just their application. As I understand it, scope matters because it is significant in informing Ofcom’s regulatory approach. We are trying to guard against minimal terms of service, where detail is needed for users and Ofcom.

The proposed clause in Amendment 229 probes how Ofcom will review the effectiveness of the transparency requirements in the Bill. It would require Ofcom to undertake a review of the effectiveness of transparency reports within three years and every five years thereafter, and it would give the Secretary of State powers to implement any recommendations made by the regulator. The Committee should note that we also include a requirement that a Select Committee, charged by the relevant House, must consider and report on the regulations, with an opportunity for Parliament to debate them. So we link the three corners of the triangle rather neatly there.

If we agree that transparency is an important part of building trust in Ofcom in doing this difficult and innovative regulatory job—it is always good to see the noble Lord, Lord Grade, in his place; I know he is looking forward to getting on with this—then this proposed clause is sensible. I beg to move.

Lord Allan of Hallam (LD)

My Lords, I am pleased that the noble Lord, Lord Knight of Weymouth, has given us an opportunity to talk about transparency reports with these amendments, which are potentially a helpful addition to the Bill. Transparency is one of the huge benefits that the legislation may bring. One of the concerns that the public have and that politicians have always had with online platforms is that they appear to be a black box—you cannot see what is going on in them.

In the entire edifice that we are constructing in the Online Safety Bill, there are huge opportunities to change that. The platforms will have to do risk assessments—there are measures in the Bill to make sure that information about these is put out—and they will have to take active steps to mitigate any risks they find. Again, we may get directions and guidance from Ofcom that will explain to the public exactly what is expected of the platforms. The final piece of the jigsaw is the transparency reports that show the outcomes—how a platform has performed and what it has done to meet its obligations in dealing with content and behaviour on its services.

For the record, I previously worked for one of the platforms, and I would have said that I was on the pro-transparency wing of the transparency party inside the company. I believed that it was in the platform’s interest: if you do not tell people what you are doing, they will make things up about you, and what they make up will generally be worse than what you are actually doing. So there are huge advantages to the platforms from being transparent.

The noble Lord, Lord Knight, has picked up on some important points in his Amendment 160B, which talks about making sure that the transparency report is not counterproductive by giving the bad guys information that they could use to ill effect. That is a valid point; it is often debated inside the platforms. Sometimes, I argued furiously with my colleagues in the platforms about why we should disclose information. They would ask, “What about the bad guys?” Sometimes I challenged that, but other times it would have been a genuine and accurate concern. The noble Lord mentioned things such as child sexual abuse material, and we have to recognise that the bad guys are incredibly devious and creative, and if you show them anything that they can use against you to get around your systems, they will try to do that. That is a genuine and valid concern.

The sort of thing that you might put into a transparency report is, for example, whether you have banned particular organisations. I would be in favour of indicating to the public that an organisation is banned, but you can see that the potential impact of that is that all the people you are concerned about would create another organisation with a different name and then get back on to your platform. We need to be alive to those kinds of concerns.

The same concern is relevant to Amendment 165 and the terms of service: the more granular and detailed your terms of service are, the better they are for public information, but the more opportunities there are to get around them. Again, we would have that argument internally. I would say, “If we are prohibiting specific hate speech terms, tell people that, and then they won’t use them”. For me, that would be a success, as they are not using those hate speech terms anymore, but, of course, they may then find alternative hate speech terms that they can use instead. You are facing that battle all the time. That is a genuine concern that I hope we will be able to debate. I hope that Ofcom will be able to mitigate that risk by discussing with platforms what these transparency reports should look like. In a sense, we are doing a risk assessment of the transparency report process.

Amendment 229 on effectiveness is really interesting. My experience was that if you did not have a transparency report, you were under huge pressure to produce one, and that once you produced one, nobody was interested. At the risk of embarrassing anyone in the Committee, I would be curious to know how many noble Lords participating in this debate have read the transparency reports already produced by Meta Platforms, Google and others. If they have not read them, they should not be embarrassed, because my experience was that I would talk to regulators and politicians about something they had asked me to come in to talk about, such as hate speech or child sexual abuse material, and I learned to print off the transparency report. I would go in and say, “Well, you know what we are doing; it’s in our transparency report”. They would ask, “What transparency report?”, and I would have to show them. So, every time we published a transparency report, we would expect there to be public interest, but little use was made of it. That is not a reason not to do them—as I said, I am very much in favour of doing them—but, on their own, they may not be effective, and Amendment 229 touches on that.

I was trying to think of a collective noun for transparency reports and, seeing as they shed light, I think it may be a “chandelier”. Where we may get the real benefit is if Ofcom can produce a chandelier of transparency reports, taking all the information it gets from the different platforms, processing it and selecting the most relevant information—the reports are often too long for people to work their way through—so that it can enable comparisons. That is really good and it is quite good for the industry that people know that platform A did this, platform B did that, and platform C did something else. They will take note of that, compare with each other and want to get into the best category. It is also critical that Ofcom puts this into user-friendly language, and Ofcom has quite a good record of producing intelligible reports. In the context of Amendment 229, a review process is good. One of the things that might come out of that, thinking ahead, would be Ofcom’s role in producing meta transparency reports, the chandelier that will shed light on what the whole sector is doing.

Baroness Fox of Buckley (Non-Afl)

My Lords, for once I want to be really positive. I am actually very positive about this whole group of amendments, because more transparency is essential in what we are discussing. I especially like Amendment 165 from the noble Lord, Lord Stevenson of Balmacara, because it is about terms of service for user-to-user services and ensures that information can be sought on the scope of those terms as well as their application. This is important because so much weight has been put on user-to-user services and on their terms of service. You need to know what is going on.

I want particularly to compliment Amendment 229, which says that transparency reports should be

“of sufficient quality to enable service users and researchers to make informed judgements”,

et cetera. That is a very elegant way of saying that they should not be gobbledegook. If we are going to have them, they should be clear and of a quality that we can read. Obviously, we do not want them to be unreadable and full of jargon and legalistic language. I hope that that is the requirement.

--- Later in debate ---
Lord Bassam of Brighton (Lab)

My Lords, I have Amendments 185A and 268AA in this group. They are on different subjects, but I will deal with them in the same contribution.

Amendment 185A is a new clause that would introduce duties on online marketplaces to limit child access to listings of knives and to take proactive steps to identify and remove any listings of knives or products, such as ornamental zombie knives, that are suggestive of acts of violence or self-harm. I am sure the Minister will be familiar with the Ronan Kanda case, which has given rise to our bringing this amendment forward. The case is particularly horrible; as I understand it, sentencing is still outstanding. Two young boys bought ninja blades and machetes online and ultimately killed a younger boy with them. It has been widely featured in news outlets and is particularly distressing. We have had some debate on this in another place.

As I understand it, the Government have announced a consultation on this which, among other things, looks at banning the sale of machetes and knives that appear to have no practical use other than being designed to look menacing or suitable for combat. We support the consultation and the steps set out in it, but the amendment provides a chance to probe the extent to which this Bill will apply to the dark web, where a lot of these products are available for purchase. The explanatory statement contains a reference to this, so I hope the Minister is briefed on the point. It would be very helpful to know exactly what the Government’s intention is on this, because we clearly need to look at these sites and try to regulate them much better than they are regulated currently. I am especially concerned about the dark web.

The second amendment relates to racist abuse; I have brought the subject before the House before, but this is rather different. It is a bit of a carbon copy of Amendment 271, which noble Lords have already debated. It is there for probing purposes, designed to tease out exactly how the Government expect public figures, particularly sports stars such as Marcus Rashford and Bukayo Saka, to deal with the torrents of racist abuse that they receive. I know that there have been convictions for racist content online, but most of the abuse goes unpunished. It is not 100% clear that much of it will be identified and removed under the priority offence provisions. For instance, does posting banana emojis in response to a black footballer’s Instagram post constitute an offence, or is it just a horrible thing that people do? We need to understand better how the law will act in this field.

There has been a lot of debate about this issue; it is a very sensitive matter, and we need to get to the bottom of it. A year and a half ago, the Government responded to my amendment by bringing online racist abuse into the scope of what is dealt with as an offence, which we very much welcomed, but we need to understand better how these provisions will work. I look forward to the Minister setting that out in his response. I beg to move.

Lord Allan of Hallam (LD)

My Lords, I rise to speak primarily to the amendments in the name of my noble friend Lord Clement-Jones, but I will also touch on Amendment 268AA at the same time. The amendments that I am particularly interested in are Amendments 200 and 201 on regulatory co-operation. I strongly support the need for this, and I will illustrate that with some concrete examples of why this is essential to bring to life the kinds of challenges that need to be dealt with.

The first example relates to trying to deal with the sexual grooming of children online, where platforms are able to develop techniques to detect it. They can do that by analysing the behaviour of users and trying to detect whether older users are consistently trying to approach younger users, and, where it is visible, the kind of content in the messages they may be sending to them. These are clearly highly intrusive techniques. If a platform is subject to the general data protection regulation, or the UK version of that, it needs to be very mindful of privacy rights. We clearly have, there, several potentially interested bodies in the UK environment: we have the child protection agencies; we will have, in future, Ofcom seeking to ensure that the platform has met its duty of care; and we have the Information Commissioner’s Office.

A platform, in a sense, can be neutral as to what it is instructed to do by the regulator. Certainly, my experience was that the platforms wanted to do those kinds of activities, but they are neutral in the sense that they will do what they are told is legal. There, you need clarity from the regulators together to say, “Yes, we have looked at this, and you are not going to do something on the instruction of the child safety agency and then get criticised, and potentially fined, by the data protection regulator for doing the thing you have been instructed to do”—so we need those agencies to work together.

The second example is in the area of co-operation around anti-terrorism, another key issue. The platforms have created something called the Global Internet Forum to Counter Terrorism. Within that forum, they share tools and techniques—things such as databases of information about terrorist content and systems that you can use to detect it—and you are encouraged within that forum to share those tools and techniques with smaller platforms and competitors. Clearly, again, there is a very significant set of questions there, and if you are in a discussion around that, the lawyers will ask, “Have the competition lawyers cleared this?” Again, therefore, something that is in the public interest—that all the platforms should be using similar kinds of technology to detect terrorist content—is something on which you need a view not just from the counterterrorism people but also, in our case, from the Competition and Markets Authority. So, again, you need those regulators to work together.

The final example is one which I know is dear to the heart of the noble Baroness, Lady Morgan of Cotes: fraudsters, whom we have dealt with. There, you might have patterns of behaviour where information comes from the telecoms companies, regulated by Ofcom; the internet service providers, also regulated by Ofcom; and financial institutions, regulated by their own family of regulators—and they may want to share data with each other, which is again subject to the Information Commissioner’s Office. So, if we are going to give platforms instructions, which we rightly do in this legislation, and say, “Look, we want you to get tougher on online fraudsters; we want you to demonstrate a duty of care there”, the platforms will need those regulators—the financial regulators, Ofcom and the Information Commissioner’s Office—to sort those things out.

Having a forum such as the one proposed in Amendment 201, where these really difficult issues can be thrashed out and clear guidance can be given to online services, will be much more efficient than what sometimes happened in the past, where the left hand and the right hand of the regulatory world pulled you in different directions. I know that we have the Digital Regulation Cooperation Forum. If we can build on such institutions, it is essential that they have their input before guidance is issued, rather than have a platform comply with guidance from regulator A and then get dinged by regulator B for doing the thing that it has been instructed to do.

That leads to the very sensible Amendment 200 on skilled persons. Again, Ofcom is going to be able to call in skilled persons. In an area such as data protection, that might be a data protection lawyer but, equally, it might be that somebody who works at the Information Commissioner’s Office is actually best placed to give advice. Amendment 200—the first of the two, which provides that skilled persons may come from other regulators—makes sense.

Finally, I will touch on the issues raised in Amendment 268AA—I listened carefully and understand that it is a probing amendment. It raises some quite fundamental questions of principle—I suspect that the noble Baroness, Lady Fox, might want to come in on these—and it has been dealt with in the context of Germany and its Network Enforcement Act; I know the noble Lord, Lord Parkinson of Whitley Bay, can say that in the original German. That Act went in the same direction, motivated by similar concerns around hate speech.

--- Later in debate ---
We have heard the term “rabbit hole”; there is a rabbit hole, where people intent on self-harm, or indeed those who suffer from eating disorders, go from larger platforms to smaller and niche ones where they encounter the very content that feeds their addiction, or which fuels and enables their desire to self-harm. As I said on a previous grouping, this cannot be the intention of the Bill; I do not believe it is the intention of the Government, and I hope that the Minister will listen to the arguments that the noble Baroness, Lady Morgan of Cotes, set out so effectively.

Lord Allan of Hallam (LD)

My Lords, I am a poor substitute for the noble Baroness, Lady Parminter, in terms of the substance of the issues covered by these amendments, but I am pleased that we have been able to hear from the noble Baroness, Lady Bull, on that. I will make a short contribution on the technology and the challenges of classification, because there are some important issues here that the amendments bring out.

We will be creating rules for categorising platforms. As I understand it, the rules will have a heavy emphasis on user numbers but will not be exclusively linked to user numbers. It would be helpful if the Minister could tease out a little more about how that will work. However, it is right even at this stage to consider the possibility that there will need to be exceptions to those rules and to have a mechanism in place for that.

We need to recognise that services can grow very quickly these days, and some of the highest-risk moments may be those when services have high growth but still very little revenue and infrastructure in place to look after their users. This is a problem generally with stepped models, where you have these great jumps. In a sense, a sliding scale would be more rational, so that responsibilities increase over time, but clearly from a practical point of view that is hard to do, so we are going to end up with some kind of step model.

We also need to recognise that, from a technical point of view, it is becoming cheaper and easier to build new user-to-user services all the time. That has been the trend for years, but it is certainly the case now. If someone wants to create a service, they can rent the infrastructure from a number of providers rather than buying it, they can use a lot of code that is freely available—they do not need to write as much code as they used to—and they can promote their new service using all the existing social networks, so you can go from zero to significant user numbers in very quick time, and that is getting quicker all the time. I am interested to hear how the Minister expects such services to be regulated.

The noble Baroness, Lady Morgan, referred to niche platforms. There will be some that have no intention of complying, even if we categorise them as a 2B service. The letter will arrive from Ofcom and go in the bin; they will have no interest whatever. Some of the worst services will be like that. The advantage of ensuring that we bring them into scope is that we can move through the enforcement process quickly and get to business disruption, blocking, or whatever we need to do to get them out of the UK market. Other niche services will be willing to come into line if they are told that they are categorised as 2B but are given a reasonable set of requirements. Some of Ofcom’s most valuable work might be precisely with them: services that are borderline but recognise that they want a viable business, and they do not have a viable business by breaking the law. We need to get hold of them and bring them into the net to be able to work with them.

Finally, there is another group which is very mainstream but in the growing phase, busy growing and not worrying about regulation. For that category of company, we need to work with them as they grow, and the critical thing is to get to them early. I think the amendments would help Ofcom to get to them early—ideally, in partnership with other regulators, including the European Union, which is now regulating in a similar way under the Digital Services Act. If we can work with those companies as they come into category 2B, then into category 1—in European speak, that is a VLOP, a very large online platform—and get them used to the idea that they will have VLOP and category 1 responsibilities before they get there, we can make a lot more progress. Then we can deliver what we are all trying to deliver: a safer internet for people in the UK.

Baroness Fox of Buckley (Non-Afl)

I shall speak very briefly at this hour, just to clarify as much as anything. It seems important to me that there is a distinction between small platforms and large platforms, but my view has never been that if you are small, you have no potential harms, any more than that if you are large, you are harmful. The exception should remain the exception rather than the rule. We have to be careful of arbitrary categorisation of “small”, and we have to decide who is going to be treated as though they are a large category 1 platform. I keep saying, but stress again: do not assume that everybody agrees on what a significant risk of harm or hateful content is. It is such highly disputed political territory outside the online world and this House that we must recognise that it is not so straightforward.

I am very sympathetic, by the way, to the speeches made about eating disorders and other issues. I see that very clearly, but other categories of speech are disputed and argued over—I have given loads of examples. We end up where it is assumed that the manifestos of mass shooters appear on these sites, but if you read any of those manifestos, they will often be quoting from mainstream journalists in mainstream newspapers, the Bible and a whole range of things. The fact that they are on 4chan, or wherever, is not necessarily the problem; it is much more complicated.

I ask the Minister, and the proposers of the amendment, to some extent: would it not be straightforwardly the case that if there is a worry about a particular small platform, it might be treated differently—

Lord Allan of Hallam (LD)

I just want to react to the point about the manifestos of mass shooters. While source material such as the Bible is not in scope, I think the manifesto of a shooter is clear incitement to terrorism, and any platform that is comfortable carrying it is problematic in my view—I hope it would be in the noble Baroness’s view as well.

Baroness Fox of Buckley (Non-Afl)

I was suggesting that we have a bigger problem than such a manifesto appearing on a small site. It quotes from mainstream media, and it ends up being broadly disseminated, not because it is on a small site. I am not advocating that we all go round carrying the manifestos of mass shooters and legitimising them; I was more making the point that it can be complicated. Would not the solution be that appeals can be made for a small site to be treated differently? That is the way we deal with harmful material in general and the way we have dealt with, for example, RT as press, without compromising on press freedom. That is the kind of point I am trying to make.

I understand lots of concerns, but I do not want us to get into a situation where we destroy the potential of all smaller platforms—many of them doing huge amounts of social good, as part of civil society and all the rest of it—by treating them as though they are large platforms. They just will not have the resources to survive; that is all my point is.