Lord Allan of Hallam debates involving the Department for Digital, Culture, Media & Sport during the 2019 Parliament

Thu 22nd Jun 2023, Online Safety Bill, Lords Chamber, Committee stage: Part 1
Thu 25th May 2023, Online Safety Bill, Lords Chamber, Committee stage: Part 1
Thu 25th May 2023, Online Safety Bill, Lords Chamber, Committee stage: Part 2
Tue 23rd May 2023, Online Safety Bill, Lords Chamber, Committee stage: Part 2
Tue 23rd May 2023, Online Safety Bill, Lords Chamber, Committee stage: Part 1
Tue 16th May 2023, Online Safety Bill, Lords Chamber, Committee stage: Part 1
Thu 27th Apr 2023, Online Safety Bill, Lords Chamber, Committee stage: Part 2
Thu 27th Apr 2023, Online Safety Bill, Lords Chamber, Committee stage: Part 1
Tue 25th Apr 2023, Online Safety Bill, Lords Chamber, Committee stage: Part 2
Tue 25th Apr 2023, Online Safety Bill, Lords Chamber, Committee stage: Part 1

Online Safety Bill

Baroness Kidron (CB)

My Lords, first, I want to recognise the bravery of the families of Olly, Breck, Molly, Frankie and Sophie in campaigning for the amendments we are about to discuss. I also pay tribute to Mia, Archie, Isaac, Maia and Aime, whose families I met this morning on their way to the House. It is a great privilege to stand alongside them and witness their courage and dignity in the face of unimaginable grief. On behalf of myself, my co-signatories—the noble Lords, Lord Stevenson and Lord Clement-Jones, and the noble Baroness, Lady Morgan—and the huge number of Peers and MPs who have supported these amendments, I thank them for their work and the selflessness they have shown in their determination to ensure that other families do not suffer as they have.

This group includes Amendments 198, 199, 215 and 216, which, together, would create a pathway for coroners and, by extension, families to get access to information relevant to the death of a child from technology services. The amendments would put an end to the inhumane situation whereby coroners and families in crisis are forced to battle faceless corporations to determine whether a child’s engagement with a digital service contributed to their death. Bereaved families have a right to know what happened to their children, and coroners have a duty to ensure that lessons are learned and that those who have failed in their responsibilities are held accountable.

Since the Minister is going to be the bearer of good news this afternoon, I will not take up time making the arguments for the amendments as they stand. I simply say that, while parents have been fighting for access to information, those same companies have continued to suggest friends, material and behaviours that drive children into places and spaces in which they are undermined, radicalised into despair and come to harm. In no other circumstance would it be acceptable to withhold relevant information from a court procedure. It is both immoral and a failure of justice if coroners cannot access and review all relevant evidence. For the families, it adds pain to heartbreak as they are unable to come to terms with what has happened because there is still so much that they do not know.

I am grateful to the Government for agreeing to bring forward on Report amendments that will go a very long way towards closing the loopholes that allow companies to refuse coroners’ demands and ignore parents’ entreaties. The Government’s approach is somewhat different from that in front of us, but it covers the same ground. These amendments are the result of the considerable efforts of Ministers and officials from DSIT and the Ministry of Justice, with the invaluable support of the right honourable Sajid Javid MP. I wish to note on the record the leadership of the Secretary of State, who is currently on leave, and the Minister here, the noble Lord, Lord Parkinson.

The Government’s amendments will create an express power for Ofcom to require information from services about a deceased child user’s online activity following the receipt of a Schedule 5 request from a coroner. This will vastly increase the reach and power of that coroner. The information that Ofcom can request from regulated companies under the Online Safety Bill is extremely wide-ranging and includes detailed data on what is recommended; the amount of time the child spent on the service and when they accessed it; their user journey; what content they liked, shared, rewatched, paused and reported; and whether other users raised red flags about the child’s safety or well-being before their death.

Information notices prompted by a Schedule 5 request from a coroner will be backed by Ofcom’s full enforcement powers and will apply to all regulated companies. If a service fails to comply, it may be subject to enforcement action, including senior management liability and fines of up to £18 million or 10% of global turnover—vastly different from the maximum fine of £1,000 under the Coroners and Justice Act 2009. Moreover, these amendments will give coroners access to Ofcom’s expertise and understanding of how online services work and of online services’ safety duties to children. Also, there will be provisions empowering Ofcom to share information freely to assist coroners in their inquiries. Companies must provide a dedicated means of communication to manage requests for information from bereaved parents and provide written responses to those requests. I look forward to the Minister setting out that these will be operated by a team of experts and backed up by Ofcom in ensuring that the communication is adequate, timely and not obstructive. Importantly, if the communication is not adequate, bereaved families will be able to notify Ofcom.

There are a small number of outstanding questions. We remain concerned that only larger companies will be required to set out their policies on disclosure. Sadly, children are often coerced and nudged into smaller sites that have less robust safety mechanisms. Small is not safe. A further issue is to ensure that a coroner is able, via a Schedule 5 notice given to Ofcom, to compel senior management to appear at an inquest. This is a crucial ask of the legal community, who battled and failed to get companies to attend inquests, notably Wattpad at the Frankie Thomas inquest and Snap Inc at Molly Russell’s inquest. Can the Minister undertake to close these gaps before Report?

A number of matters sit outside the scope of the Online Safety Bill. I am particularly grateful to the Secretary of State for committing in writing to further work beyond the Bill to ensure that the UK’s approach is comprehensive and watertight. The Government will be exploring ways in which the Data Protection and Digital Information (No. 2) Bill can support and complement these provisions, including the potential for a code that requires data preservation if a parent or enforcement officer contacts a helpline or if there is constructive knowledge, such as when a death has been widely reported, even before a Schedule 5 notice has been delivered.

The Government are engaging with the Chief Coroner to provide training in order to ensure that coroners have the knowledge they need to carry out inquests where children’s engagement with online services is a possible factor in their death. I am concerned about the funding of this element of the Government’s plans and urge the Minister to indicate whether this could be part of Ofcom’s literacy duties and therefore benefit from the levy. Possibly most importantly, the Secretary of State has undertaken to approach the US Government to ensure that coroners can review private messages that fall outside the scope of this Bill in cases where a child’s death is being investigated. I am grateful to the noble Lord, Lord Allan, for his support in articulating the issue, and accept the invitation to work alongside the department to achieve this.

There are only two further things to say. First, delivery is in the drafting, and I hope that when he responds, the Minister will assure the House that we will see the proposed amendments well before Report so that we can ensure that this works as we have all agreed. Secondly, the Government are now looking very carefully at other amendments which deal with prevention of harm in one way or another. I share the gratitude of Bereaved Parents for Online Safety for the work that has gone into this set of amendments. However, we want to see safety by design; a comprehensive list of harms to children in the Bill, including harms caused or amplified by the design of services; principles for age assurance which ensure that the systems put in place by regulated services are measurable, secure and fit for purpose; and a proper complaints service, so that children have somewhere to turn when things go wrong. What we have been promised is a radical change of status for the coroner and for the bereaved families. What we want is fewer dead children. I beg to move.

Lord Allan of Hallam (LD)

My Lords, some of the issues that we have been dealing with in this Bill are more abstract or generic harms, but here we are responding to a specific need of families in the UK who are facing the most awful of circumstances.

I want to recognise the noble Baroness, Lady Kidron, for her direct support for many of those families, and for her persistent efforts to use policy and the tools we have available to us here to improve the situation for families who, sadly, will face similar tragedies in future. I appreciate the time that she has spent with me in the spirit of finding workable solutions. It is an alliance that might seem improbable, given our respective responsibilities, which have sometimes placed us in publicly adversarial roles. However, one of the strengths of this Committee process is that it has allowed us to focus on what is important and to find that we have more in common than separates us. Nothing could be more important than the issue we are dealing with now.

I am pleased that it looks like we will be able to use this Bill to make some significant improvements in this area to address the challenges faced by those families, some of whom are here today, challenges which add to their already heart-wrenching distress. The first challenge these families face is to find someone at an online service who is willing and able to answer their questions about their loved one’s use of that platform. This question about contacts at online platforms is not limited to these cases but comes up in other areas.

As noble Lords will know, I used to work for Facebook, and I was often contacted by Governments of all sorts asking me to help them find the right people at companies, often smaller companies, on very serious issues such as terrorism. Even when they were dealing with the distribution of terrorist content, they would find it very challenging to get hold of anyone. There is a generic problem around getting hold of people at platforms. A real strength of the Online Safety Bill is that it will necessarily require Ofcom to develop contacts at all online services that offer user-to-user and search services to people in the UK. The Government estimate that 25,000 entities are involved. We are talking about Ofcom building a comprehensive database of pretty much any service that matters to people in the UK.

Primarily, these contacts will be safety focused, as their main responsibility will be to provide Ofcom with evidence that the service is meeting its duties of care under the Bill, so again, they will have the right people in the right companies on their database in future. Importantly, Ofcom will have a team of several hundred people, paid for by a levy on these regulated services, to manage the contacts at the right level. We can expect that, certainly for the larger services, there may be a team of several people at Ofcom dedicated to working with them, whereas for the smaller services it may be a pooled arrangement whereby one Ofcom staff member deals with a group. However, in all cases there will be someone at the regulator with a responsibility for liaising with those companies. We do not expect Ofcom to use those contacts to resolve questions raised by individuals in the UK as a matter of course, but it makes sense to make this channel available where there is a relatively small number of highly impactful cases such as we are dealing with here.
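A toy sketch may help picture the tiered arrangement just described: dedicated Ofcom teams for the largest services and a pooled contact for the long tail. The threshold, team descriptions and service names below are invented for illustration; they are not drawn from the Bill or from Ofcom.

```python
# Toy model of the tiered supervision arrangement described above.
# The user threshold and service names are invented; neither the Bill
# nor Ofcom's actual operating model specifies these numbers.

def assign_supervision(uk_users: int) -> str:
    LARGE_SERVICE_THRESHOLD = 1_000_000  # invented figure
    if uk_users >= LARGE_SERVICE_THRESHOLD:
        return "dedicated team of several Ofcom staff"
    return "pooled arrangement: one staff member covering a group"

services = {"BigSocialExample": 40_000_000, "NicheForumExample": 12_000}
for name, users in services.items():
    print(f"{name}: {assign_supervision(users)}")
```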

--- Later in debate ---
If a provider outside the UK ignores letters and fines, these measures may well be the only possibility. Many pornography providers probably have absolutely no intention of even trying to comply with the kinds of regulations that are envisaged in the Bill. They are probably not based in the UK, are never going to pay a fine and are probably incorporated in some obscure offshore jurisdiction. Ofcom will need to use these powers in such circumstances, and on a bulk scale. We should not put that enforcement activity at risk of the legal stalling games that these sites will undoubtedly play. For that reason, I ask the Minister to commit to these changes by government amendment before Report next month.
Lord Allan of Hallam (LD)

My Lords, I want to speak to Amendment 218JA in this group, in my name, to which the noble Baroness, Lady Morgan of Cotes, has added her name. This is really trying to understand what the Government’s intentions are in respect of access restriction orders.

Just to take a step back, in the Online Safety Bill regime we are creating, in effect, a licensing regime for in-scope services and saying that, if you want to operate in the United Kingdom and you are covered by the Bill—whether that is the pornography services that the noble Lord, Lord Bethell, referred to or a user-to-user or search service—here are the conditions to which you must adhere. That includes paying a fee to Ofcom for your supervision, and then following the many thousands of pages of guidance that I suspect we will end up producing and issuing to those companies. So what we are exploring here is what happens if a particular organisation does not decide to take up the offer of a licence.

Again, to go back to the previous debate, success for the Bill would be that it has a sufficient deterrent effect that the problems that we are seeking to fix are addressed. I do not think we are looking to block services or for them to fail—we are looking for them to succeed, so stage one is that Ofcom asks them nicely. It says, “You want to operate in the UK, here is what you need to do—it’s a reasonable set of requests we are making”, and the services say, “Fine”. If not, they choose to self-limit—and it is quite trivial for any online service to say, “I’m going to check incoming traffic, and if this person looks like they are coming from the UK, I’m not going to serve them”. That is self-limiting, which is an option that would be preferable if a service chose not to accept the licence condition. But let us assume that it has accepted the licence condition, and Ofcom is going to be monitoring it on a routine basis—and if Ofcom thinks it is not meeting its requirements, whether that is to produce a risk assessment or to fulfil its duty of care, Ofcom will then instruct it to do something. If it fails to follow that instruction, we are in the territory of the amendments that we are considering here: either it has refused to accept the licence conditions and to self-limit, or it has accepted them but has failed to do what we expect it to do. It has signed up and thought that it is not serious, and it is not doing the things that we expect it to do.

At that point, Ofcom has to consider what it can do. The first stage, quite rightly, is set out in the group of clauses that we are looking at: Ofcom can bring in these business disruption measures. As the noble Lord, Lord Bethell, rightly pointed out, in many instances that will be effective. Any commercial service—not just pornography services but, say, an online service that depends on advertising—that is told that it can no longer take credit card payments from UK businesses to advertise on the service will, one hopes, either come into line or say, “That’s the end of my business in the UK—I may as well cut myself off”. If it wants to operate, it will come into line, because that way it gets its payment services restored. But there will be others for which that is insufficient—perhaps that is not their business model—and they will carry on regardless. At that point, we may want to consider the access restrictions.

In a free society, none of us should take pleasure in the idea of instructing internet services to block content, or of blocking the services themselves. That is not our first instinct; it is, rather, a potentially necessary evil. At some point, there may be services that are so harmful and so oblivious to the regime that we put in place that we need to block them. Here we are trying to explore what would happen in those circumstances. The first kind of block is one that we are used to doing, and we do it today for copyright-infringing sites and a small number of other sites that break the law. We instruct access providers such as BT and TalkTalk to implement a network-level block. There are various technical ways of doing that, which we need not go into in this debate, whereby we can seek to make it so that an ordinary UK user, when they type in www.whatever, will not get to the website. But increasingly people are using technology that works around that. Browsers, for example, may encrypt traffic between your web browser and the online service such that TalkTalk or BT or the access provider has no visibility of where you are going and no capability of blocking it. BT has rightly raised that. There will be different views about where we should go with this, but the question of what the Government’s intentions are is absolutely legitimate, and it is what we want to tease out with this amendment.
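A toy illustration of the workaround problem described above: a block applied at the access provider's resolver never sees a query that travels through an encrypted DNS tunnel. The domain and addresses below are placeholders, not real services:

```python
# Toy illustration of why ISP-level blocking is fragile. A blocked
# domain is refused by the ISP's resolver, but a browser using an
# encrypted DNS tunnel (e.g. DNS-over-HTTPS) never consults the ISP.
# All domains and addresses here are placeholders.

BLOCKED_DOMAINS = {"blocked-service.example"}

def isp_resolve(domain: str):
    # The access provider's resolver: where a court-ordered block bites.
    if domain in BLOCKED_DOMAINS:
        return None  # NXDOMAIN or a redirect to a block page
    return "192.0.2.10"  # placeholder address

def browser_resolve(domain: str, encrypted_dns: bool):
    if encrypted_dns:
        # The query travels inside an encrypted tunnel to a third-party
        # resolver; the ISP cannot see it, so the block never applies.
        return "192.0.2.10"
    return isp_resolve(domain)

print(browser_resolve("blocked-service.example", encrypted_dns=False))  # None: blocked
print(browser_resolve("blocked-service.example", encrypted_dns=True))   # resolved anyway
```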

Again, we should be really candid. Somebody who is determined to bypass all the access controls will do so. There is no world in which we can say that we can guarantee that somebody with a UK internet connection can never get to a particular website. What we are seeking to do is to make violating services unavailable for most of the people most of the time. We would be unhappy if it was only some of the people some of the time, but it is not going to be all of the people all of the time. So the question is: what constitutes a sufficient access restriction to either bring them to heel or to ensure that, over the longer term, the harm is not propagated, because these services are generally not made available? It would be really helpful if the Minister was able to tease that out.

Certainly, in my view, there are services such as TOR—the Onion Router—where there is no entity that you can ask to block stuff, so if someone was using that, there is nothing that you can reasonably do. At the other end of the spectrum, there are services such as BT and TalkTalk, where it is relatively straightforward to say to them that they should block. Then there are people in between, such as browser owners that are putting in place these encrypted tunnels for very good reasons, for privacy, but which can also add value-added stuff—helping to manage bandwidth better, and so on. Is it the Government’s intention that they are going to be served with access restriction orders? That is a valid question. We might have different views about what is the right solution, but it is really important for the sector that it understands and is able to prepare if that is the Government’s intention. So we need to tease that out; that is the area in which we are looking for answers from the Government.

The second piece is to think about the long term. If our prediction—or our hope and expectation—is that most companies will come into line, that is fine; the internet will carry on as it does today but in a safer way. However, if we have misjudged the mood, and a significant number of services just stick their thumb up at Ofcom and say, “We are not going to play—block us if you dare”, that potentially has significant consequences for the internet as it will operate in the United Kingdom. It would be helpful to understand from the Minister whether the Government have any projections or predictions as to which way we are going to go. Are we talking about the vast majority of the internet continuing as it is today within the new regime, with the odd player that will be outside that, or is it the Government’s expectation that there may need to be blocking of significant numbers of services, essentially for the foreseeable future?

Other countries, such as France and Germany, have been dealing with this recently, as the noble Lord, Lord Bethell, is probably aware. They have sought to restrict access to pornography services, and there have been all sorts of consequent knock-on effects and challenges at a technical level. It would be helpful to understand whether our expectation is that we will see the same in the United Kingdom or that something else is going to happen. If the Government do not have that information today, or if they have not made those projections, it would be helpful to know their thinking on where that might happen. Who will be able to inform us as to what the future landscape is likely to look like as it evolves, as Ofcom gains these powers and starts to instruct companies that they must obtain licences, and then seeks to take enforcement action against those that choose not to play the game?

Lord Curry of Kirkharle (CB)

My Lords, I support Amendment 217 in the name of the noble Lord, Lord Bethell, and very much support the comments that he has made. I will speak to Amendments 218C, 218E, 218H and 218K in my name within this group. I also support the intent of the other amendments in this group tabled by the noble Lord, Lord Bethell.

I appreciate the process helpfully outlined by the noble Lord, Lord Allan. However, when looking at Ofcom’s implementation of existing provisions on video-sharing platforms, the overwhelming impression is of a very drawn-out process, with Ofcom failing to hold providers to account. Despite being told by Ofcom that a simple tick-box declaration by the user confirming that they are over 18 is not sufficient age verification, some providers are still using only that system. Concerningly, Ofcom has not taken decisive action.

When children are at severe risk, it is not appropriate to wait. Why, for example, should we allow porn sites to continue to host 10 million child sexual abuse videos while Ofcom simply reports that it is continuing to partner with these platforms to get a road map of action together? As has been mentioned by the noble Lord, Lord Bethell, Visa and Mastercard did not think it was appropriate to wait in such circumstances—they just acted.

Similarly, when systems are not in place to protect children from accessing pornography, we cannot just sit by and allow all the egregious associated harms to continue. Just as in Formula 1, when a red flag is raised and the cars must stop and go into the pits until the dangerous debris is cleared, sometimes it is too dangerous to allow platforms to operate until the problems are fixed. It seems to me that platforms would act very swiftly to put effective systems and processes in place if they could not operate in the interim.

The Bill already contains this emergency handbrake; the question is when it should be used. My answer is that it should be used when the evidence of severe harm presents itself, and not only when the regulator has a moment of self-doubt that its “road maps”, which it is normally so optimistic about, will eventually fix the problem. Ofcom should not be allowed to sit on the evidence hoping, on a wing and a prayer, that things will fix themselves in the end.

--- Later in debate ---
Introducing mandatory requirements would undermine Ofcom’s independence and discretion to manage enforcement on a case-by-case basis. This would also frustrate Ofcom’s ability to regulate in a proportionate way and could make its enforcement processes unnecessarily punitive or inflexible. It could also overwhelm the courts if Ofcom is strictly forced to apply for business disruption measures where any grounds apply, even where the breach may be minor. Instead, Ofcom will act proportionately in performing its regulatory functions, targeting action where it is needed and adjusting timeframes as necessary. I am mindful that on the final day in Committee, the noble Lord, Lord Grade of Yarmouth, continues to be in his place, following the Committee’s deliberations very closely on behalf of the regulator.
Lord Allan of Hallam (LD)

I am reminded by my noble friend Lord Foster of Bath, particularly relating to the gambling sector, that some of these issues may run across various regulators that are all seeking business disruption. He reminded me that if you type into a search engine, which would be regulated and subject to business disruption measures here, “Casinos not regulated by GAMSTOP”, you will get a bunch of people who are evading GAMSTOP’s regulation. Noble Lords can imagine similar for financial services—something that I know the noble Baroness, Lady Morgan of Cotes, is also very interested in. It may not be for answer now, but I would be interested to understand what thinking the Government have on how all the different business disruption regimes—financial, gambling, Ofcom-regulated search services, et cetera—will all mesh together. They could all come before the courts under slightly different legal regimes.

Lord Parkinson of Whitley Bay (Con)

When I saw the noble Lord, Lord Foster of Bath, and the noble Baroness, Lady Armstrong of Hill Top, in their places, I wondered whether they were intending to raise these points. I will certainly take on board what the noble Lord says and, if there is further information I can furnish your Lordships with, I certainly will.

The noble Baroness, Lady Kidron, asked whether the powers can be used on out-of-scope services. “No” is the direct answer to her direct question. The powers can be used only in relation to regulated services, but if sites not regulated by the Bill are publishing illegal content, existing law enforcement powers—such as those frequently deployed in cases of copyright infringement—can be used. I could set out a bit more in writing if that would be helpful.

My noble friend Lord Bethell’s amendments seek to set out in the Bill that Ofcom will be able to make a single application to the courts for an order enabling business disruption measures that apply against multiple platforms and operators. I must repeat, as he anticipated, the point made by my right honourable friend Chris Philp that the civil procedure rules allow for a multi-party claim to be made. These rules permit any number of claimants or defendants and any number of claims to be covered by one claim form. The overriding objective of the civil procedure rules is that cases are dealt with justly and proportionately. I want to reassure my noble friend that the Government are confident that the civil procedure rules will provide the necessary flexibility to ensure that services can be blocked or restricted.

The amendment in the name of the noble Lord, Lord Allan of Hallam, seeks to clarify what services might be subject to access restriction orders by removing the two examples provided in the Bill: internet access services and application stores. I would like to reassure him that these are simply indicative examples, highlighting two kinds of service on which access restriction requirements may be imposed. It is not an exhaustive list. Orders could be imposed on any service that meets the definition—that is, on a person who provides a facility and is able to withdraw, adapt or manipulate that facility in such a way as to impede access to the regulated service in question. This provides Ofcom with the flexibility to identify where business disruption measures should be targeted, and it future-proofs the Bill by ensuring that the power remains functional and effective as technologies develop.

As the noble Lord highlighted, these are significant powers that can require that services be blocked in the UK. Clearly, limiting access to services in this way substantially affects the business interests of the service in question and the interests of the relevant third-party service, and it could affect users’ freedom of expression. It is therefore essential that appropriate safeguards are included and that due process is followed. That is why Ofcom will be required to seek a court order to be able to use these powers, ensuring that the courts have proper oversight.

To ensure that due process is upheld, an application by the regulator for a court order will have to specify the non-compliant provider, the grounds of the order and the steps that Ofcom considers should be imposed on the third parties in order to withdraw services and block users’ access. These requirements will ensure that the need to act quickly to tackle harm is appropriately balanced against upholding fundamental rights.

It might be useful to say a little about how blocking works—

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

Yes; he made a helpful point, and I will come back on it.

Lord Allan of Hallam (LD)

We share a common interest in understanding whether it would be used against VPNs, but we may not necessarily have the same view about whether it should be. Do not take that as an encouragement—take it as a request for information.

Lord Allan of Hallam (LD)

My Lords, on behalf of my noble friend Lord Clement-Jones, I will speak in support of Amendments 195, 239, 263 and 286, to which he added his name. He wants me to thank the Carnegie Trust and the Institution of Engineering and Technology, which have been very helpful in flagging relevant issues for the debate.

Some of the issues in this group of amendments will range much more widely than simply the content we have before us in the Online Safety Bill. The right reverend Prelate the Bishop of Chelmsford is right to flag the question of a risk assessment. People are flagging to us known risks. Once we have a known risk, it is incumbent on us to challenge the Minister to see whether the Government are thinking about those risks, regardless of whether the answer is something in the Online Safety Bill or that there needs to be amendments to wider criminal law and other pieces of legislation to deal with it.

Some of these issues have been dealt with for a long time. If you go back and look at the Guardian for 9 May 2007, you will see the headline, “Second Life in virtual child sex scandal”.

That case, reported in Germany, concerned child role-playing in Second Life and is very similar to the kind of scenarios described by various noble Lords in this debate. If Second Life was the dog that barked but did not bite, we are in quite a different scenario today, not least because of the dramatic expansion in broadband technology, for which we can thank the noble Baroness, Lady Harding, in her previous role. Pretty much everybody in this country now has incredible access, at huge scale, to high-speed broadband, which allows those kinds of real-life, metaverse-type environments to be available to far more people than was possible with Second Life, which tended to be confined to a smaller group.

The amendments raise three significant groups of questions: first, on scope, and whether the scope of the Online Safety Bill will stretch to what we need; secondly, on behaviour, including the kinds of new behaviours, which we have heard described, that could arise as these technologies develop; and, finally, on agency, which speaks to some of the questions raised by the noble Baroness, Lady Fox, on AIs, including the novel questions about who is responsible when something happens through the medium of artificial intelligence.

On scope, the key question is whether the definition of “user-to-user”, which is at the heart of the Bill, covers everything that we would like to see covered by the Bill. Like the noble Baroness, Lady Harding, I look forward to the Minister’s response; I am sure that he has very strongly prepared arguments on that. We should take a moment to give credit to the Bill’s drafters for coming up with these definitions for user-to-user behaviours, rather than using phrases such as, “We are regulating social media or specific technology”. It is worth giving credit, because a lot of thought has gone into this, over many years, with organisations such as the Carnegie Trust. Our starting point is a better starting point than many other legislative frameworks which list a set of types of services; we at least have something about user-to-user behaviours that we can work with. Having said that, it is important that we stress-test that definition. That is what we are doing today: we are stress-testing, with the Minister, whether the definition of “user-to-user” will still apply in some of the novel environments.

It certainly seems likely—and I am sure that the Minister will say this—that a lot of metaverse activity would be in scope. But we need detailed responses from the Minister to explain why the kinds of scenario that have been described—if he believes that this is the case; I expect him to say so—would mean that Ofcom would be able to demand things of a metaverse provider under the framework of the user-to-user requirements. Those are things we all want to see, including the risk assessments, the requirement to keep people away from illegal content, and any other measures that Ofcom deems necessary to mitigate the risks on those platforms.

It will certainly be useful for the Minister to clarify one particular area. Again, we are fortunate in the UK that pseudo-images of child sexual abuse are illegal and have been illegal for a long time. That is not the case in every country around the world, and the noble Lord, Lord Russell, is quite right to say that this is an area where we need international co-operation. Having dealt with this on the platforms, I know that some countries have actively chosen not to criminalise pseudo-images; others just have not considered it.

In the UK, we were ahead of the game in saying, “If it looks like a photo of child abuse, we don’t care whether you created it on Photoshop, or whatever—it is illegal”. I hope that the Minister can confirm that avatars in metaverse-type environments would fall under that definition. My understanding is that the legislation refers to photographs and videos. I would interpret an avatar or activity in a metaverse as a photo or video, and I hope that is what the Government’s legal officers are doing.

Again, it is important in the context of this debate and the exchange that we have just had between the noble Baronesses, Lady Harding and Lady Fox, that people out there understand that they do not get away with it. If you are in the UK and you create a child sexual abuse image, you can be taken to court and go to prison. People should not think that, if they do it in the metaverse, it is okay—it is not okay, and it is really important that that message gets out there.

This brings us to the second area of behaviours. Again, some of the behaviours that we see online will be extensions of existing harms, but some will be novel, based on technical capabilities. Some of them we should just call by their common or garden term, which is sexual harassment. I was struck by the comments of the noble Baroness, Lady Berridge, on this. If people go online and start approaching other people in sexual terms, that is sexual harassment. It does not matter whether it is happening in a physical office, on public transport, on traditional social media or in the metaverse—sexual harassment is wrong and, particularly when directed at minors, a really serious offence. Again, I hope that all the platforms recognise that and take steps to prevent sexual harassment on their platforms.

That is quite a lot of the activity that people are concerned about, but other activities are much more complex and may require updates to legislation. Those are particularly activities such as role-playing online, where people play roles and carry out acts that would be illegal if done in the real world. That is particularly difficult when it is done between consenting adults, who choose to carry out a role-playing activity that replicates something that would be illegal were it to take place in the real world. That is hard—and those with long memories may remember a group of cases around Operation Spanner in the 1990s, in which a group of men was prosecuted for consensual sadomasochistic behaviour. The case went backwards and forwards, but it spoke to something that the noble Baroness, Lady Fox, may be sympathetic to—the point at which the state should intervene on sexual activities that many people find abhorrent but which take place between consenting adults.

In the context of the metaverse, I see those questions coming front and centre again. There are all sorts of things that people could role-play in the metaverse, and we will need to take a decision on whether the current legislation is adequate or needs to be extended to cater for the fact that it now becomes a common activity. Also important is the nature of it. The fact that it is so realistic changes the nature of an activity; you get a gut feeling about it. The role-playing could happen today outside the metaverse, but once you move it in there, something changes. Particularly when children are involved, it becomes something that should be a priority for legislators—and it needs to be informed by what actually happens. A lot of what the amendments seek to do is to make sure that Ofcom collects the information that we need to understand how serious these problems are becoming and whether they are, again, something that is marginal or something that is becoming mainstream and leading to more harm.

The third and final question that I wanted to cover is the hardest one—the one around agency. That brings us to thinking about artificial intelligence. When we try to assign responsibility for inappropriate or illegal behaviour, we are normally looking for a controlling mind. In many cases, that will hold true online as well. I know that the noble Lord, Lord Knight of Weymouth, is looking at bots—and with a classic bot, you have a controlling mind. When the bots were distributing information in the US election on behalf of Russia, that was happening on behalf of individuals in Russia who had created those bots and sent them out there. We still had a controlling mind, in that instance, and a controlling mind can be prosecuted. We have that in many instances, and we can expect platforms to control them and expect to go after the individuals who created the bots in the same way that we would go after things that they do as a first party. There is a lot of experience in the fields of spam and misinformation, where “bashing the bots” is the daily bread and butter of a lot of online platforms. They have to do it just to keep their platforms safe.

We can also foresee a scenario with artificial intelligence whereby it is less obvious that there is a controlling mind or who the controlling mind should be. I can imagine a situation whereby an artificial intelligence has created illegal content, whether that is child sexual abuse material or something else in the schedule of illegal content in the Bill, without the user having expected it to happen or the developer having believed or contemplated that it could happen. Let us say that the artificial intelligence goes off and creates something illegal, and that the user can show the question they asked of it while the developer can show how they coded it, demonstrating that neither of them intended for that thing to happen. In that scenario, the artificial intelligence has, in effect, its own agency. The artificial intelligence cannot be fined or sent to prison. There are some things that we can do: we can try to retrain it, or we can kill it. There is always a kill switch; we should never forget that with artificial intelligence. Sam Altman at OpenAI can turn off ChatGPT if it is behaving in an illegal way.

There are some really important questions around that issue. There is the liability for the specific instance of the illegality happening. Who do we hold liable? Even if everyone says that it was not their intention, is there someone whom we can hold liable? What should the threshold be at which we can execute that death sentence on the AI? If an AI is being used by millions of people and on a small number of occasions it does something illegal, is that sufficient? At what point do we say that the AI is rogue and that, effectively, it needs to be taken out of operation? Those are much wider questions than we are dealing with immediately in the Bill, but I hope that the Minister can at least point to what the Government are thinking about these kinds of legal questions, as we move from a world of user-to-user engagement to user-to-user-to-machine engagement, when that machine is no longer a creature of the user.

Baroness Berridge (Con)

I have had time just to double-check the offences. The problem that exists—and it would be helpful if my noble friend the Minister could confirm this—is that the criminal law is defined in terms of the person. It is not automatic that sexual harassment, particularly if you do not have a haptic suit on, would actually fall within the criminal law, as far as I understand it, which is why I am asking the Minister to clarify. That was the point that I was making. Harassment per se also needs a course of conduct, so a single touch of your avatar of a sexual nature would clearly fall outside the criminal law. That is the point of clarification that we might need on how the criminal law is framed at the moment.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

I am grateful to the noble Baroness. That is very helpful.

Baroness Harding of Winscombe (Con)

That is exactly the same issue as with child sexual abuse images—it is about the way in which criminal law is written. Not surprisingly, it is not up to date with the evolution of technology.

Lord Allan of Hallam (LD)

I am grateful for that intervention as well. That summarises the core questions that we have for the Minister. Of the three areas we have for him, the first is the question of scope and the extent to which he can assure us that the Bill as drafted will be robust in covering the metaverse and bots, which are the issues that have been raised today. The second, on behaviours, relates to the two interventions that we have just had. We have been asking whether the behaviours that are criminal today will remain criminal when new, similar forms of behaviour take place in new environments—let us put it that way. The behaviour, the intent and the harm are the same, but the environment is different. We want to understand the extent to which the Government are thinking about that, where that thinking is happening and how confident they are that they can deal with it.

Finally, on the question of agency, how do the Government expect to deal with the fact that we will have machines operating in a user-to-user environment when the connection between the machine and another individual user is qualitatively different from anything that we have seen before? Those are just some small questions for the Minister on this Thursday afternoon.

Lord Knight of Weymouth (Lab)

My Lords, the debate on this group has been a little longer, deeper and more important than I had anticipated. It requires all of us to reflect before Report on some of the implications of the things we have been talking about. It was introduced masterfully by the noble Baroness, Lady Harding, and her comments—and those from the noble Baronesses, Lady Finlay and Lady Berridge—were difficult to listen to at times. I also congratulate the Government Whip on the way he handled the situation so that innocent ears were not subject to some of that difficult listening. But the questions around the implications of virtual reality, augmented reality and haptic technology are really important, and I hope the Minister will agree to meet with the noble Baroness, Lady Berridge, and the people she referenced to reflect on some of that.

--- Later in debate ---
Clause 159 requires the Secretary of State to undertake a review into the operation of the regulatory framework between two and five years after the provisions come into effect. This review will consider any new emerging trends or technologies, such as AI, which could have the potential to compromise the efficacy of the Bill in achieving its objectives. I am happy to assure the noble Viscount, Lord Colville of Culross, and the right reverend Prelate the Bishop of Chelmsford that the review will cover all content and activity being regulated by the Bill, including legal content that is harmful to children and content covered by user-empowerment tools. The Secretary of State must consult Ofcom when she carries out this review.
Lord Allan of Hallam (LD)

Will the review also cover an understanding of what has been happening in criminal cases where, in some of the examples that have been described, people have tried to take online activity to court? We will at that point understand whether the judges believe that existing offences cover some of these novel forms of activity. I hope the review will extend not just to what Ofcom does as a regulator but also to what the courts are doing in terms of the definitions of criminal activity and whether those definitions are proving effective in the new online spaces.

Lord Parkinson of Whitley Bay (Con)

I believe it will. Certainly, both the Government and Parliament will take into account judgments in the courts on this Bill and in related areas of law and will, I am sure, want to respond.

Lord Knight of Weymouth (Lab)

My Lords, as we have said many times, this is a complex Bill. As we reflect on the priorities for Report, we can be more relaxed about some of the specifics on how Ofcom may operate, thereby giving it more flexibility—the flexibility it needs to be agile in the online world—if we as a Parliament trust Ofcom. Building trust, I believe, is a triangulation. First, there is independence from government—as discussed in respect of Secretary of State powers. Secondly, we need proper scrutiny by Parliament. Earlier today I talked about my desire for there to be proper post-legislative scrutiny and a permanent Joint Committee to do that. The third leg of the stool is the transparency to assist that scrutiny.

Clause 68 contains the provisions which would require category 1, 2A and 2B services to produce an annual transparency report containing information described by Ofcom in a notice given to the service. Under these provisions, Ofcom would be able to require these services to report on, among other things: information about the incidence of illegal content and content that is harmful to children; how many users are assumed to have encountered this content by means of the service; the steps and processes for users to report this content; and the steps and processes which a provider uses for dealing with this content.
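As a purely illustrative sketch, the categories just listed could be captured in a simple record like the one below. The Bill does not prescribe any format; every field name and figure here is invented:

```python
# Illustrative only: the Bill does not prescribe a transparency report
# format. These fields mirror the categories Clause 68 contemplates,
# as listed above; all names and numbers are invented.
from dataclasses import dataclass, field

@dataclass
class TransparencyReport:
    service_name: str
    period: str                          # e.g. "2024-Q1"
    illegal_content_incidents: int       # incidence of illegal content
    harmful_to_children_incidents: int   # content harmful to children
    users_encountering: int              # users assumed to have encountered it
    reporting_steps: list = field(default_factory=list)  # how users report it
    handling_steps: list = field(default_factory=list)   # how the provider responds

report = TransparencyReport(
    service_name="ExampleNet",
    period="2024-Q1",
    illegal_content_incidents=1204,
    harmful_to_children_incidents=312,
    users_encountering=45_000,
    reporting_steps=["in-app report button", "web reporting form"],
    handling_steps=["triage within 24 hours", "escalation to moderators"],
)
print(report.service_name, report.illegal_content_incidents)
```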

We welcome the introduction of transparency reporting in relation to illegal content and content that is harmful to children. We agree with the Government that effective transparency reporting plays a crucial role in building Ofcom’s understanding of online harms and empowering users to make a more informed choice about the services they use.

However, despite the inclusion of transparency reporting in the Bill representing a step in the right direction, we consider that these requirements could and should be strengthened to do the trust building we think is important. First, the Bill should make clear that, subject to appropriate redactions, companies will be required to make their transparency reports publicly available—to make them transparent—hence Amendment 160A.

Although it is not clear from the Bill whether companies will be required to make these reports publicly available, we consider that, in most instances, such a requirement would be appropriate. As noted, one of the stated purposes of transparency reporting is that it would enable service users to make more informed choices about their own and their children’s internet use—but they can only do so if the reports are published. Moreover, in so far as transparency reporting would facilitate public accountability, it could also act as a powerful incentive for service providers to do more to protect their users.

We also recognise that requiring companies to publish the incidences of CSEA content on their platforms, for instance, may have the effect of encouraging individuals seeking such material towards platforms on which there are high incidences of that content—that must be avoided. I recognise that simply having a high incidence of CSEA content on a platform does not necessarily mean that that platform is problematic; it could just mean that it is better at reporting it. So, as ever with the Bill, there is a balance to be struck.

Therefore, we consider that the Bill should make it explicit that, once provided to Ofcom, transparency reports are to be made publicly available, subject to redactions. To support this, Ofcom should be required to produce guidance on the publication of transparency reports and the redactions that companies should make before making reports publicly accessible. Ofcom should also retain the power to stop a company from publishing a particular transparency report if it considers that the risk of directing individuals to illegal materials outweighs the benefit of making a report public—hence Amendments 160B and 181A.

Amendments 165 and 229 are in my noble friend Lord Stevenson’s name. Amendment 165 would broaden the transparency requirements around user-to-user services’ terms of service, ensuring that information can be sought on the scope of these terms, not just their application. As I understand it, scope is important to understand, as it is significant in informing Ofcom’s regulatory approach. We are trying to guard against minimal terms of service where detail is needed for users and Ofcom.

The proposed clause in Amendment 229 probes how Ofcom will review the effectiveness of the transparency requirements in the Bill. It would require Ofcom to undertake a review of the effectiveness of transparency reports within three years and every five years thereafter, and it would give the Secretary of State powers to implement any recommendations made by the regulator. The Committee should note that we also include a requirement that a Select Committee, charged by the relevant House, must consider and report on the regulations, with an opportunity for Parliament to debate them. So we link the three corners of the triangle rather neatly there.

If we agree that transparency is an important part of building trust in Ofcom in doing this difficult and innovative regulatory job—it is always good to see the noble Lord, Lord Grade, in his place; I know he is looking forward to getting on with this—then this proposed clause is sensible. I beg to move.

Lord Allan of Hallam (LD)

My Lords, I am pleased that the noble Lord, Lord Knight of Weymouth, has given us an opportunity to talk about transparency reports with these amendments, which are potentially a helpful addition to the Bill. Transparency is one of the huge benefits that the legislation may bring. One of the concerns that the public have and that politicians have always had with online platforms is that they appear to be a black box—you cannot see what is going on in them.

In the entire edifice that we are constructing in the Online Safety Bill, there are huge opportunities to change that. The platforms will have to do risk assessments —there are measures in the Bill to make sure that information about these is put out—and they will have to take active steps to mitigate any risks they find. Again, we may get directions and guidance from Ofcom that will explain to the public exactly what is expected of them. The final piece of the jigsaw is the transparency reports that show the outcomes—how a platform has performed and what it has done to meet its obligations in dealing with content and behaviour on its services.

For the record, I previously worked for one of the platforms, and I would have said that I was on the pro-transparency wing of the transparency party inside the company. I believed that it was in the platform’s interest: if you do not tell people what you are doing, they will make things up about you, and what they make up will generally be worse than what you are actually doing. So there are huge advantages to the platforms from being transparent.

The noble Lord, Lord Knight, has picked up on some important points in his Amendment 160B, which talks about making sure that the transparency report is not counterproductive by giving the bad guys information that they could use to ill effect. That is a valid point; it is often debated inside the platforms. Sometimes, I argued furiously with my colleagues in the platforms about why we should disclose information. They would ask, “What about the bad guys?” Sometimes I challenged that, but other times it would have been a genuine and accurate concern. The noble Lord mentioned things such as child sexual abuse material, and we have to recognise that the bad guys are incredibly devious and creative, and if you show them anything that they can use against you to get around your systems, they will try to do that. That is a genuine and valid concern.

The sort of thing that you might put into a transparency report is, for example, whether you have banned particular organisations. I would be in favour of indicating to the public that an organisation is banned, but you can see that the potential impact of that is that all the people you are concerned about would create another organisation with a different name and then get back on to your platform. We need to be alive to those kinds of concerns.

It is also relevant to Amendment 165 and the terms of service that the more granular and detailed your terms of service are, the better they are for public information, but there are opportunities to get around them. Again, we would have that argument internally. I would say, “If we are prohibiting specific hate speech terms, tell people that, and then they won’t use them”. For me, that would be a success, as they are not using those hate speech terms anymore, but, of course, they may then find alternative hate speech terms that they can use instead. You are facing that battle all the time. That is a genuine concern that I hope we will be able to debate. I hope that Ofcom will be able to mitigate that risk by discussing with platforms what these transparency reports should look like. In a sense, we are doing a risk assessment of the transparency report process.

Amendment 229 on effectiveness is really interesting. My experience was that if you did not have a transparency report, you were under huge pressure to produce one and that once you produced one, nobody was interested. For fear of embarrassing anyone in the Committee, I would be curious to know how many noble Lords participating in this debate have read the transparency reports already produced by Meta Platforms, Google and others. If they have not read them, they should not be embarrassed, because my experience was that I would talk to regulators and politicians about something they had asked me to come in to talk about, such as hate speech or child sexual abuse material, and I learned to print off the transparency report. I would go in and say, “Well, you know what we are doing; it’s in our transparency report”. They would ask, “What transparency report?”, and I would have to show them. So, having produced a transparency report, every time we published it, we would expect there to be public interest, but little use was made of it. That is not a reason not to do them—as I said, I am very much in favour of doing them—but, on their own, they may not be effective, and Amendment 229 touches on that.

I was trying to think of a collective noun for transparency reports and, seeing as they shed light, I think it may be a “chandelier”. Where we may get the real benefit is if Ofcom can produce a chandelier of transparency reports, taking all the information it gets from the different platforms, processing it and selecting the most relevant information—the reports are often too long for people to work their way through—so that it can enable comparisons. That is really good, and it is quite good for the industry that people know that platform A did this, platform B did that, and platform C did something else. They will take note of that, compare with each other and want to get into the best category. It is also critical that Ofcom puts this into user-friendly language, and Ofcom has quite a good record of producing intelligible reports. In the context of Amendment 229, a review process is good. One of the things that might come out of that, thinking ahead, would be Ofcom’s role in producing meta-transparency reports: the chandelier that will shed light on what the whole sector is doing.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- View Speech - Hansard - - - Excerpts

My Lords, for once I want to be really positive. I am actually very positive about this whole group of amendments because more transparency is essential in what we are discussing. I especially like Amendment 165 from the noble Lord, Lord Stevenson of Balmacara, because it concerns terms of service for user-to-user services and ensures that information can be sought on their scope as well as their application. This is important because so much in this Bill rests on user-to-user services and their terms of service. You need to know what is going on.

I want particularly to compliment Amendment 229 that says that transparency reports should be

“of sufficient quality to enable service users and researchers to make informed judgements”,

et cetera. That is a very elegant way of saying that they should not be gobbledegook. If we are going to have them, they should be clear and of a quality that we can read. Obviously, we do not want them to be unreadable, full of jargon and legalistic language. I am hoping that that is the requirement.

--- Later in debate ---
Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

My Lords, I have Amendments 185A and 268AA in this group. They are on different subjects, but I will deal with them in the same contribution.

Amendment 185A is a new clause that would introduce duties on online marketplaces to limit child access to listings of knives and take proactive steps to identify and remove any listings of knives or products such as ornamental zombie knives that are suggestive of acts of violence or self-harm. I am sure the Minister will be familiar with the Ronan Kanda case that has given rise to our bringing this amendment forward. The case is particularly horrible; as I understand it, sentencing is still outstanding. Two young boys bought ninja blades and machetes online and ultimately killed another younger boy with them. It has been widely featured in news outlets and is particularly distressing. We have had some debate on this in another place.

As I understand it, the Government have announced a consultation on this, among other things, looking at banning the sale of machetes and knives that appear to have no practical use other than being designed to look menacing or suitable for combat. We support the consultation and the steps set out in it, but the amendment provides a chance to probe the extent to which this Bill will apply to the dark web, where a lot of these products are available for purchase. The explanatory statement contains a reference to this, so I hope the Minister is briefed on the point. It would be very helpful to know exactly what the Government’s intention is on this, because we clearly need to look at the sites and try to regulate them much better than they are currently regulated online. I am especially concerned about the dark web.

The second amendment relates to racist abuse; I have brought the subject before the House before, but this is rather different. It is a bit of a carbon copy of Amendment 271, which noble Lords have already debated. It is there for probing purposes, designed to tease out exactly how the Government see public figures, particularly sports stars such as Marcus Rashford and Bukayo Saka, and how they think they are supposed to deal with the torrents of racist abuse that they receive. I know that there have been convictions for racist content online, but most of the abuse goes unpunished. It is not 100% clear that much of it will be identified and removed under the priority offence provisions. For instance, does posting banana emojis in response to a black footballer’s Instagram post constitute an offence, or is it just a horrible thing that people do? We need to understand better how the law will act in this field.

There has been a lot of debate about this issue; it is a very sensitive matter and we need to get to the bottom of it. A year and a half ago, the Government responded to my amendment by bringing online racist abuse into the scope of what is dealt with as an offence, which we very much welcomed, but we need to understand better how these provisions will work. I look forward to the Minister setting that out in his response. I beg to move.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I rise to speak primarily to the amendments in the name of my noble friend Lord Clement-Jones, but I will also touch on Amendment 268AA at the same time. The amendments that I am particularly interested in are Amendments 200 and 201 on regulatory co-operation. I strongly support the need for this, and I will illustrate why it is essential with some concrete examples, to bring to life the kinds of challenges that need to be dealt with.

The first example relates to trying to deal with the sexual grooming of children online, where platforms are able to develop techniques to detect it. They can do that by analysing the behaviour of users and trying to detect whether older users are consistently trying to approach younger users, and by looking at the content of the messages they may be sending to them, where that is visible. These are clearly highly intrusive techniques. If a platform is subject to the General Data Protection Regulation, or the UK version of that, it needs to be very mindful of privacy rights. We clearly have, there, several potentially interested bodies in the UK environment: we have the child protection agencies; we will have, in future, Ofcom seeking to ensure that the platform has met its duty of care; and we will have the Information Commissioner’s Office.

A platform, in a sense, can be neutral as to what it is instructed to do by the regulator. Certainly, my experience was that the platforms wanted to do those kinds of activities, but they are neutral in the sense that they will do what they are told is legal. There, you need clarity from the regulators together to say, “Yes, we have looked at this; you are not going to do something on the instruction of the child safety agency and then get criticised, and potentially fined, by the data protection authority for doing the thing you have been instructed to do”—so we need those agencies to work together.

The second example is in the area of co-operation around antiterrorism, another key issue. The platforms have created something called the Global Internet Forum to Counter Terrorism. Within that forum, they share tools and techniques—things such as databases of information about terrorist content and systems that you can use to detect it—and you are encouraged within that forum to share those tools and techniques with smaller platforms and competitors. Clearly, again, there is a very significant set of competition questions, and if you are in a discussion around that, the lawyers will say, “Have the competition lawyers cleared this?” Something that is in the public interest—that all the platforms should be using similar kinds of technology to detect terrorist content—is therefore something where you need a view not just from the counterterrorism people but, in our case, from the Competition and Markets Authority. So, again, you need those regulators to work together.

The final example is one which I know is dear to the heart of the noble Baroness, Lady Morgan of Cotes: fraudsters, whom we have dealt with. There, you might have patterns of behaviour where information comes from the telecoms companies, regulated by Ofcom; the internet service providers, also regulated by Ofcom; and the financial institutions, regulated by their own family of regulators. They may want to share data with each other, which is again subject to the Information Commissioner’s Office. So, if we are going to give platforms instructions, which we rightly do in this legislation, and say, “Look, we want you to get tougher on online fraudsters; we want you to demonstrate a duty of care there”, the platforms will need those regulators—the financial regulators, Ofcom and the Information Commissioner’s Office—to sort those things out.

Having a forum such as the one proposed in Amendment 201, where these really difficult issues can be thrashed out and clear guidance can be given to online services, will be much more efficient than what sometimes happened in the past, where the left hand and the right hand of the regulatory world pulled you in different directions. I know that we have the Digital Regulation Cooperation Forum. If we can build on those institutions, that would be ideal; it is essential that they have their input before the guidance is issued, rather than having a platform comply with guidance from regulator A and then get dinged by regulator B for doing the thing that it has been instructed to do.

That leads to the very sensible amendment on skilled persons. Again, Ofcom is going to be able to call in skilled persons. In an area such as data protection, that might be a data protection lawyer but, equally, it might be that somebody who works at the Information Commissioner’s Office is actually best placed to give advice. Amendment 200—the first of the two, which talks about skilled persons being able to come from regulators—makes sense.

Finally, I will touch on the issues raised in Amendment 268AA—I listened carefully and understand that it is a probing amendment. It raises some quite fundamental questions of principle—I suspect that the noble Baroness, Lady Fox, might want to come in on these—and it has been dealt with in the context of Germany and its Network Enforcement Act; I know the noble Lord, Lord Parkinson of Whitley Bay, can say that in the original German. That Act went in the same direction, motivated by similar concerns around hate speech.

--- Later in debate ---
We have heard the term “rabbit hole”; there is a rabbit hole, where people intent on self-harm, or indeed those who suffer from eating disorders, go from larger platforms to smaller and niche ones, where they encounter the very content that feeds their addiction, or which fuels and enables their desire to self-harm. As I said in a previous grouping, this cannot be the intention of the Bill; I do not believe it is the intention of the Government, and I hope that the Minister will listen to the arguments that the noble Baroness, Lady Morgan of Cotes, set out so effectively.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I am a poor substitute for the noble Baroness, Lady Parminter, in terms of the substance of the issues covered by these amendments, but I am pleased that we have been able to hear from the noble Baroness, Lady Bull, on that. I will make a short contribution on the technology and the challenges of classification, because there are some important issues here that the amendments bring out.

We will be creating rules for categorising platforms. As I understand it, the rules will have a heavy emphasis on user numbers but will not be exclusively linked to them. It would be helpful if the Minister could tease out a little more how that will work. However, it is right even at this stage to consider the possibility that there will need to be exceptions to those rules, and to have a mechanism in place for that.

We need to recognise that services can grow very quickly these days, and some of the highest-risk moments may be those when services have high growth but still very little revenue and infrastructure in place to look after their users. This is a problem generally with stepped models, where you have these great jumps; in a sense, a sliding scale would be more rational, so that responsibilities increase over time, but clearly from a practical view it is hard to do that, so we are going to end up with some kind of step model.

We also need to recognise that, from a technical point of view, it is becoming cheaper and easier to build new user-to-user services all the time. That has been the trend for years, but it is certainly the case now. If someone wants to create a service, they can rent the infrastructure from a number of providers rather than buying it, they can use a lot of code that is freely available—they do not need to write as much code as they used to—and they can promote their new service using all the existing social networks, so you can go from zero to significant user numbers in very quick time, and that is getting quicker all the time. I am interested to hear how the Minister expects such services to be regulated.

The noble Baroness, Lady Morgan, referred to niche platforms. There will be some that have no intention of complying, even if we categorise them as a 2B service. The letter will arrive from Ofcom and go in the bin. They will have no interest whatever. Some of the worst services will be like that. The advantage of ensuring that we bring them into scope is that we can move through the enforcement process quickly and get to business disruption, blocking, or whatever we need to do to get them out of the UK market. Other niche services will be willing to come into line if they are told they are categorised as 2B but are given a reasonable set of requirements. Some of Ofcom’s most valuable work might be precisely to work with them: services that are borderline but recognise that they want a viable business, and they will not have a viable business by breaking the law. We need to get hold of them and bring them into the net to be able to work with them.

Finally, there is another group which is very mainstream but in the growing phase: busy growing and not worrying about regulation. For that category of company, we need to work with them as they grow, and the critical thing is to get to them early. I think the amendments would help Ofcom to get to them early—ideally, in partnership with other regulators, including the European Union, which is now regulating in a similar way under the Digital Services Act. If we can work with those companies as they come into 2B, then into category 1—in European speak, that is a VLOP, a very large online platform—and get them used to the idea that they will have VLOP and category 1 responsibilities before they get there, we can make a lot more progress. Then we can deliver what we are all trying to deliver: a safer internet for people in the UK.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- View Speech - Hansard - - - Excerpts

I shall speak very briefly at this hour, just to clarify as much as anything. It seems important to me that there is a distinction between small platforms and large platforms, but my view has never been that if you are small, you have no potential harms, any more than that if you are large, you are harmful. The exception should be the rule. We have to be careful of arbitrary categorisation of “small”. We have to decide who is going to be treated as though they are a large category 1 platform. I keep saying this, but I stress it again: do not assume that everybody agrees on what significant risk of harm or hateful content is. It is such highly disputed political territory outside the online world and this House that we must recognise that it is not so straightforward.

I am very sympathetic, by the way, to the speeches made about eating disorders and other issues. I see that very clearly, but other categories of speech are disputed and argued over—I have given loads of examples. We end up where it is assumed that the manifestos of mass shooters appear on these sites, but if you read any of those manifestos, they will often be quoting mainstream journalists in mainstream newspapers, the Bible and a whole range of things. The fact that they are on 4chan, or wherever, is not necessarily the problem; it is much more complicated.

I ask the Minister, and the proposers of the amendment, to some extent: would it not be straightforwardly the case that if there is a worry about a particular small platform, it might be treated differently—

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

I just want to react to the point about the manifestos of mass shooters. While source material such as the Bible is not in scope, I think the manifesto of a shooter is clear incitement to terrorism, and any platform that is comfortable carrying that is problematic in my view—and I hope it would be in the noble Baroness’s view as well.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

I was suggesting that we have a bigger problem than it appearing on a small site. It quotes from mainstream media, but it ends up being broadly disseminated and not because it is on a small site. I am not advocating that we all go round carrying the manifestos of mass shooters and legitimising them. I was more making the point that it can be complicated. Would not the solution be that you can make appeals that a small site is treated differently? That is the way we deal with harmful material in general and the way we have dealt with, for example, RT as press without compromising on press freedom. That is the kind of point I am trying to make.

I understand lots of concerns, but I do not want us to get into a situation where we destroy the potential of all smaller platforms—many of them doing huge amounts of social good, part of civil society and all the rest of it—by treating them as though they were large platforms. They just will not have the resources to survive; that is my only point.

--- Later in debate ---
Therefore, using big business money, in this instance, as a weapon to dictate editorial content shows that press freedom is on the line in a variety of ways. That women arguing for the protection of single-sex sport, and then being subjected to vile misogyny, are themselves described as using transphobic hate speech shows me, at least, that in the name of fighting hate we should not countenance any attempt to assault press freedom. I will oppose all three of these amendments.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I support Amendment 227 in particular. I am pleased to contribute, as someone who gave evidence to the Leveson inquiry, explaining why social media should not be in scope for any new press regulation scheme. It is entertaining for me now to come through the looking glass and listen to the noble Lords, Lord Black of Brentwood and Lord Faulks, in particular making the kinds of argument I made then, as we discuss whether the press should be in scope for a new social media regulatory scheme.

These amendments are a helpful way to test how the Government expect their decision to afford certain privileges to online activity by journalists and news publishers to work. That is what the regime does, in effect: certain bodies are privileged when using user-to-user services and search engines in ways that they would not be if they were not afforded that status. Again, it is noteworthy that there has often been criticism of social media precisely for giving special treatment to some users, including in stories in some of the press that we are talking about, and here we are creating not just a state sanction but a state-ordered two-tier system that all the social media companies will now have to adopt. That creates some interesting questions in itself.

I want to press the Minister primarily on definitions. It is certainly my experience that definitions of who is a journalist or a news media publisher are challenging and can be highly political. There have been several pressure points, pushing social media companies to try to define journalists and news publishers for themselves, outside of any regulatory scheme—notably following the disputes about misinformation and disinformation in the United States. The European Union also has a code of practice on misinformation and disinformation. Every time someone approaches this subject, they ask social media companies to try to distinguish journalists and news media from other publishers. So these efforts have been going on for some time, and many of them have run into disputes because there is no consistent agreement about who should be in or outside those regimes. This is one of those problems that seems clear and obvious when you stand back from it, but the more that you zoom in, the more complex and messy it becomes. We all say, “Oh yes, journalists and news publishers—that is fine”, and we write that in the legislation, but, in practice, it will be really hard when people have to make decisions about individuals.

Some news organisations are certainly highly problematic. Most terrorist organisations have news outlets and news agencies. They do not advertise themselves as such but, if you work at a social media platform, you have to learn to distinguish them. They are often presented entirely legitimately, and some of the information that you use to understand why they are problematic may be private, which creates all sorts of problems. Arguably, this is the Russia Today situation: it presented itself as legitimate and was registered with Ofcom for a period of time; we accepted that it was a legitimate news publisher, but we changed our view because we regard the Russian Government as, in some senses, a terrorist regime. That is happening all the time, with all sorts of bodies across the world that have created these news organisations. In the Middle East in particular, you have to be extraordinarily careful—you think that something is a news organisation but you then find that it has a Hezbollah connection and, there you go, you have to try to get rid of it. News organisations tied to extremist organisations are one problematic area, and my noble friend referred to it already.

There is also an issue with our domestic media environment. Certainly, most people would regard Gary Lineker as a journalist who works for a recognised news publisher—the BBC—but not everyone will agree with that definition. Equally, most people regard the gentleman who calls himself Tommy Robinson as not being a journalist; however much he protests that he is in front of judges and others, and however much support he has from recognised news publishers in the United States, most people would say that he is not a journalist. The community of people who agree that Gary Lineker is not a journalist and that of people who think that Tommy Robinson is not a journalist do not overlap much, but I make the point that there is continually this contention about individuals, and people have views about who should be in or out of any category that we create.

This is extraordinarily difficult, as in the Bill we are tasking online services with a very hard job. In a few lines of it, we say: “Create these special privileges for these people we call journalists and news publishers”. That is going to be really difficult for them to do in practice and they are going to make mistakes, either exclusionary or inclusionary. We are giving Ofcom an incredibly difficult role, which is why this debate is important, because it is going to have to adjudicate when that journalist or news publisher says to Ofcom: “I think this online platform is breaching the Online Safety Act because of the way it treated me”. Ofcom is going to have to take a view about whether that organisation or individual is legitimate. Given the individuals I named, you can bet your bottom dollar that someone is going to go to Ofcom and say, “I don’t think that Gary Lineker or the BBC are legitimate”. That one should be quite easy; others across the spectrum will be much more difficult for it to deal with.

That is the primary logic underlying Amendment 227: we have known unknowns. There will be unanticipated effects of this legislation and, until it is in place and those decisions are being made, we do not know how it will work. Frankly, we do not know whether, as a result of legal trickery and regulatory decisions, we have inadvertently created a loophole where some people will be able to go and win court cases by claiming protections that we did not intend them to have. I disagree with the noble Lord, Lord Black: I do not think Amendment 227 undermines press freedom in any sense at all. All it does is to say: “We have created an Online Safety Bill. We expect it to enhance people’s safety and within it we have some known unknowns. We do not know how this exemption is going to work. Why not ask Ofcom to see if any of those unintended consequences happen?”

I know that we are labouring our way through the Online Safety Bill version 1, so we do not want to think about an online safety Bill version 2, but there will at some point have to be a revision. It is entirely rational and sensible that, having put this meaningful exemption in there—it has been defended, so I am sure that the Government will not want to give it up—the least we can do is to take a long, hard look, without interfering with press freedom, and get Ofcom to ask, “Did we see those unintended consequences? Do we need to look at the definitions again?”

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - - - Excerpts

My Lords, the noble Lord, Lord Allan, has clearly and comprehensively painted a picture of the complex world in which we now live, and I do not think that anybody can disagree with that or deny it. We are in a world which is going to keep evolving; we have talked in lots of other contexts about the pace of change, and so on. However, in recognising all that, what the noble Lord has just described—the need for constant evaluation of whether this regime is working effectively—is a job for Parliament, not for Ofcom. That is where I come back to in starting my response to this group of amendments.

Briefly—in order that we can get to the wind-ups and conclude business for the day—ensuring that recognised news publishers and organisations are not subject to Ofcom or any form of state regulation is a vital principle. I am pleased that the Government have included the safeguards which they have in the legislation, while also making it much harder for the tech platforms to restrict the freedom of recognised news publishers and users’ access to them.

I reiterate that I understand that this is becoming increasingly complicated, but these are important principles. We have to start in the world that we currently understand and know, ensure that we protect those publications which we recognise as trusted news providers now, and not give way on those principles. As my noble friend Lord Black said regarding debates about Section 40 of the Crime and Courts Act, there will be an opportunity to re-evaluate that in due course when we come to the Media Bill. For what it is worth, my personal view is that I support the Government’s intention to remove it.

--- Later in debate ---
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

I think it is actually quite important that there is—to use the language of the Bill—a risk assessment around the notion that people might game it. I thought the noble Baroness, Lady Gohir, made a very good point. People are very inventive and, if you have ever engaged with the people who run some of those big US misinformation sites—let us just call them that—you will know that they have very inventive, very clever people. They will be looking at this legislation and if they figure out that by opening a UK office and ticking all the boxes they will now get some sorts of privileges in terms of distributing their misinformation around the world, they will do it. They will try it, so I certainly think it is worth there being at least some kind of risk assessment against that happening.

In two years’ time we will be able to see whether the bad thing happened but, whether or not it is the Minister having a conversation with Ofcom now, I just think that forewarned is forearmed. We know that this is a possibility, and it would be helpful for some work to be done now to make sure that we do not leave open a loophole that none of us wants.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

I am mindful of the examples the noble Lord gave in his speech. Looking at some of the provisions set out in subsection (2) about a body being

“subject to a standards code”

or having

“policies and procedures for handling and resolving complaints”,

I think, on first response, that the examples he gave would be covered. But I will certainly take on board the comments he made, and those of the noble Baroness, Lady Gohir, and reflect on them. I hope—

--- Later in debate ---
Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I rise very briefly to support the amendments in the name of the noble Baroness, Lady Stowell, and the noble Lord, Lord Stevenson. Like other speakers, I put on record my support for the regulator being offered independence and Parliament having a role.

However, I want to say one very brief and minor thing about timing—I feel somewhat embarrassed after the big vision of the noble Baroness, Lady Stowell. Having had quite a lot of experience of code making over the last three years, I have seen the amount of time that the department is able to take in responding to the regulator being used as a point of power, a point of lobbying, as others have said, and a point of huge distraction. Those of us who have followed the Bill for five years, and through as many Secretaries of State, should be concerned that none of the amendments has quite tackled the question of time.

The idea of acting within a timeframe is not without precedent; the National Security and Investment Act 2021 is just one recent example. What was interesting about that Act was that the reason given for the Secretary of State’s powers being necessary was national security—that is, a ground on which we all agree such powers are acceptable—but the reason for the time restriction was business stability. I put it to the Committee that the real prospect of children and other users being harmed requires the same consideration as business stability. Without a time limit, it is possible for inaction to be used to control, or simply to fritter time away.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I will make a short contribution on this substantive question of whether concerns about ministerial overreach are legitimate. Based on a decade of being on the receiving end of representations from Ministers, the short answer is yes. I want to expand on that with some examples.

My experience of working on the other side, inside a company, was that you often got what I call the cycle of outrage: something is shared on social media that upsets people; the media write a front-page story about it; government Ministers and other politicians get involved; that then feeds back into the media and the cycle spins up to a point where something must be done. The “something” is typically that the Minister summons people, such as me in my old job, and brings them into an office. That itself often becomes a major TV moment, where you are brought in, browbeaten and sent out again with your tail between your legs, and the Minister has instructed you to do something. That entire process takes place in the political rather than the regulatory domain.

I readily concede that, in many cases, something of substance needed to be addressed and there was a genuine problem. It is not that this was illegitimate, but these amendments are talking about the process for what we should do when that outrage is happening. I agree entirely with the tablers of the amendments that, to the extent that that process can be encapsulated within the regulator rather than a Minister acting on an ad hoc basis, it would be a significant improvement.

I also note that this is certainly not UK-specific; it would happen in many countries, with varying degrees of threat. I remember being summoned to the Ministry of the Interior in Italy to meet a gentleman who has now sadly passed away. He brought me into his office, sat me down, pointed to his desk and said, “You see that desk? That was Mussolini’s desk”. He was a nice guy and I left with a CD of his rhythm and blues band, but it was clear that I was not supposed to say no to him. He gave a very clear and explicit political direction about content that was on the platform.

One big advantage of this Bill is that it has the potential to move beyond that world. It could move the accountability model away from individual people in companies—the noble Baroness, Lady Stowell of Beeston, made this point very powerfully—away from platforms being entirely accountable to themselves, and away from platforms and others, including Ministers, somehow doing deals that have an impact, as the noble Baroness, Lady Fox, and the noble Viscount, Lord Colville, said, on the freedom of expression of people across the country. We do not want that.

We want to move on in the Bill and I think we have a model which could work. The regulator will take on the outrage and go as far as it can under the powers granted in the Bill. If the regulator believes that it has insufficient powers, it will come back to Parliament and ask for more. That is the way in which the system can and should work. I think I referred to this at Second Reading; we have an opportunity to create clear accountability. Parliament instructs Ofcom, which instructs the platforms. The platforms do what Ofcom says, or Ofcom can sanction them. If Ofcom feels that its powers are deficient, it comes back to Parliament. The noble Lord, Lord Stevenson, and others made the point about scrutiny and us continually testing whether Ofcom has the powers and is exercising them correctly. Again, that is entirely beneficial and the Government should certainly be minded to accept those amendments.

With the Secretary of State powers, as drafted in the Bill and without the amendments we are considering today, we are effectively taking two steps forward and one step back on transparency and accountability. We have to ask: why take that step back when we are able to rely on Ofcom to do the job without these directions?

The noble Baroness, Lady Stowell of Beeston, made the point very clearly that there are other ways of doing this. The Secretary of State can express their view. I am sure that the Minister will be arguing that the Secretary of State’s powers in the Bill are better than the status quo because at least what the Secretary of State says will be visible; it will not be a back-room deal. The noble Baroness, Lady Stowell of Beeston, has proposed a very good alternative, where the Secretary of State makes visible their intentions, but not in the form of an order—rather in the form of advice. The public—it is their speech we are talking about—then have the ability to see whether they agree with Ofcom, the companies or the Secretary of State if there is any dispute about what should happen.

It is certainly the case that visible instructions from the Secretary of State would be better, but the powers as they stand still leave room for arm-twisting. I can imagine a future scenario in which employees of these platforms are summoned to see the Secretary of State. But now the Secretary of State would have a draft order sitting there. The draft order is Mussolini’s desk. They say to the people from the platforms, “Look, you can do what I say, or I am going to send an order to Ofcom”. That takes us back to a world in which the public do not see the kind of instructions being given.

I hope that the Government will accept that some amendment is needed here. All the ones that have been proposed suggest different ways of achieving the same objective. We are trying to protect future Secretaries of State from an unhealthy temptation to intervene in ways that they should not.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - - - Excerpts

My Lords, on day eight of Committee, I feel that we have all found our role. Each of us has spoken in a similar vein on a number of amendments, so I will try to be brief. As the noble Lord, Lord Allan, has spoken from his experience, I will once again reference my experience as the chief executive, for seven years, of a business regulated by Ofcom; as the chair of a regulator; and as someone who sat on the court of, arguably, the most independent of independent regulators, the Bank of England, for eight years.

I speak in support of the amendments in the name of my noble friend Lady Stowell, because, as a member of the Communications and Digital Committee, my experience, both of being regulated and as a regulator, is that independent regulators might be independent in name—they might even be independent in statute—but they exist in the political soup. It is tempting to think that they are a sort of granite island, completely immovable in the political soup, but they are more like a boat bobbing along in the turbulence of politics.

As the noble Lord, Lord Allan, has just described, they are influenced both overtly and subtly by the regulated companies themselves—I am sure we have both played that game—by politicians on all sides, and by the Government. We have played these roles a number of times in the last eight days; however, this is one of the most important groups of amendments, if we are to send the Bill back in a shape that will really make the difference that we want it to. This group of amendments challenges whether we have the right assignment of responsibility between Parliament, the regulator, government, the regulated and citizens.

It is interesting that we—every speaker so far—are all united that the Bill, as it currently stands, does not get that right. To explain why I think that, I will dwell on Amendment 114 in the name of my noble friend Lady Stowell. The amendment would remove the Secretary of State’s ability to direct Ofcom to modify a draft of the code of practice “for reasons of public policy”. It leaves open the ability to direct in the cases of terrorism, child sexual abuse, national security or public safety, but it stops the Secretary of State directing with regard to public policy. The reason I think that is so important is that, while tech companies are not wicked and evil, they have singularly failed to put internet safety, particularly child internet safety, high enough up their pecking order compared with delivering for their customers and shareholders. I do not see how a Secretary of State will be any better at that.

Arguably, the pressures on a Secretary of State are much greater than the pressures on the chief executives of tech companies. Secretaries of State will feel those pressures from the tech companies and from their constituents lobbying them, and they will want to intervene and feel that they should. They will then push that bobbing boat of the independent regulator towards whichever shore they feel they need to in the moment—but that is not the way you protect people. That is not the way that we treat health and safety in the physical world. We do not say, “Well, maybe economics is more important than building a building that’s not going to fall down if we have a hurricane”. We say that we need to build safe buildings. Some 200 years ago, we were having the same debates about the physical world in this place; we were debating whether you needed to protect children working in factories, and what the consequences would be for the economy. How awful that sounds today. Yet that is the reality of what we are saying in the Bill now: that we are giving the Secretary of State the power to claim that the economic priority is greater than protecting children online.

I am starting to sound very emotional because at the heart of this is the suggestion that we are not taking the harms seriously enough. If we really think that we should be giving the Secretary of State the freedom to direct the regulator in such a broad way, we are diminishing the seriousness of the Bill. That is why I wholeheartedly welcome the remark from the noble Lord, Lord Stevenson, that he intends to bring this back with the full force of all of us across all sides of the Committee, if we do not hear some encouraging words from my noble friend the Minister.

--- Later in debate ---
Lord Farmer Portrait Lord Farmer (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I support the noble Baroness, Lady Benjamin, in bringing the need for consistent regulation of pornographic content to your Lordships’ attention and have added my name in support of Amendment 185. I also support Amendments 123A, 142, 161, 183, 184 and 306 in this group.

There should not be separate regimes for how pornographic content is regulated in this country. I remember discussions about this on Report of the Digital Economy Bill around six years ago. The argument for not making rules for the online world consistent with those for the offline world was that the CPS was no longer enforcing laws on offline use anyway. Then as now, this seems simply to be geared towards letting adults continue to have unrestricted access to an internet awash with pornographic material that depicts and/or promotes child sexual abuse, incest, trafficking, torture, and violent or otherwise harmful sexual acts: adult freedoms trumping all else, including the integrity of the legal process. In the offline world, this material is illegal or prohibited for very good reason.

The reason I am back here, arguing again for parity, is that, since 2017, an even deeper seam of academic research has developed which fatally undermines the case for untrammelled cyber-libertarianism. It has laid bare the far-reaching negative impacts that online pornography has had on individuals and relationships. One obvious area is the sharp rise in mental ill-health, especially among teenagers. Research from CEASE, the Centre to End All Sexual Exploitation, found that over 80% of the public would support new laws to limit free and easy access.

Before they get ensnared—and some patients of the Laurel Centre, a private pornography addiction clinic, watch up to 14 hours of pornography a day—few would have been aware that sexual arousal chained to pornography can make intimate physical sex impossible to achieve. Many experience pornography-induced erectile dysfunction and Psychology Today reports that

“anywhere from 17% to 58% of men who self-identify as heavy/compulsive/addicted users of porn struggle with some form of sexual dysfunction”.

As vice-chair of the APPG on Issues Affecting Men and Boys, I am profoundly concerned that very many men and boys are brutalised by depictions of rape, incest, violence and coercion, which are not niche footage on the dark web but mainstream content freely available on every pornography platform that can be accessed online with just a few clicks.

The harms to their growing sons, which include an inability to relate respectfully to girls, should concern all parents enough to dial down drastically their own appetite for porn. There is enormous peer pressure on teenage boys and young men to consume it, and its addictive nature means that children and young people, with their developing brains, are particularly susceptible. One survey of 14 to 18 year-olds found that almost a third of boys who used porn said it had become a habit or addiction, and a third had enacted what they had seen. Another found that the more boys watched porn and were sexually coercive, the less respect they had for girls.

Today’s headlines exposed the neurotoxins in some vaping products used by underage young people. There are neurotoxins in all the porn that would be caught by Section 368E(2) of the Communications Act 2003 if it were offline—hence the need for parity. Just like the vapes, children as well as adults will continue to be exposed. Trustworthy age verification will stop children stumbling across it or finding it in searches, but adults who are negligent, or determined to despoil children’s innocence, will facilitate their viewing it if it remains available online. This Bill will not make the UK the safest place in the world for children online if we continue to allow content that should be prohibited, for good reason, to flood into our homes.

Helen Rumbelow, writing in the Times earlier this month, said the public debate—the backdrop to our own discussions in this Bill—is “spectacularly ill-informed” because we only talk about porn’s side-effects and not what is enacted. So here goes. Looking at the most popular pages of the day on Pornhub, she found that 12 out of 32 showed men physically abusing women. One-third of these showed what is known as “facial abuse”, where a woman’s airway is blocked by a penis: a porn version of waterboarding torture. She described how

“in one a woman is immobilised and bound by four straps and a collar tightened around her neck. She ends up looking like a dead body found in the boot of a car. In another a young girl, dressed to look even younger in a pair of bunny ears and pastel socks, is held down by an enormous man pushing his hand on her neck while she is penetrated. The sounds that came from my computer were those you might expect from a battle hospital: cries of pain, suction and “no, no, no”. I won’t tell you the worst video I saw as you may want to stop reading now. I started to have to take breaks to go outside and look at the sky and remember kindness”.

Turning briefly to the other amendments, I thank my noble friend Lord Bethell for his persistence in raising the need for the highest standard of age verification for pornography. I also commend the noble Baroness, Lady Kidron, for her continued commitment to protecting children from harmful online content and for representing so well the parents who have lost children, in the most awful of circumstances, because of online harms. I therefore fully support the package of amendments in this group tabled by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell.

This Bill should be an inflection point in history, and future generations will judge us on the decisions we make now. It is highly likely they will say, “Shame on them”. To argue that we cannot put the genie back in the bottle is defeatist and condemns many of our children and grandchildren to the certainty of a dystopic relational future. I say “certainty” because it is the current reality of so many addicted adults who wish they could turn back the clock. Therefore, it is humane and responsible, not quaint or retrogressive, to insist that this Government act decisively to make online and offline laws consistent and reset the dial.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I will speak to my Amendment 232, as well as addressing issues raised more broadly by this group of amendments. I want to indicate support from these Benches for the broader package of amendments spoken to so ably by the noble Baroness, Lady Kidron. I see my noble friend Lord Clement-Jones has returned to check that I am following instructions during my temporary occupation of the Front Bench.

The comments I will make focus on an aspect which I think we have not talked about so much in the debate: age assurance in the context of general purpose user-to-user and search services—so-called Part 3 services, because we like to use confusing language in this Bill—rather than the dedicated pornography sites about which other noble Lords have spoken so powerfully. We have heard a number of contributions on those, and we have real expertise in this House, not least from my noble friend Lady Benjamin.

In the context of age assurance more generally, I start with a pair of propositions that I hope will be agreed to by all participants in the debate and build on what I thought was a very balanced and highly informative introduction from the noble Baroness, Lady Kidron. The first proposition is that knowledge about the age of users can help all online platforms develop safer services than they could absent that information—a point made by the right reverend Prelate the Bishop of Oxford earlier. The second is that there are always some costs to establishing age, including to the privacy of users and through some of the friction they encounter when they wish to use a service. The task before us is to create mechanisms for establishing age that maximise the safety benefits to users while minimising the privacy and other costs. That is what I see laid out in the amendment that the noble Baroness, Lady Kidron, has put before us.

My proposed new clause seeks to inform the way that we construct that balance by tasking Ofcom with carrying out regular studies into a broad range of approaches to age assurance. This is exactly the type of thinking that is complementary to that in Amendment 142; it is not an alternative but complementary to it. We may end up with varying views on exactly where that balance should be struck. Again, I am talking about general purpose services, many of which seek to prohibit pornography—whether or not they do so 100%, the arguments that apply to them are different from those that apply to services which are explicitly dedicated to pornography. We may come to different views about where we eventually strike the balance, but I think we probably have a good, shared understanding of the factors that should be in play. I certainly appreciate the conversations I have had with the noble Baroness, Lady Kidron, and others about that, and think we have a common understanding of what we should be considering.

If we can get this formulation right, age assurance may be one of the most significant measures in the Bill in advancing online safety, but if we get it wrong, I fear we may create a cookie banner scenario, such as the one I warned about at Second Reading. This is my shorthand for a regulatory measure that brings significant costs without delivering its intended benefits. However keen we are to press ahead, we must always keep in mind that we do not want to create legislation that is well-intended but does not have the beneficial effect that we all in this Committee want.

Earlier, the noble Baroness, Lady Harding, talked about the different roles that we play. I think mine is to try to think about what will actually work, and whether the Bill will work as intended, and to try to tease out any grit in it that may get in the way. I want in these remarks to flag what I think are four key considerations that may help us to deliver something that is actually useful and avoid that cookie banner outcome, in the context of these general purpose, Part 3 services.

First, we need to recognise that age assurance is useful for enabling as well as disabling access to content—a point that the noble Baroness, Lady Kidron, rightly made. We rightly focus on blocking access to bad content, but other things are also really important. For example, knowing that a user is very young might mean that the reporting system deals with that user’s report within one hour, rather than the 24 hours for a regular report. Knowing that a user is young and is being contacted by an older user may trigger what is known as a grooming protocol. Certainly, at Facebook we had that: if we understood that an older user was regularly contacting younger users, that enabled us to trigger a review of those accounts to understand whether something problematic was happening—something that the then Child Exploitation and Online Protection unit in the UK encouraged us to implement. A range of different things can then be enabled. The provision of information in terms that a 13 year-old would understand can be triggered if you know the age of that user.

Equally, perfectly legitimate businesses, such as alcohol and online gambling businesses, can use age assurance to make sure that they exclude people who should not be part of that. We in this House are considering measures such as junk food advertising restrictions, which again depend on age being known to ensure that junk food which can be legitimately marketed to older people is not marketed to young people. In a sense, that enables those businesses to be online because, absent the age-gating, they would struggle to meet their regulatory obligations.

Secondly, we need to focus on outcomes, using the risk assessment and transparency measures that the Bill creates for the first time. We should not lose sight of those. User-to-user and search services will have to do risk assessments and share them with Ofcom, and Ofcom now has incredible powers to demand information from them. Rather than asking, “Have you put in an age assurance system?”, we can ask, “Can you tell us how many 11 year-olds or 15 year-olds you estimate access the wrong kind of content?”, and, “How much pornography do you think there is on your service despite the fact that you have banned it?” If the executives of those companies mislead Ofcom or refuse to answer, there are criminal sanctions in the Bill.

The package for user-to-user and search services enables us to really focus on those outcomes and drill down. In many cases, that will be more effective. I do not care whether they have age-assurance type A or type B; I care whether they are stopping 99.9% of 11 year-olds accessing the wrong kind of content. Now, using the framework in the Bill, Ofcom will be able to ask those questions and demand the answers, for the first time ever. I think that a focus on outcomes rather than inputs—the tools that they put in place—is going to be incredibly powerful.

--- Later in debate ---
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I had to miss a few sessions of the Committee but I am now back until the end. I remind fellow Members of my interests: I worked for one of the largest platforms for a decade, but I have no current interests. It is all in the register if people care to look. I want to contribute to this debate on the basis of that experience of having worked inside the platforms.

I start by agreeing with the noble Baroness, Lady Kidron, the noble Lord, Lord Stevenson, and my noble friend Lord Clement-Jones. The thrust of their amendments—the idea that something will be needed here—is entirely correct. We have created in the Online Safety Bill a mechanism that we in this Committee know is intended primarily to focus on systems and how Ofcom regulates them, but what the public out there hear is that we are creating a mechanism that will meet their concerns—and their concerns will not end with systems. As the noble Baroness, Lady Newlove, eloquently described, their concerns in some instances will be about specific cases and the question will be: who will take those up?

If there is no other mechanism and no way to signpost people to a place where they can seek redress, they will come to Ofcom. That is something we do not want. We want Ofcom to be focused on the big-ticket items of dealing with systems, not bogged down in dealing with thousands of individual complaints. So we can anticipate a situation in which we will need someone to be able to deal with those individual complaints.

I want to focus on making that workable, because the volume challenge might not be as people expect. I have seen from having worked on the inside that there is a vast funnel of reports, where people report content to platforms. Most of those reports are spurious or vexatious; that is the reality. Platforms have made their reporting systems easy, as we want them to do —indeed, in the Bill we say, “Make sure you have really easy-to-use reporting systems”—but one feature of that is that people will use them simply to express a view. Over the last couple of weeks, all the platforms will have been inundated with literally millions of reports about Turkish politicians. These will come from the supporters of either side, reporting people on the other side—claiming that they are engaged in hate speech or pornography or whatever. They will use whatever tool they can. That is what we used to see day in, day out: football teams or political groups that report each other. The challenge is to separate out the signal—the genuinely serious reports of where something is going wrong—from the vast amount of noise, of people simply using the reporting system because they can. For the ombudsman, the challenge will be that signal question.

Breaking that down, from the vast funnel of complaints coming in, we have a smaller subset that are actionable. Some of those will be substantive, real complaints, where the individual simply disagrees with the decision. That could be primarily for two reasons. The first is that the platform has made a bad decision and failed to enforce its own policies. For example, you reported something as being pornographic, and it obviously was, but the operator was having a bad day—they were tired, it was late in the day and they pressed “Leave up” instead of “Take down”. That happens on a regular basis, and 1% of errors like that across a huge volume means a lot of mistakes being made. Those kinds of issues, where there is a simple operator error, should get picked up by the platforms’ own appeal mechanisms. That is what they are there for, and the Bill rightly points to that. A second reviewer should look at it. Hopefully they are a bit fresher, understand that a mistake was made and can simply reverse it. Those operator error reports can be dealt with internally.

The second type would be where the platform enforces policies correctly but, from the complainant’s point of view, the policies are wrong. It may be a more pro-free speech platform where the person says, “This is hate speech”, but the platform says, “Well, according to our rules, it is not. Under our terms of service, we permit robust speech of this kind. Another platform might not, but we do”. In that case, the complainant is still unhappy but the platform has done nothing wrong—unless the policies the platform is enforcing are out of step with the requirements under the Online Safety Bill, in which case the complaint should properly come to Ofcom. Based on the individual complaint, a complainant may have something material for Ofcom. They are saying that they believe the platform’s policies and systems are not in line with the guidance issued by Ofcom—whether on hate speech, pornography or anything else. That second category of complaint would come to Ofcom.

The third class concerns the kind of complaint that the noble Baroness, Lady Newlove, described. In some ways, this is the hardest. The platform has correctly enforced its policies but, in a particular case, the effect is deeply unfair, problematic and harmful for an individual. The platform simply says, “Look, we enforced the policies. They are there. This piece of content did not violate them”. Any outsider looking at it would say, “There is an injustice here. We can clearly see that an individual is being harmed. A similar piece of content might not be harmful to another individual, but to this individual it is”. In those circumstances, groups such as the South West Grid for Learning, with which I work frequently, perform an invaluable task. We should recognise that there is a network of non-governmental organisations in the United Kingdom that do this day in, day out. Groups such as the Internet Watch Foundation and many others have fantastic relations and connections with the platforms and regularly bring exceptional cases to them.

Baroness Kidron (CB)

We are glad to have the noble Lord back. I want also to put on the record that the South West Grid for Learning is very supportive of this amendment.

Lord Allan of Hallam (LD)

It has let me know as well. In a way, the amendment seeks to formalise what is already an informal mechanism. I was minded initially to support Amendment 56 in the name of my noble friend Lord Clement-Jones and the noble Lord, Lord Stevenson.

This landscape is quite varied. We have to create some kind of outlet, as the noble Baroness, Lady Kidron, rightly said. That parent or individual will want to go somewhere, so we have to send them somewhere. We want that somewhere to be effective, not to get bogged down in spurious and vexatious complaints. We want it to have a high signal-to-noise ratio—to pull out the important complaints and get them to the platforms. That will vary from platform to platform. In some ways, we want to empower Ofcom to look at what is and is not working and to be able to say, “Platform A has built up an incredible set of mechanisms. It’s doing a good job. We’re not seeing things falling through the cracks in the same way as we are seeing with platform B. We are going to have to be more directive with platform B”. That very much depends on the information coming in and on how well the platforms are doing their job already.

I hope that the Government are thinking about how these individual complaints will be dealt with and about the demand that will be created by the Bill. How can we have effective mechanisms for people in the United Kingdom who genuinely have hard cases and have tried, but where there is no intermediary for the platform they are worried about? In many cases, I suspect that these will be newer or smaller platforms that have arrived on the scene and do not have established relationships. Where are these people to go? Who will help them, particularly in cases where the platform may not systemically be doing anything wrong? Its policies are correct and it is enforcing them correctly, but any jury of peers would say that an injustice is being done. Either an exception needs to be made or there needs to be a second look at that specific case. We are not asking Ofcom to do this in the rest of the legislation.

Baroness Harding of Winscombe (Con)

My Lords, it is always somewhat intimidating to follow the noble Lord, Lord Allan, though it is wonderful to have him back from his travels. I too will speak in favour of Amendments 250A and 250B in the name of my noble friend—not from direct experience in the social media world but tangentially, from telecoms regulation.

I have lived, as the chief executive of a business, in a world where my customers could complain to me but also to an ombudsman and to Ofcom. I say this with some hesitation, as my dear old friends at TalkTalk will be horrified to hear me quoting this example, but 13 years ago, when I took over as chief executive, TalkTalk accounted for more complaints to Ofcom than pretty much all the other telcos put together. We were not trying to be bad—quite the opposite, actually. We were a business born out of very rapid growth, both organic and acquisitive, and we did not have control of our business at the time. We had an internal complaints process and were trying our hardest to listen to it and to individual customers who were telling us that we were letting them down, but we were not doing that very well.

While my noble friend has spoken so eloquently about the importance of complaints mechanisms for individual citizens, I am actually in favour of them for companies. I felt the consequences of having an independent complaints system that made my business listen. It was a genuine failsafe system. For someone to have got as far as complaining to the telecoms ombudsman and to Ofcom, they had really lost the will to live with my own business. That forced my company to change. It has forced telecoms companies to change so much that they now advertise where they stand in the rankings of complaints per thousand customers. Even in the course of the last week, Sky was proclaiming in its print advertising that it was the provider least complained about to the independent complaints mechanism.

So this is not about thinking that companies are bad and are trying to let their customers down. As the noble Lord, Lord Allan, has described, managing these processes is really hard and you really need the third line of defence of an independent complaints mechanism to help you deliver on your best intentions. I think most companies with very large customer bases are trying to meet those customers’ needs.

For very practical reasons, I have experienced the power of these sorts of systems. There is one difference with the example I have given of telecoms: it was Ofcom itself that received most of those complaints about TalkTalk 13 years ago, and I have tremendous sympathy with the idea that we might unleash on poor Ofcom all the social media complaints that are not currently being resolved by the companies. That is exactly why, as Dame Maria Miller said, we need to set up an independent ombudsman to deal with this issue.

From a very different perspective from that of my noble friend, I struggle to understand why the Government do not want to do what they have just announced they want to do in other sectors such as gambling.

--- Later in debate ---
Viscount Camrose (Con)

As I said, we are happy to consider individual complaints and super-complaints further.

Lord Allan of Hallam (LD)

Again, I am just pulling this together—I am curious to understand this. We have been given a specific case—South West Grid for Learning raising a case based on an individual but with more generic concerns—so could the noble Viscount clarify, now or in writing, whether that is the kind of thing that he imagines would constitute a super-complaint? If South West Grid for Learning went to a platform with a complaint like that—one based on an individual but brought by an organisation—would Ofcom find that complaint admissible under its super-complaints procedure, as imagined in the Bill?

Viscount Camrose (Con)

Overall, the super-complaints mechanism is more for groupings of complaints and has a broader range than the individual complaints process, but I will consider that point going forward.

Many UK regulators have successful super-complaints mechanisms which allow them to identify and target emerging issues and effectively utilise resources. Alongside the Bill’s research functions, super-complaints will perform a vital role in ensuring that Ofcom is aware of the issues users are facing, helping it to target resources and to take action against systemic failings.

On the steps required after super-complaints, the regulator will be required to respond publicly to the super-complaint. Issues raised in a super-complaint may lead Ofcom to take steps to mitigate them, where they can be addressed via the Bill’s duties and powers. In this way, super-complaints perform a vital role in Ofcom’s horizon-scanning, ensuring that it is aware of issues as they emerge. However, they are not linked to any specific enforcement process.

--- Later in debate ---
Baroness Morgan of Cotes (Con)

My Lords, I particularly support Amendment 96, to which I have added my name; it is a privilege to do so. I also support Amendment 296 and I cannot quite work out why I have not added my name to it, because I wholeheartedly agree with it, but I declare my support now.

I want to talk again about an issue that the noble Baroness, Lady Finlay, set out so well and that we also touched on last week, about the regulation of suicide and self-harm content. We have all heard of the tragic case of Molly Russell, but a name that is often forgotten in this discussion is Frankie Thomas. Frankie was a vulnerable teenager with childhood trauma, functioning autism and impulsivity. After reading a story about self-harm on the app Wattpad, according to the coroner’s inquest, she went home and undertook

“a similar act, resulting in her death”.

I do not need to repeat the many tragic examples that have already been shared in this House, but I want to reiterate the point already made by the BMA in its very helpful briefing on these amendments: viewing self-harm and suicide content online can severely harm the user offline. As I said last week when we were debating the user empowerment tools, this type of content literally has life or death repercussions. It is therefore essential that the Bill takes this sort of content more seriously and creates specific duties for services to adhere to.

We will, at some point this evening—I hope—come on to debate the next group of amendments. The question for Ministers to answer on this group, the next one and others that we will be debating is why, where we know that content is harmful—to individuals but also to broader society—the Government do not want to take the step of setting out how that content should be properly regulated. I think it all comes from their desire to draw a distinction between content that is illegal and content that is not illegal but is undoubtedly, in the eyes of pretty well every citizen, deeply harmful. As we have already heard from the noble Baroness, and as we heard last week, adults do not become immune to suicide and self-harm content the minute they turn 18. In fact, I would argue that no adult is immune to the negative effects of viewing this type of content online.

This amendment, therefore, is very important, as it would create a duty for providers of regulated user-to-user services and search engines to manage harmful suicide or self-harm content applicable to both children and adults, recognising this cliff edge otherwise in the Bill, which we have already talked about. I strongly urge noble Lords, particularly the Minister, to agree that protecting users from this content is one of the most important things that the Bill can do. People outside this House are looking to us to do this, so I urge the Government to support this amendment today.

Lord Allan of Hallam (LD)

My Lords, I am pleased that we have an opportunity, in this group of amendments, to talk about suicide and self-harm content, given the importance of it. It is important to set out what we expect to happen with this legislation. I rise particularly to support Amendment 225, to which my noble friend Lady Parminter added her name. I am doing this more because the way in which this kind of content is shared is incredibly complex, rather than simply because of the question of whether it is legal or illegal.

--- Later in debate ---
Lord Allan of Hallam (LD)

From experience, I think it is true that companies get defensive and seek to defend the indefensible on occasion. I agree with the noble Baroness on that, but I will balance it a little, as I also worked with people who were agonising over not wanting to make a bad situation worse. They were genuinely struggling and seeking to do the right thing. That is where the experts come in. If someone would say to them, “Look, take this stuff down; that is always better”, it would make their lives easier. If they said, “Please leave it up”, they could follow that advice. Again, that would make their lives easier. On the excuses, I agree that sometimes they are defending the indefensible, but also there are people agonising over the right thing to do and we should help them.

Baroness Kidron (CB)

I absolutely agree. Of course, good law is a good system, not a good person.

I turn to the comments that I was going to make. Uncharacteristically, I am a little confused about this issue and I would love the Minister’s help. My understanding on reading the Bill very closely is that self-harm and suicide content that meets a legal definition will be subject to the priority illegal content duties. In the case of children, we can safely anticipate that content of this kind will be named primary priority content. Additionally, if such content is against the terms of service of a regulated company, it can be held responsible to those terms. It will have to provide a user empowerment tool on category 1 services so that it can be toggled out if an adult user wishes. That is my understanding of where this content has already been dealt with in the Bill. To my mind, this leaves the following ways in which suicide and self-harm material, which is the subject of this group of amendments, is not covered by the Bill. That is what I would like the Minister to confirm, and I absolutely stand by to be corrected.

In the case of adults, if self-harm and suicide material does not meet a bar of illegal content and the service is not category 1, there is no mechanism to toggle it out. Ofcom has no power to require a service to provide tools to toggle self-harm and suicide material out by default. This means that self-harm and suicide material can be as prevalent as a service likes—pushed, promoted and recommended, as I have just explained—if it is not contrary to the terms of service, so long as it does not reach the bar of illegal content.

Search services are not subject to these clauses—I am unsure about that. In the case of both children and adults, if self-harm and suicide material is on blogs or services with limited functionality, it is out of scope of the Bill and there is absolutely nothing Ofcom can do. For non-category 1 services—the majority of services which claim that an insignificant number of children access their site and thus that they do not have to comply with the child safety duties—there are no protections for a child against this content.

I put it like that because I believe that each of the statements I just made could have been fixed by amendments already discussed during the past six days in Committee. We are currently planning to leave many children without the protection of the safety duties, to leave vulnerable adults without even the cover of default protections against material that has absolutely no public interest and to leave companies to decide whether to promote or use this material to fuel user engagement—even if it costs well-being and lives.

I ask the Minister to let me know if I have misunderstood, but I think it is really quite useful to see what is left once the protections are in place, rather than always concentrating on the protections themselves.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

My Lords, like everyone who spoke, I and the Government recognise the tragic consequences of suicide and self-harm, and how so many lives and families have been devastated by it. I am grateful to the noble Baroness and all noble Lords, as well as the bereaved families who campaigned so bravely and for so long to spare others that heartache and to create a safer online environment for everyone. I am grateful to the noble Baroness, Lady Finlay of Llandaff, who raised these issues in her Private Member’s Bill, on which we had exchanges. My noble friend Lady Morgan is right to raise the case of Frankie Thomas and her parents, and to call that to mind as we debate these issues.

Amendments 96 and 296, tabled by the noble Baroness, Lady Finlay, would, in effect, reintroduce the former adult safety duties whereby category 1 companies were required to assess the risk of harm associated with legal content accessed by adults, and to set and enforce terms of service in relation to it. As noble Lords will know, those duties were removed in another place after extensive consideration. Those provisions risked creating incentives for the excessive removal of legal content, which would unduly interfere with adults’ free expression.

However, the new transparency, accountability and freedom of expression duties in Part 4, combined with the illegal and child safety duties in Part 3, will provide a robust approach that will hold companies to account for the way they deal with this content. Under the Part 4 duties, category 1 services will need to have appropriate systems and processes in place to deal with content or activity that is banned or restricted by their terms of service.

Many platforms—such as Twitter, Facebook and TikTok, which the noble Baroness raised—say in their terms of service that they restrict suicide and self-harm content, but they do not always enforce these policies effectively. The Bill will require category 1 companies—the largest platforms—fully to enforce their terms of service for this content, which will be a significant improvement for users’ safety. Where companies allow this content, the user-empowerment duties will give adults tools to limit their exposure to it, if they wish to do so.

The noble Baroness is right to raise the issue of algorithms. As the noble Lord, Lord Stevenson, said, amplification lies at the heart of many cases. The Bill will require providers specifically to consider as part of their risk assessments how algorithms could affect children’s and adults’ exposure to illegal content, and content that is harmful to children, on their services. Providers will need to take steps to mitigate and effectively manage any risks, and to consider the design of functionalities, algorithms and other features to meet the illegal content and child safety duties in the Bill.

Lord Allan of Hallam (LD)

Following our earlier discussion, we were going to have a response on super-complaints. I am curious to understand this: if we had a pattern of complaints—such as those the noble Baroness, Lady Kidron, and others received—about a platform saying, under its terms of service, that it would remove suicide and self-harm content but failing to do so, does the Minister think that is precisely the kind of thing that could be substantive material for an organisation to bring as a super-complaint to Ofcom?

Lord Parkinson of Whitley Bay (Con)

My initial response is, yes, I think so, but it is the role of Ofcom to look at whether those terms of service are enforced and to act on behalf of internet users. The noble Lord is right to point to the complexity of some marginal cases with which companies have to deal, but the whole framework of the Bill is to make sure that terms of service are being enforced. If they are not, people can turn to Ofcom.

--- Later in debate ---
The noble Baroness asked about the metaverse, which is in scope of the Bill as a user-to-user service. The approach of the Bill is to try to remain technology neutral.
Lord Allan of Hallam (LD)

I will plant a flag in reference to the new offences, which I know we will come back to again. It is always helpful to look at real-world examples. There is a lot of meme-based self-harm content. Two examples are the Tide Pods challenge—the eating of detergent capsules—and choking games, both of which have been very common and widespread. It would be helpful, ahead of our debate on the new offences, to understand whether they are below or above the threshold of serious self-harm and what the Government’s intention is. There are arguments both ways: obviously, criminalising children for being foolish carries certain consequences, but we also want to stop the spread of the content. So, when we come to that offence, it would be helpful if the Minister could use specific examples, such as the meme-based self-harm content, which is quite common.

Lord Parkinson of Whitley Bay (Con)

I thank the noble Lord for the advance notice to think about that; it is helpful. It is difficult to talk in general terms about this issue, so, if I can, I will give examples that do, and do not, meet the threshold.

The Bill goes even further for children than it does for adults. In addition to the protections from illegal material, the Government have indicated, as I said, that we plan to designate content promoting suicide, self-harm or eating disorders as categories of primary priority content. That means that providers will need to put in place systems designed to prevent children of any age encountering this type of content. Providers will also need specifically to assess the risk of children encountering it. Platforms will no longer be able to recommend such material to children through harmful algorithms. If they do, Ofcom will hold them accountable and will take enforcement action if they break their promises.

It is right that the Bill takes a different approach for children than for adults, but it does not mean that the Bill does not recognise that young adults are at risk or that it does not have protections for them. My noble friend Lady Morgan was right to raise the issue of young adults once they turn 18. The triple shield of protection in the Bill will significantly improve the status quo by protecting adults, including young adults, from illegal suicide content and legal suicide or self-harm content that is prohibited in major platforms’ terms and conditions. Platforms also have strong commercial incentives, as we discussed in previous groups, to address harmful content that the majority of their users do not want to see, such as legal suicide, eating disorder or self-harm content. That is why they currently claim to prohibit it in their terms and conditions, and why we want to make sure that those terms and conditions are transparently and accountably enforced. So, while I sympathise with the intention from the noble Baroness, Lady Finlay, her amendments raise some wider concerns about mandating how providers should deal with legal material, which would interfere with the careful balance the Bill seeks to strike in ensuring that users are safer online without compromising their right to free expression.

The noble Baroness’s Amendment 240, alongside Amendment 225 in the name of the noble Lord, Lord Stevenson, would place new duties on Ofcom in relation to suicide and self-harm content. The Bill already has provisions to provide Ofcom with broad and effective information-gathering powers to understand how this content affects users and how providers are dealing with it. For example, under Clause 147, Ofcom can already publish reports about suicide and self-harm content, and Clauses 68 and 69 empower Ofcom to require the largest providers to publish annual transparency reports.

Ofcom may require those reports to include information on the systems and processes that providers use to deal with illegal suicide or self-harm content, with content that is harmful to children, or with content which providers’ own terms of service prohibit. Those measures sit alongside Ofcom’s extensive information-gathering powers. It will have the ability to access the information it needs to understand how companies are fulfilling their duties, particularly in taking action against this type of content. Furthermore, the Bill is designed to provide Ofcom with the flexibility it needs to respond to harms—including in the areas of suicide, self-harm and eating disorders—as they develop over time, in the way that the noble Baroness envisaged in her remarks about the metaverse and new emerging threats. So we are confident that these provisions will enable Ofcom to assess this type of content and ensure that platforms deal with it appropriately. I hope that this has provided sufficient reassurance to the noble Baroness for her not to move her amendment.

Baroness Kidron (CB)

I support these amendments, which were set out wonderfully by the noble Lord, Lord Stevenson. I want to raise a point made on Tuesday when the noble Baroness, Lady Merron, said that only 3% of people read terms of service and I said that 98% of people do not read them, so one of us is wrong, but I think the direction of travel is clear. She also used a very interesting phrase about prominence, and I want to use this opportunity to ask the Minister whether there is some lever whereby Ofcom can insist on prominence for certain sorts of material—a hierarchy of information, if you like—because these are really important pieces of information, buried in the wrong place so that even 2% or 3% of people may not find them.

Lord Allan of Hallam (LD)

My Lords, I am very pleased that the noble Lord, Lord Stevenson, has given us the opportunity to talk about terms of service, and I will make three points again, in a shorter intervention than on the previous group.

First, terms of service are critical, as their impact on the amount of intervention that occurs on content will generally be much greater than that of the law. Terms of service create, in effect, a body of private law for a community, and they are nearly always a superset of the public law—indeed, it is very common for the first items of a terms of service to say, “You must not do anything illegal”. This raises the interesting question of “illegal where?”—what it generally means is that you must not do anything illegal in the jurisdiction in which the service provider is established. The terms of service will say, “Do not do anything illegal”, and then they will give a whole list of other things, as well as illegality, that you cannot do on the platform, and I think this is right because they have different characteristics.

--- Later in debate ---
Lord Moylan (Con)

My Lords, this is a very large and wide-ranging group of amendments. Within it, I have a number of amendments that, on their own, span three separate subjects. I propose to address these one after the other in my opening remarks, but other subjects will be brought in as the debate continues and other noble Lords speak to their own amendments.

If I split the amendments that I am speaking to into three groups, the first is Amendments 17 and 18. These relate to Clause 9, on page 7, where safety duties about illegal content are set out. The first of those amendments addresses the obligation to prevent individuals encountering priority illegal content by means of the service.

Earlier this week in Committee, I asked the Minister whether the Government understood “prevent” and “protect”, both of which they use in the legislation, to have different weight. I did not expect my noble friend to give an answer at that point, but I know that he will have reflected on it. We need clarity about this at some point, because courts will be looking at, listening to and reading what the Government say at the Dispatch Box about the weight to be given to these words. To my mind, to prevent something happening requires active measures in advance that ensure as far as reasonably and humanly possible that it does not actually happen, but one could be talking about something more reactive to protect someone from something happening.

This distinction is of great importance to internet companies—I am not talking about the big platforms—which will be placed, as I say repeatedly, under very heavy burdens by the Bill. It is possible that they simply will not be able to discharge them and will have to go out of business.

Let us take Wikipedia, which was mentioned earlier in Committee. It operates in 300 languages but employs 700 moderators globally to check what is happening. If it is required by Clause 9 to

“prevent individuals from encountering priority illegal content by means of the service”,

it will have to scrutinise what is put up on this community-driven website as or before it appears. Quite clearly, something such as Welsh Wikipedia—there is Wikipedia in Welsh—simply would not get off the ground if it had to meet that standard, because the number of people who would have to be employed to do that would be far more than the service could sustain. However, if we had something closer to the wording I suggest in my amendment, where services have to take steps to “protect” people—so they could react to something and take it down when they become aware of it—it all becomes a great deal more tolerable.

Similarly, Amendment 18 addresses subsection (3) of the same clause, where there is a

“duty to operate a service using proportionate systems and processes … to … minimise the length of time”

for which content is present. How do you know whether you are minimising the length of time? How is that to be judged? What is the standard by which that is to be measured? Would it not be a great deal better and more achievable if the wording I propose, which is that you simply are under an obligation to take it down, were inserted? That is my first group of amendments. I put that to my noble friend and say that all these amendments are probing to some extent at this stage. I would like to hear how he thinks that this can actually be operated.

My second group is quite small, because it contains only Amendment 135. Here I am grateful to the charity JUSTICE for its help in drawing attention to this issue. This amendment deals with Schedule 7, on page 202, where the priority offences are set out. Paragraph 4 of the schedule says that a priority offence includes:

“An offence under any of the following provisions of the Public Order Act 1986”.

One of those is Section 5 of that Act, “Harassment, alarm or distress”. Here I make a very different point and return to territory I have been familiar with in the past. We debated this only yesterday in Grand Committee, although I personally was unable to be there: the whole territory of hate crimes, harmful and upsetting words, and how they are to be judged and dealt with. In this case, my amendment would remove Section 5 of the Public Order Act from the list of priority offences.

If society has enough problems tolerating the police going round and telling us when we have done or said harmful and hurtful things and upbraiding us for it, is it really possible to consider—without the widest form of censorship—that it is appropriate for internet platforms to judge us, shut us down and shut down our communications on the basis of their judgment of what we should be allowed to say? We already know that there is widespread suspicion that some internet platforms are too quick to close down, for example, gender critical speech. We seem to be giving them something close to a legislative mandate to be very trigger-happy when it comes to closing down speech by saying that it engages, or could engage, Section 5 of the Public Order Act. I will come to the question of how they judge it in my third group, in a moment—but the noble Lord might be able to help me.

Lord Allan of Hallam (LD)

Just to reinforce the point the noble Lord, Lord Moylan, made on that, I certainly had experience of where the police became the complainants. They would request, for example, that you take down an English Defence League event, claiming that it would be likely to cause a public order problem. I have no sympathy whatever with the English Defence League, but I am very concerned about the police saying “You must remove a political demonstration” to a platform and citing the legal grounds for doing that. The noble Lord is on to a very valid point to be concerned about that.

Lord Moylan (Con)

I am grateful to the noble Lord. I really wonder whether the Government realise what they are walking into here. On the one hand, yesterday the Grand Committee was debating the statutory instrument putting in place new statutory guidance for the police on how to enforce, much more sensitively than in the past, non-crime hate incidents. However, on the other hand, the next day in this Chamber we are putting an obligation on a set of mostly foreign private companies to act as a police force to go around bullying us and closing us down if we say something that engages Section 5 of the Public Order Act. I think this is something the Government are going to regret, and I would very much like to hear what my noble friend has to say about that.

Finally, I come to my third group of amendments: Amendments 274, 278, 279 and 283. They are all related and on one topic. These relate to the text of the Bill on page 145, in Clause 170. Here we are discussing what judgments providers have to make when they come to decide what material to take down. Inevitably, they will have to make judgments. That is one of the unfortunate things about this Bill. A great deal of what we do in our lives is going to have to be based on judgments made by private companies, many of which are based abroad but which we are trying to legislate for.

It makes a certain sense that the law should say what they should take account of in making those judgments. But the guidance—or rather, the mandate—given to those companies by Clause 170 is, again, very hair-trigger. Clause 170(5), which I am proposing we amend, states:

“In making such judgements, the approach to be followed is whether a provider has reasonable grounds to infer that content is … of the kind in question”.

I am suggesting that “reasonable grounds to infer” should be replaced with “sufficient evidence to infer”, so that they have to be able to produce some evidence that they are justified in taking content down. The test should be higher than simply having “reasonable grounds”, which may rest on a suspicion and little evidence at all. So one of those amendments relates to strengthening that bar so that they must have real evidence before they can take censorship action.

I add only two words to subsection (6), which talks about reasonable grounds for the inference—it defines what the reasonable grounds are—that

“exist in relation to content and an offence if, following the approach in subsection (2)”

and so on. I am saying “if and only if”—in other words, I make it clear that this is the only basis on which material can be censored using the provisions in this section, so as to limit it from going more widely. The third amendment in my group is essentially consequential to that.

--- Later in debate ---
The Lord Bishop of Guildford

My Lords, I will speak to Amendments 128, 130 and 132, as well as Amendments 143 to 153 in this grouping. They were tabled in the name of my right reverend colleague the Bishop of Derby, who is sorry that she cannot be here today.

The Church of England is the biggest provider of youth provision in our communities and educates around 1 million of our nation’s children. My colleague’s commitment to the principles behind these amendments also springs from her experience as vice chair of the Children’s Society. The amendments in this grouping are intended to strengthen legislation on online grooming for the purpose of child criminal exploitation, addressing existing gaps and ensuring that children are properly protected. They are also intended to make it easier for evidence of children being groomed online for criminal exploitation to be reported by online platforms to the police and the National Crime Agency.

Research from 2017 shows that one in four young people reported seeing illicit drugs advertised for sale on social media—a percentage that is likely to be considerably higher six years on. According to the Youth Endowment Fund in 2022, 20% of young people reported having seen online content promoting gang membership in the preceding 12 months, with 24% reporting content involving the carrying, use or promotion of weapons.

In relation to drugs, that later research noted that these platforms provide opportunities for dealers to build trust with potential customers, with young people reporting that they are more likely to see a groomer advertising drugs as a friend than as a dealer. This leaves young people vulnerable to exploitation, thereby reducing the scruples or trepidation they might feel about buying drugs in the first place. Meanwhile, it is also clear that social media is changing the operation of the county lines model. There is no longer the need to transport children from cities into the countryside to sell drugs, given that children who live in less populated areas can be groomed online as easily as in person. A range of digital platforms is therefore being used to target potential recruits among children and young people, with digital technologies also being deployed—for example, to monitor their whereabouts on a drugs run.

More research is being carried out by the Children’s Society, whose practitioners reported a notable increase in the number of perpetrators grooming children through social media and gaming sites during the first and second waves of the pandemic. Young people were being contacted with promotional material about lifestyles they could lead and the advantages of working within a gang, and were then asked to do jobs in exchange for money or status within this new group. It is true that some such offences could be prosecuted under the Modern Slavery Act 2015, but there remains a huge disparity between the scale of exploitation and the number of those being charged under the Act. Without a definition of child exploitation for criminal purposes, large numbers of children are being groomed online and paying the price for crimes committed by some of their most dangerous and unscrupulous elders.

It is vital that we protect our children from online content which facilitates that criminal exploitation, in the same way that we are looking to protect them from sexual exploitation. Platforms must be required to monitor for illegal content related to child criminal exploitation on their sites and to have mechanisms in place for users to flag it with those platforms so it can be removed. This can be achieved by including modern slavery and trafficking, of which child criminal exploitation is a form, into the scope of illegal content within the Bill, which is what these amendments seek to do. It is also vital that the law sets out clear expectations on platforms to report evidence of child criminal exploitation to the National Crime Agency in the same way as they are expected to report content involving child sexual exploitation and abuse to enable child victims to be identified and to receive support. Such evidence may enable action against the perpetrators without the need of a disclosure from child victims. I therefore fully support and endorse the amendments standing in the name of the right reverend Prelate.

Lord Allan of Hallam (LD)

My Lords, this is again a very helpful set of amendments. I want to share some experience that shows that legality tests are really hard. Often from the outside there is an assumption that it is easy to understand what is legal and illegal in terms of speech, but in practice that is very rarely the case. There is almost never a bright line, except in a small class of child sexual abuse material where it is always illegal and, as soon as you see the material, you know it is illegal and you can act on it. In pretty much every other case, you have to look at what is in front of you.

I will take a very specific example. Something we had to deal with was images of Abdullah Öcalan, the leader of the PKK in Turkey. If somebody shared a picture of Abdullah Öcalan, were they committing a very serious offence, which is the promotion of terrorism? Were they indicating support for the peace process that was taking place in Turkey? Were they showing that they support his socialist and feminist ideals? Were they supporting the YPG, a group in Syria to which we were sending arms, that venerates him? This is one example of many I could give where the content in front of you does not tell you very clearly whether or not the speech is illegal or speech that should be permitted. Indeed, we would take speech like that down and I would get complaints, including from Members of Parliament, saying, “Why have you removed that speech? I’m entitled to talk about Abdullah Öcalan”, and we would enter into an argument with them.

We would often ask lawyers in different countries whether they could tell us whether a speech was legal or illegal. The answer would come back as probably illegal, likely illegal, maybe illegal and, occasionally, definitely not illegal, but it was nearly always on the spectrum. The amendments we are proposing today are to try to understand where the Government intend people to draw that line when they get that advice. Let us assume the company wants to do the right thing and follow the instructions of the Bill and remove illegal content. At what level do they say it has met the test sufficiently, given that in the vast majority of cases, apart from the small class of illegal content, they are going to be given only a likelihood or a probability? As the noble Lord, Lord Moylan, pointed out, we have to try to insert this notion of sufficient evidence with Amendments 273, 275, 277, 280 and 281 in the names of my noble friend Lord Clement-Jones and the noble Viscount, Lord Colville, who is unable to be in his place today. I think the noble Baroness, Lady Kidron, may also have signed them. We are trying to flesh out the point at which that illegality standard should kick in.

Just to understand again how this often works when the law gets involved, I say that there is a law in Germany; the short version is NetzDG. If there are any German speakers who can pronounce the compound noun that is its full title, there will be a prize. It is a long compound word that means “network enforcement Act”. It has been in place for a few years and it tells companies to do something similar—to remove content that is illegal in Germany. There would be cases where we would get a report from somebody saying, “This is illegal”, and we would take action; then it went into the German system and three months later we would finally get told whether it was actually illegal in a 12-page judgment that a German court had figured out. In the meantime, all we could do was work on our best guess while that process was going on. I think we need to be very clear that illegality is hard.

Cross-jurisdictional issues present us with another set of challenges. If both the speaker and the audience are in the United Kingdom, it is fairly clear. But in many cases, when we are talking about online platforms, one or other, or even both, of the speaker and the audience may be outside the United Kingdom. Again, when does the speech become illegal? It may be entirely legal speech between two people in the United States. I think—and I would appreciate clarification from the Minister—that the working assumption is that if the speech was reported by someone not in the United States but in the UK, the platform would be required to restrict access to it from the UK, even though the speech is entirely legal in the jurisdiction in which it took place. Because the person in the UK encountered it, there would be a duty to restrict it. Again, I take it that there is certainly not a duty to take the speech down, because it is entirely legal speech outside the UK. These cross-jurisdictional issues are interesting; I hope the Minister can clarify that.

The amendments also try to think about how this would work in practice. Amendment 287 talks about how guidance should be drawn up in consultation with UK lawyers. That is to avoid a situation where platforms are guessing too much at what UK lawyers want; they should at least have sought UK legal advice. That advice will then be fed into the guidance given to their human reviewers and their algorithms. That is the way, in practice, in which people will carry out the review. There is a really interesting practical question—which, again, comes up under NetzDG—about the extent to which platforms should be investing in legal review of content that is clearly against their terms of service.

There will be two kinds of platform. There will be some platforms that see themselves as champions of freedom of expression and say they will only remove stuff that is illegal in the UK, and everything else can stay up. I think that is a minority of platforms—they tend to be on the fringes. As soon as a platform gets a mainstream audience, it has to go further. Most platforms will have terms of service that go way beyond UK law. In that case, they will be removing the hate speech, and they will be confident that they will remove UK-illegal hate speech within that. They will remove the terrorist content. They will be confident and will not need to do a second test of the legality in order to be able to remove that content. There is a practical question about the extent to which platforms should be required to do a second test if something is already illegal under their terms.

There will be, broadly speaking again, four buckets of content. There will be content that is clearly against a platform’s terms, which it will want to get rid of immediately. It will not want to test it again for legality; it will just get rid of it.

There will be a second bucket of content that is not apparently against a platform’s terms but clearly illegal in the UK. That is a very small subset of content: in Germany, that is Holocaust denial content; in the United Kingdom, this Parliament has looked at Holocaust denial and chosen not to criminalise it, so that will not be there, but an equivalent for us would be migration advice. Migration advice will not be against the terms of service of most platforms, but the Government’s intention in the Illegal Migration Bill is to make it illegal, and the consequent effect will be that it will have to be removed under the terms of this Bill. So there will be that small set of content that is illegal in the UK but not against terms of service.

There will be a third bucket of content that is not apparently against the terms or the law, and that actually accounts for most of the complaints that a platform gets. I will choose my language delicately: complaint systems are easy, and people complain to make a point. They use complaint systems as dislike buttons. The reality is that one of the most common sets of complaints you get is when there is a football match and the two opposing teams report the content on each other’s pages as illegal. They will do that every time, and you get used to it, and that is why you learn to discount mass-volume complaints. But again, we should be clear that there are a great many complaints that are merely vexatious.

The final bucket is of content that is unclear and legal review will be needed. Our amendment is intended to deal with those. A platform will go out and get advice. It is trying to understand at what point something like migration advice tips over into the illegal as opposed to being advice about going on holiday, and it is trying to understand that based on what it can immediately see. Once it has sought that advice, it will feed that back into the guidance to reviewers and the algorithms to try and remove content more effectively and be compliant with the Bill as a whole and not get into trouble with Ofcom.

Some areas are harder than others. The noble Lord, Lord Moylan, already highlighted one: public order offences, which are extremely hard. If somebody says something offensive or holds an offensive political view—I suspect the noble Baroness, Lady Fox, may have something to say on this—people may well make contact and claim that it is in breach of public order law. On the face of it, they may have a reasonably arguable case but again, as a platform, you are left to make a decision.

--- Later in debate ---
I noted earlier that the noble Lord, Lord Bethell, made a passionate intervention about, of all things, Andrew Tate and his illegality in relation to this Bill. That prompted me to think a number of things. Andrew Tate is an influencer who I despise, as I do the kind of things he says. But, as far as I know, the criminal allegations he faces are not yet resolved, so he has to be seen as innocent until proven guilty. Most of what he has online that is egregious might well be in bad taste, as people say—I would say that it is usually misogynist—but it is not against the law. If we get to a situation where that is described as illegality, that is the kind of thing that I worry about. As we have heard from other noble Lords, removing so-called illegal content for the purpose of complying with this regulatory system will mean facing such dilemmas.
Lord Allan of Hallam (LD)

In talking about individuals and investigations, the noble Baroness reminded me of one class of content where we do have clarity, and that is contempt of court. That is a frequent request. In live proceedings we know that the content is illegal because a judge writes to the company and says, “You must not allow this to be said because it is in contempt of court”, but that really is the exception. In most other cases, someone is merely saying, “I think it is illegal”.

Baroness Fox of Buckley (Non-Afl)

That is very helpful.

I am concerned that removing so-called illegal content for the purpose of complying with the regulatory system covers not only that which reaches conviction in a criminal court but possibly anything that a platform determines could be illegal, and therefore it undermines our own legal system. As I have said, that marks a significant departure from the rule of law. It seems that the state is asking or mandating private companies to make determinations about what constitutes illegality.

The obligations on a platform to determine what constitutes illegality could obviously become a real problem, particularly in relation to limitations on free expression. As we have already heard, the Public Order Act 1986 criminalises, for example, those who stir up hatred through the use of words, behaviour or written material. That is contentious in the law offline. By “contentious”, I mean that it is a matter of difficulty that requires the full rigour of the criminal justice system, understanding the whole history of established case law. That is all necessary to make a conviction under that law for offences of this nature.

Now we appear to be saying that, without any of that, social media companies should make the decision, which is a nerve-racking situation to be in. We have already heard the slippery phrase “reasonable grounds to infer”. If that was the basis on which you were sent to prison—if they did not have to prove that you were guilty but they had reasonable grounds to infer that you might be, without any evidence—I would be worried, yet reasonable grounds to infer that the content could be illegal is the basis on which we are asking for those decisions to be made. That is significantly below the ordinary burden of proof required to determine that an illegal act has been committed. Under this definition, I fear that platforms will be forced to over-remove and censor what ultimately will be entirely lawful speech.

Can the Minister consider what competency social media companies have to determine what is lawful? We have heard some of the dilemmas from somebody who was in that position—let alone the international complications, as was indicated. Will all these big tech companies have to employ lots of ex-policemen and criminal lawyers? How will it work? It seems to me that there is a real lack of qualifications in that sphere— that is not a criticism, because those people decided to work in big tech, not in criminal law, and yet we are asking them to pursue this. That is a concern.

I will also make reference to what I think are the controversies around government Amendments 136A and 136B to indicate the difficulties of these provisions. They concern illegal activity—such as “assisting unlawful immigration”, illegal entry, human trafficking and similar offences—but I am unsure as to how this would operate. While it is the case that certain entrances to the UK are illegal, I suddenly envisage a situation where a perfectly legitimate political debate—for example, about the small boats controversy—would be taken down, and that people advocating for a position against the Government’s new Illegal Migration Bill could be accused of supporting illegality. What exactly will be made illegal in those amendments to the Online Safety Bill?

The noble Baroness, Lady Buscombe, made a fascinating speech about an interesting group of amendments. Because of the way the amendments are grouped, I feel that we have moved to a completely different debate, so I will not go into any detail on this subject. Anonymous trolling, Twitter storms and spreading false information are incredibly unpleasant. I am often the recipient of them—at least once a week—so I know personally that you feel frustrated that people tell lies and your reputation is sullied. However, I do not think that these amendments offer the basis on which that activity should be censored, and I will definitely argue against removing anonymity clauses—but that will be in another group. It is a real problem, but I do not think that the solution is contained in these amendments.

--- Later in debate ---
Lord Bethell Portrait Lord Bethell (Con)
- Hansard - - - Excerpts

I defer to the noble Baroness, Lady Fox, on speech crime. That is not the area of my expertise, and it is not the purpose of my points. My points were to do with the kinds of crime that affect children in particular. His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services is very specific about that point. It says that “unacceptable delays are commonplace” and it gives a very large number of case studies. I will not go through them now because it is Thursday afternoon, but I think noble Lords can probably imagine the kinds of things we are talking about. They include years of delay, cases not taken seriously or overlooked, evidence lost, and so forth. The report found that too often children were put at risk because of this, and offenders were allowed to escape justice, and it gave 17 recommendations for how the police force should adapt in order to meet this challenge.

So my questions to the Minister are these. When we talk about things such as age verification for hardcore porn, we are quite often told that we do not need to worry about some of this because it is covered by illegal content provisions, and we should just leave it to the police to sort out. His Majesty’s Inspectorate gives clear evidence—this is a recent report from last month—that this is simply not happening in the way it should be. I therefore wondered what, if anything, is in the Bill to try to close down this particular gap. That would be very helpful indeed.

If it is really not for the purposes of this Bill at all—if this is actually to do with other laws and procedures, other departments and the way in which the resources for the police are allocated, as the noble Baroness, Lady Fox, alluded to—what can the Government do outside the boundaries of this legislation to mobilise the police and the prosecution services to address what I might term “digital crimes”: that is, crimes that would be followed up with energy if they occurred in the real world but, because they are in the digital world, are sometimes overlooked or forgotten?

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

My Lords, I would like to mention one issue that I forgot to mention, and I think it would be more efficient to pose the question now to the Minister rather than interject when he is speaking.

On the Government’s Amendments 136A, 136B and 136C on the immigration offences, the point I want to make is that online services can be literal life-savers for people who are engaged in very dangerous journeys, including journeys across the Channel. I hope the Minister will be clear that the intention here is to require platforms to deal only with content, for example, from criminals who are offering trafficking services, and that there is no intention to require platforms somehow to withdraw services from the victims of those traffickers when they are using those services in the interest of saving their own lives or seeking advice that is essential to preserving their own safety.

That would create—as I know he can imagine—real ethical and moral dilemmas, and we should not be giving any signal that we intend to require platforms to withdraw services from people who are in desperate need of help, whatever the circumstances.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, we seem to have done it again—a very long list of amendments in a rather ill-conceived group has generated a very interesting discussion. We are getting quite good at this, exchanging views across the table, across the Committee, even within the Benches—Members who perhaps have not often talked together are sharing ideas and thoughts, and that is a wonderful feeling.

I want to start with an apology. I think I may be the person who got the noble Baroness, Lady Kidron, shopped by the former leader—once a leader, always a leader. What I thought I was being asked was whether the Committee would be interested in hearing the views of the noble Viscount who could not be present, and I was very keen, because when he does speak it is from a point of view that we do not often hear. I did not know that it was a transgression of the rules—but of course it is not, really, because we got round it. Nevertheless, I apologise for anything that might have upset the noble Baroness’s blood pressure—it did not stop her making a very good contribution later.

We have covered so much ground that I do not want to try and summarise it in one piece, because you cannot do that. The problem with the group as it stands is that the right reverend Prelate the Bishop of Derby and myself must have some secret connection, because we managed to put down almost the same amendments. They were on issues that then got overtaken by the Minister, who finally got round to—I mean, who put down a nice series of amendments which exactly covered the points we made, so we can lose all those. But this did not stop the right reverend Prelate the Bishop of Guildford making some very good additional points which I think we all benefited from.

I welcome back the noble Baroness, Lady Buscombe, after her illness; she gave us a glimpse of what is to come from her and her colleagues, but I will leave the particular issue that she raised for the Minister to respond to. It raises a question on which I am not competent, but a very important one: we need to strike the right balance in responding to the alarm and difficulty that what is happening on the internet is causing outside, and I think we all agree with her that we should not put any barrier in the way of dealing with that.

Indeed, that was the theme of a number of the points that have been raised on the question of what is or can constitute illegal content, and how we judge it. It is useful to hear again from the master about how you do it in practice. I cannot imagine being in a room of French lawyers and experts and retaining my sanity, let alone making decisions that affect the ability of people to carry on, but the noble Lord did it; he is still here and lives to tell the tale—bearded or otherwise.

The later amendments, particularly those from the noble Lord, Lord Clement-Jones, take us round in a circle towards the process by which Ofcom will exercise the powers that it is going to get in this area. These are probably worth another debate on their own, and maybe that will come up in a different form, because—I think the noble Baroness, Lady Stowell, made this point as well—there is a problem in having an independent regulator that is also the go-to source of advice on decisions which, in the end, are for others to make and on which it may have to rule if they go wrong. That is a complicated way of saying that we may be overloading Ofcom if we also expect it to provide a reservoir of advice on how to deal with the issues that the Bill puts firmly on the companies—I agree that this is a problem to which we do not really have an answer.

My amendments were largely overtaken by the Government’s amendments, but the main one I want to talk about is Amendment 272. I am sorry that the noble Baroness, Lady Morgan, is not here, because her expertise lies in the area I want to talk about: fraud—cyber fraud in particular—and how that is going to be brought into the Bill. The issue, which I think has been raised by Which?—a number of other people have also written to us about it—is that Clauses 170 and 171 of the Bill try to establish how a platform should identify illegal content in relation to fraud, but they are quite prescriptive. In particular, they go into some detail, which I will leave for the Minister to respond to, but uniquely they set out a specific way of gathering information to determine whether content is illegal in this area, although it may have applicability in other areas.

One of the points that has to be taken into account is whether the platform is using human moderators, automated systems or a combination of the two. I am not quite sure why that is in the Bill; that is really the basis for the tabling of our amendments. Clearly, what should matter in the end is whether or not illegality has taken place, not how that information has been gathered. If concessions must be made to the process of law because a judgment is made that an automated assessment is somehow less valid than one made by a human moderator, there is a whole world there that we should not be going into. I certainly hope that that will not be the case where the illegality concerns children or other vulnerable people, but that is how the Bill reads at present; I wonder whether the Minister can comment on that.

There is a risk of consumers being harmed here. The figures on fraud in the United Kingdom are extraordinary; the fact that it is not the top priority for everybody, let alone the Government, is extraordinary. Consumers are being scammed at a rate of around £7.5 billion per year. A number of awful types of scamming have emerged only because of the internet and social media. They create huge problems of anxiety and emotional distress, with lots of medical care and other costs tied in if you want to work out the total bill. So we have a real problem here that we need to settle. It is great that it is in the Bill, but it would be a pity if the movement towards trying to resolve it were in any way undermined by imperfect instructions in the Bill. I wonder whether the Minister would be prepared to respond to that; I would be happy to discuss it with him later, if that is possible.

Taken as a whole, this is an interesting question as we move away from asking what a crime is towards asking how people should judge how to deal with what they think is a crime but may not be. The noble Lord, Lord Allan, commented on how to do it in practice, but one hopes that any initial problems will be overcome as we move forward and people become more experienced with this.

When the Joint Committee considered this issue, we spent a long time talking about why we wanted certainty in the legal prescription in the Bill; that is why we were very much against the idea of “legal but harmful”, which seemed too subjective and too open to difficulty. Out of that came another thought, which answers the point made by the noble Baroness, Lady Stowell: so much of this is about fine judgments on things that are set in stone and that you can work to, but which you then have to interpret.

There is a role for Parliament here, I think; we will come on to this in later amendments but, if there is a debate to be had on this, let us not forget the points that have been made here today. If we are going to think again about Ofcom’s activity in practice, that is the sort of thing that either a Joint Committee or Select Committees of the two Houses could easily take on board as an issue that needs to be reflected on, with advice given to Parliament about how it might be taken forward. This might be the answer in the medium term.

In the short term, let us work to the Bill and make sure that it works. Let us learn from the experience but let us then take time out to reflect on it; that would be my recommendation but, obviously, that will be subject to the situation after we finish the Bill. I look forward to hearing the Minister’s response.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

Yes, that would be welcome.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

Can I suggest one of mine?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I thank the noble Lord.

I was pleased to hear about Wicipedia Cymraeg—there being no “k” in Welsh. As the noble Lord, Lord Stevenson, said, there has been a very good conversational discussion in this debate, as befits Committee and a self-regulating House. My noble friend Lady Stowell is right to point out matters of procedure, although we were grateful to know why the noble Viscount, Lord Colville, supports the amendments in question.

--- Later in debate ---
Companies will need to ensure that they have effective systems to enable them to check the broader context relating to content when deciding whether or not to remove it. This will provide greater certainty about the standard to be applied by providers when assessing content, including judgments about whether or not content is illegal. We think that protects against over-removal by making it clear that platforms are not required to remove content merely on the suspicion of it being illegal. Beyond that, the framework also contains provisions about how companies’ systems and processes should approach questions of mental states and defences when considering whether or not content is an offence in the scope of the Bill.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

I am struggling a little to understand why the Minister thinks that “sufficient evidence” is subjective and, therefore, I assume, that “reasonable grounds to infer” is objective. Certainly, in my lexicon, evidence is more objective than inference, which is more subjective. I was reacting to that word. I am not sure that he has fully made the case as to why his wording is better.

Lord Moylan Portrait Lord Moylan (Con)
- View Speech - Hansard - - - Excerpts

Or indeed any evidence.

--- Later in debate ---
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

Again, I think that that is clear. I understood from the Bill that, if an American says something that would be illegal were they to be in the United Kingdom, we would still want to exclude that content. But that still leaves it open, and I just ask the question again, for confirmation. If all of the activities are outside the United Kingdom—Americans talking to each other, as it were—and a British person objects, at what point would the platform be required to restrict the content of the Americans talking to each other? Is it pre-emptively or only as and when somebody in the United Kingdom objects to it? We should flesh out that kind of practical detail before this becomes law.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

If it has been committed in the UK and is viewed by a UK user, it can be treated as illegal. I will follow up on the noble Lord’s further points ahead of the next stage.

Amendment 272 explicitly provides that relevant information that is reasonably available to a provider includes information submitted by users in complaints. Providers will already need to do this when making judgments about content, as it will be both relevant and reasonably available.

My noble friend Lord Moylan returned to the question that arose on day 2 in Committee, querying the distinction between “protect” and “prevent”, and suggesting that a duty to protect would or could lead to the excessive removal of content. To be clear, the duty requires platforms to put in place proportionate systems and processes designed to prevent users encountering content. I draw my noble friend’s attention to the focus on systems and processes in that. This requires platforms to design their services to achieve the outcome of preventing users encountering such content. That could include upstream design measures, as well as content identification measures, once content appears on a service. By contrast, a duty to protect is a less stringent duty and would undermine the proactive nature of the illegal content duties for priority offences.

--- Later in debate ---
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I support this group of amendments, so ably introduced by my noble friend and other noble Lords this afternoon.

I am not a lawyer and I would not say that I am particularly experienced in this business of legislating. I found this issue incredibly confusing. I hugely appreciate the briefings and discussions—I feel very privileged to have been included in them—with my noble friend the Minister, officials and the Secretary of State herself in their attempt to explain to a group of us why these amendments are not necessary. I was so determined to try to understand this properly that, yesterday, when I was due to travel to Surrey, I took all my papers with me. I got on the train at Waterloo and started to work my way through the main challenges that officials had presented.

The first challenge was that, fundamentally, these amendments cut across the Bill’s definitions of “primary priority content” and “priority content”. I tried to find them in the Bill. Unfortunately, in Clause 54, there is a definition of primary priority content. It says that, basically, primary priority content is what the Secretary of State says it is, and that content that is harmful to children is primary priority content. So I was none the wiser on Clause 54.

One of the further challenges that officials have given us is that apparently we, as a group of noble Lords, were confusing the difference between harm and risk. I then turned to Clause 205, which comes out with the priceless statement that a risk of harm should be read as a reference to harm—so maybe they are the same thing. I am still none the wiser.

Yesterday morning, I found myself playing what I can only describe as a parliamentary game of Mornington Crescent, as I went round and round in circles. Unfortunately, it was such a confusing game of Mornington Crescent that I forgot that I needed to change trains, ended up in Richmond instead of Redhill, and missed my meeting entirely. I am telling the Committee this story because, as the debate has shown, it is so important that we put in the Bill a definition of the harms that we are intending to legislate for.

I want to address the points made by the noble Baroness, Lady Fox. She said that we might not all agree on what harms are genuinely harmful for children. That is precisely why Parliament needs to decide this, rather than abdicate it to a regulator who, as other noble Lords said earlier today, is then put into a political space. It is the job of Parliament to decide what is dangerous for our children and what is not. That is the approach that we take in the physical world, and it should be the approach that we take in the online world. We should do that in broad categories, which is why the four Cs is such a powerful framework. I know that we are all attempting to predict the known unknowns, which is impossible, but this framework, which gives categories of harm, is clear that it can be updated, developed and, as my noble friend Lord Bethell, said, properly consulted on. We as parliamentarians should decide; that is the purpose of voting in Parliament.

I have a couple of questions for my noble friend the Minister. Does he agree that Parliament needs to decide what the categories of online harms are that the Bill is attempting to protect our children from? If he does, why is it not the four Cs? If he really thinks it is not the four Cs, will he bring back an alternative schedule of harms?

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I will echo the sentiments of the noble Baroness, Lady Harding, in my contribution to another very useful debate, which has brought to mind the good debate that we had on the first day in Committee, in response to the amendment tabled by the noble Lord, Lord Stevenson, in which we were seeking to get into the Bill what we are actually trying to do.

I thought that the noble Baroness, Lady Fox, was also welcoming additional clarity, specifically in the area of psychological harm, which I agree with. Certainly in its earlier incarnations, the Bill was scattered throughout with references to it; some have been removed, but those that remain are very much open to interpretation. I hope that we will come back to that.

I was struck by the point made by the noble Lord, Lord Russell, around what took place in that coroner’s hearing. You had two different platforms with different interpretations of what they thought that their duty of care would be. That is very much the point. In my experience, platforms will follow what they are told to follow. The challenge is when each of them comes to their own individual view around what are often complex areas. There we saw platforms presenting different views about their risk assessments. If we clarify that for them through amendments such as these, we are doing everyone a favour.

I again compliment my noble friend Lady Benjamin for her work in this area. Her speech was also a model of clarity. If we can bring some of that clarity to the legislation and to explaining what we want, that will be an enormous service.

The noble Lord, Lord Knight, made some interesting points around how this would add value to the Bill, teasing out some of the specific gaps that we have there. I look forward to hearing the response on that.

I was interested in the comments from the noble Lord, Lord Bethell, on mobile phone penetration. We should all hold in common that we are not going back to a time BC—before connection. Our children will be connected, which creates the imperative for us to get this right. There has perhaps been a tendency for us to bury our heads in the sand, and occasionally you hear that still—it is almost as if we would wish this world away. However, the noble Baroness, Lady Kidron, is at the other end of the spectrum; she has come alive on this subject, precisely because she recognises that that will not happen. We are in a world where our children will be connected, so it is on us to figure out how we want those connections to work and to instruct the people who provide those connective services on what they should do. It is certainly not for us to imagine that somehow they will all go away. We will come to that in later groups when we talk about minimum ages; if younger children are online, there is a real issue around how we are going to deal with that.

The right reverend Prelate the Bishop of Oxford highlighted some really important challenges based on real experiences that families today are suffering—let us use the word as it should be—and made the case for clarity. I do not know how much we are allowed to talk in praise of EU legislation, but I am looking at the Digital Services Act—I have looked at a lot of EU legislation—and this Bill, and there is a certain clarity to EU regulation, particularly the process of adding recitals, which are attached to the law and explain what it is meant to do. That is sometimes missing here. I know that there are different legal traditions, but you can sometimes look at an EU regulation and the UK law and the former appears to be much clearer in its intent.

That brings me to the substance of my comments in response to this group, so ably introduced by the noble Baroness, Lady Kidron. I hope that the Government heed and recognise that, at present, no ordinary person can know what is happening in the Bill—other than, perhaps, the wife of the noble Lord, Lord Stevenson, who will read it for fun—and what we intend to do.

I was thinking back to the “2B or not 2B” debate we had earlier about the lack of clarity around something even as simple as the classification of services. I was also thinking that, if you asked what the Online Safety Bill does to restrict self-harm content, the answer would be this: if it is a small social media platform, it will probably be categorised as a 2B service; then we can look at Schedule 7, where assisting suicide is prohibited; but we might want to come back to some of the earlier clauses with the specific duties—and so it goes on. As the noble Baroness, Lady Harding, described, you are leaping backwards and forwards in the Bill to try to understand what we are trying to do with the legislation. I think that is a genuine problem.

In effect, the Bill is Parliament setting out the terms of service for how we want Ofcom to regulate online services. We debated terms of service earlier. What is sauce for the goose is sauce for the gander. We are currently failing our own tests of simplicity and clarity on the terms of service that we will give to Ofcom.

As well as platforms, if ordinary people want to find out what is happening, then, just like those platforms with the terms of service, we are going to make them read hundreds of pages before they find out what this legislation is intended to do. We can and should make this simpler for children and parents. I was able to meet Ian Russell briefly at the end of our Second Reading debate. He has been an incredibly powerful and pragmatic voice on this. He is asking for reasonable things. I would love to be able to give a Bill to Ian Russell, and the other families that the right reverend Prelate the Bishop of Oxford referred to, that they can read and that tells them very clearly how Parliament has responded to their concerns. I think we are a long way short of that simple clarity today.

It would be extraordinarily important for service providers, as I already mentioned in response to the noble Lord, Lord Russell. They need that clarity, and we want to make sure that they have no reason to say, “I did not understand what I was being asked to do”. That should be from the biggest to the smallest, as the noble Lord, Lord Moylan, keeps rightly raising with us. Any small service provider should be able to very clearly and simply understand what we are intending to do, and putting more text into the Bill that does that would actually improve it. This is not about adding a whole load of new complications and the bells and whistles we have described but about providing clarity on our intention. Small service providers would benefit from that clarity.

The noble Baroness, Lady Ritchie, rightly raised the issue of the speed of the development of technology. Again, we do not want the small service provider in particular to think it has to go back and do a whole new legal review every time the technology changes. If we have a clear set of principles, it is much quicker and simpler for it to say, “I have developed a new feature. How does it match up against this list?”, rather than having to go to Clause 12, Clause 86, Clause 94 and backwards and forwards within the Bill.

It will be extraordinarily helpful for enforcement bodies such as Ofcom to have a yardstick—again, this takes us back to our debate on the first day—for its prioritisation, because it will have to prioritise. It will not be able to do everything, everywhere, all at once. If we put that prioritisation into the legislation, it will, frankly, save potential arguments between Parliament, the Government and Ofcom later on, when they have decided to prioritise X and we wanted them to prioritise Y. Let us all get aligned on what we are asking them to do up front.

Dare I say—the noble Baroness, Lady Harding, reminded me of this—that it may also be extraordinarily helpful for us as politicians so that we can understand the state of the law. I mean not just the people who are existing specialists or are becoming specialists in this area and taking part in this debate but the other hundreds of Members of both Houses, because this is interesting to everyone. I have experience of being in the other place, and every Member of the other place will have constituents coming to them, often with very tragic circumstances, and asking what Parliament has done. Again, if they have the Online Safety Bill as currently drafted, I think it is hard for any Member of Parliament to be able to say clearly, “This is what we have done”. With those words and that encouraging wind, I hope the Government are able to explain, if not in this way, that they have a commitment to ensuring that we have that clarity for everybody involved in this process.

Lord Bethell Portrait Lord Bethell (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I will build on my noble friend’s comments. We have what I call the Andrew Tate problem. That famous pornographer and disreputable character started a business in a shed in Romania with a dozen employees. By most people’s assessment, it would have been considered a small business but, through pornographic content and the physical assault of women, he extremely quickly built something that served an estimated 3 billion pages, and it has had a huge impact on the children of the English-speaking world. A small business became a big, nasty business very quickly. That anecdote reinforces the point that small does not mean safe and that, although I agree with many of my noble friend’s points, the lens of size is perhaps not the right one to look through.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I did not want to interrupt the noble Lord, Lord Moylan, in full flow as he introduced the amendments, but I believe he made an error in terms of the categorisation. The error is entirely rational, because he took the logical position rather than the one in the Bill. It is a helpful error because it allows us to quiz the Minister on the rationale for the categorisation scheme.

As I read it, in Clause 86, the categories are: category 1, which is large user-to-user services; category 2A, which is search or combined services; and category 2B, which is small user-to-user services. To my boring and logical binary brain, I would expect it to be: “1A: large user-to-user”; “1B: small user-to-user”; “2A: large search”; and “2B: small search”. I am curious about why a scheme like that was not adopted and we have ended up with something quite complicated. It is not only that: we now have this Part 3/Part 5 thing. I feel that we will be confused for years to come: we will be deciding whether something is a Part 3 2B service or a Part 5 service, and we will end up with a soup of numbers and letters that do not conform to any normal, rational approach to the world.
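By way of illustration, the contrast the noble Lord draws could be sketched as follows; this is an editorial paraphrase of the debate, not the statutory definitions.

```python
from enum import Enum

# Illustrative paraphrase of the two categorisation schemes being
# contrasted; the labels are editorial shorthand, not the Bill's wording.

class BillScheme(Enum):
    """The categorisation as read from Clause 86."""
    CATEGORY_1 = "large user-to-user services"
    CATEGORY_2A = "search or combined services"
    CATEGORY_2B = "small user-to-user services"

class LogicalScheme(Enum):
    """The 'boring and logical binary' scheme the speaker expected."""
    CATEGORY_1A = "large user-to-user services"
    CATEGORY_1B = "small user-to-user services"
    CATEGORY_2A = "large search services"
    CATEGORY_2B = "small search services"
```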

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I must first apologise for my slightly dishevelled appearance as I managed to spill coffee down my shirt on my way to the Chamber. I apologise for that—as the fumes from the dried coffee suffuse the air around me. It will certainly keep me caffeinated for the day ahead.

Search services play a critical role in users’ online experience, allowing them easily to find and access a broad range of information online. Their gateway function, as we have discussed previously, means that they also play an important role in keeping users safe online because they have significant influence over the content people encounter. The Bill therefore imposes stringent requirements on search services to tackle the risks from illegal content and to protect children.

Amendments 13, 15, 66 to 69 and 73, tabled by my noble friend Lord Moylan, seek to narrow the scope of the Bill so that its search safety duties apply only to the largest search services—categorised in the Bill as category 2A services—rather than to all search services. Narrowing the scope in this way would have an adverse impact on the safety of people using search services, including children. Search services, including combined services, below the category 2A threshold would no longer have a duty to minimise the risk of users encountering illegal content, or of children encountering harmful content, in or via search results. This would increase the likelihood of users, including children, accessing illegal content, and of children accessing harmful content, through these services.

The Bill already takes a targeted approach and the duties on search services will be proportionate to the risk of harm and the capacity of companies. This means that services which are smaller and lower-risk will have a lighter regulatory burden than those which are larger and higher-risk. All search services will be required to conduct regular illegal content risk assessments and, where relevant, children’s risk assessments, and then implement proportionate mitigations to protect users, including children. Ofcom will set out in its codes of practice specific steps search services can take to ensure compliance and must ensure that these are proportionate to the size and capacity of the service.

The noble Baroness, Lady Kidron, and my noble friend Lady Harding of Winscombe asked how search services should conduct their risk assessments. Regulated search services will have a duty to conduct regular illegal content risk assessments, and where a service is likely to be accessed by children it will have a duty to conduct regular children’s risk assessments, as I say. They will be required to assess the level and nature of the risk of individuals encountering illegal content on their service, to implement proportionate mitigations to protect people from illegal content, and to monitor them for effectiveness. Services likely to be accessed by children will also be required to assess the nature and level of risk of their service specifically for children to identify and implement proportionate mitigations to keep children safe, and to monitor them for effectiveness as well.

Companies will also need to assess how the design and operation of the service may increase or reduce the risks identified and Ofcom will have a duty to issue guidance to assist providers in carrying out their risk assessments. That will ensure that providers have, for instance, sufficient clarity about what an appropriate risk assessment looks like for their type of service.

The noble Lord, Lord Allan, and others asked about definitions, and I congratulate noble Lords on avoiding the obvious “To be, or not to be” pun in the debate we have just had. The noble Lord, Lord Allan, is right in the definition he set out. On the rationale for it, it is simply that we have designated as category 1 the largest and riskiest services and as category 2 the smaller and less risky ones, splitting them between 2A, search services, and 2B, user-to-user services. We think that is a clear framework. The definitions are set out a bit more in the Explanatory Notes, but that is the rationale.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

I am grateful to the Minister for that clarification. I take it then that the Government’s working assumption is that all search services, including the biggest ones, are by definition less risky than the larger user-to-user services. It is just a clarification that that is their thinking that has informed this.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

As I said, the largest and riskiest sites may involve some which have search functions, so the test of large and most risky applies. Smaller and less risky search services are captured in category 2A.

Amendment 157 in the name of my noble friend Lord Pickles, and spoken to by the noble Baroness, Lady Deech, seeks to apply new duties on the largest search services. I agree with the objectives in my noble friend’s amendment of increasing transparency about the search services’ operations and enabling users to hold them to account. It is not, however, an amendment I can accept because it would duplicate existing duties while imposing new duties which we do not think are appropriate for search services.

As I say, the Bill will already require search services to set out how they are fulfilling their illegal content and child safety duties in publicly available statements. The largest search services—category 2A—will also be obliged to publish a summary of their risk assessments and to share this with Ofcom. That will ensure that users know what to expect on those search services. In addition, they will be subject to the Bill’s requirements relating to user reporting and redress. These will ensure that search services put in place effective and accessible mechanisms for users to report illegal content and content which is harmful to children.

My noble friend’s amendment would ensure that the requirements to comply with its publicly available statements applied to all actions taken by a search service to prevent harm, not just those relating to illegal content and child safety. This would be a significant expansion of the duties, resulting in Ofcom overseeing how search services treat legal content which is accessed by adults. That runs counter to the Government’s stated desire to avoid labelling legal content which is accessed by adults as harmful. It is for adult users themselves to determine what legal content they consider harmful. It is not for us to put in place measures which could limit their access to legal content, however distasteful. That is not to say, of course, that where material becomes illegal in its nature we do not share the determination of the noble Baroness, my noble friend and others to make sure that it is properly tackled. The Secretary of State and Ministers have had extensive meetings with groups making representations on this point, and I am very happy to continue speaking to my noble friend, the noble Baroness and others if they would welcome it.

I hope that that provides enough reassurance for the amendment to be withdrawn at this stage.

--- Later in debate ---
Moved by
14: Clause 6, page 5, line 38, at end insert—
“(6A) Providers of regulated user-to-user services are required to comply with duties under subsections (2) to (6) for each such service which they provide to the extent that is proportionate and technically feasible without making fundamental changes to the nature of the service (for example, by removing or weakening end-to-end encryption on an end-to-end encrypted service).”
Member’s explanatory statement
This amendment is part of a series of amendments by Lord Clement-Jones intended to ensure risk assessments are not used as a tool to undermine users’ privacy and security.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

My Lords, I propose Amendment 14 on behalf of my noble friend Lord Clement-Jones and the noble Lord, Lord Hunt of Kings Heath, who are not able to be present today due to prior commitments. I notice that the amendment has been signed also by the noble Baroness, Lady Fox, who I am sure will speak to it herself. I shall speak to the group of amendments as a whole.

I shall need to speak at some length to this group, as it covers some quite complex issues, even for this Bill, but I hope that the Committee will agree that this is appropriate given the amendments’ importance. I also expect that this is one area where noble Lords are receiving the most lobbying from different directions, so we should do it justice in our Committee.

We should start with a short summary of the concern that lies behind the amendments: that the Bill, as drafted, particularly under Clause 110, grants Ofcom the power to issue technical notices to online services that could, either explicitly or implicitly, require them to remove privacy protections—and, in particular, that this could undermine a technology that is increasingly being deployed on private messaging services called end-to-end encryption. The amendments in this group use various mechanisms to reduce the likelihood of that being an outcome. Amendments 14 and 108 seek to make it clear in the Bill that end-to-end encryption would be out of scope—and, as I understand it, Amendment 205, tabled by the noble Lord, Lord Moylan, seeks to do something similar.

A second set of amendments would add extra controls over the issuing of technical notices. While not explicitly saying that these could not target E2EE—if noble Lords will excuse the double negative—they would make that outcome less likely by ensuring greater scrutiny. They include a whole series of amendments—Amendments 202 and 206, tabled by the noble Lord, Lord Stevenson, and Amendment 207—that have the effect of ensuring that there is more scrutiny of, and input into, the issuing of such a notice.

The third set of amendments aims to ensure that Ofcom gives weight to privacy more generally in all the actions it takes. In particular, Amendment 190 provides for a broader privacy duty, and Amendment 285—which I think the noble Lord, Lord Moylan, will be excited about—seeks to restrict general monitoring.

I will now dig into why this is important. Put simply, there is a risk that under the Bill a range of internet services will feel that they are unable to offer their products in the UK. This speaks to a larger question as we debate the measures in the Bill, as it can sometimes feel as though we are comfortable ratcheting up the requirements in the Bill under the assumption that services will have no choice but to meet them and carry on. While online services will not have a choice about complying if they wish to be lawfully present in the UK, they will be free to exit the market altogether if they believe that the requirements are excessively onerous or impossible to meet.

In the Bill, we are constructing, in effect, a de facto licensing mechanism, where Ofcom will contact in-scope services—the category 2A, category 2B, Part 3 and Part 5 services we discussed in relation to the previous group of amendments—will order them to follow all the relevant regulation and guidance and will instruct them to pay a fee for that supervision. We have to consider that some services, on receipt of that notice, will take steps to restrict access by people in the UK rather than agree to such a licence. Where those are rogue services, this reaction is consistent with the aims of the Bill. We do not want services which are careless about online safety to be present in the UK market. But I do not believe that it is our aim to force mainstream services out of the UK market and, if there is a chance of that happening, it should give us pause for thought.

As a general rule, I am not given to apocalyptic warnings, but I believe there is a real risk that some of the concerns that noble Lords will be receiving in their inboxes are genuine, so I want to unpick why that may be the case. We should reflect for a moment on the assumptions we may have about the people involved in this debate and their motivations. We often see tech people characterised as oblivious to harms, and security services people as uncaring about human rights. In my experience, both caricatures are off the mark: tech people hate to see their services abused, and security service representatives understand that they need to be careful about how they exercise the great powers we have given them. We should note that, much of the time, those two communities work well together in spaces such as the Global Internet Forum to Counter Terrorism.

If this characterisation is accurate, why do I think we may have a breakdown over the specific technology of end-to-end encryption? To understand this subject, we need to spend a few moments looking at trends in technology and regulation over recent years. First, we can look at the growth of content-scanning tools, which I think may have been in the Government’s mind when they framed and drafted the new Clause 110 notices. As social media services developed, they had to consider the risks of hosting content on the services that users had uploaded. That content could be illegal in all sorts of ways, including serious forms, such as child sexual abuse material and terrorist threats, as well as things such as copyright infringement, defamatory remarks and so on. Platforms have strong incentives to keep that material off their servers for both moral and legal reasons, so they began to develop and deploy a range of tools to identify and remove it. As a minimum, most large platforms now deploy systems to capture child sexual abuse material and copyright-infringing material, using technologies such as PhotoDNA and Audible Magic.

--- Later in debate ---
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

I point out that one of the benefits of end-to-end encryption is that it precisely stops companies doing things such as targeted advertising based on the content of people’s communications. Again, I think there is a very strong and correct trend to push companies in that direction.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I thank the noble Lord for the intervention. For those noble Lords who are not following the numbers, Amendment 285, which I support, would prevent general monitoring. Apart from anything else, I am worried about equivalence and other issues in relation to general monitoring. Apart from a principled position against it, I think to be explicit is helpful.

Ofcom needs to be very careful, and that is what Amendment 190 sets out. It asks whether the alternatives have been thought about, whether the conditions have been thought about, and whether the potential impact has been thought about. That series of questions is essential. I am probably closer to the community that wants to see more powers and more interventions, but I would like that to be in a very monitored and regulated form.

I thank the noble Lord for his contribution. Some of these amendments must be supported because it is worrying for us as a country to have—what did the noble Lord call it?—ambiguity about whether something is possible. I do not think that is a useful ambiguity.

--- Later in debate ---
Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

I agree with the noble Baroness, which is precisely why I am suggesting that we need to consider whether privacy should be sacrificed totally in relation to the argument around encryption. It is difficult, and I feel awkward saying it. When I mentioned a silver bullet I was not talking about the noble Baroness or any other noble Lords present, but I have heard people say that we need this Bill because it will deal with child abuse. In this group of amendments, I am raising the fact that when I have talked about encryption with people outside of the House they have said that we need to do something to tackle the fact that these messages are being sent around. It is not just child abuse; it is also terrorism. There is a range of difficult situations.

Things can go wrong with this, and that is what I was trying to raise. For example, we have a situation where some companies are considering using, or are being asked to use, machine learning to detect nudity. Just last year, a father lost his Google account and was reported to the police for sending a naked photo of his child to the doctor for medical reasons. I am raising these as examples of the problems that we have to consider.

Child abuse is so abhorrent that we will do anything to protect children, but let me say this to the Committee, as it is where the point on privacy lies: children are largely abused in their homes, but as far as I understand it we are not as yet arguing that the state should put CCTV cameras in every home for 24/7 surveillance to stop child abuse. That does not mean that we are glib or that we do not understand the importance of child abuse; it means that we understand the privacy of your home. There are specialist services that can intervene when they think there is a problem. I am worried about the possibility of putting a CCTV camera in everyone’s phone, which is the danger of going down this route.

My final point is that these services, such as WhatsApp, will potentially leave the UK. It is important to note that. I agree with the noble Lord, Lord Allan: this is not like threatening to storm off; it is not done in any kind of pique. In putting enormous pressure on these platforms to scan communications, we must remember that they are global platforms with systems that work for billions of people all around the world. They would not compromise those billions of users for a relatively small market such as the UK. As I have explained, they would not put up with it if the Chinese state said, “We have to see people’s messages”. They would just say, “We are encrypted services”. They would walk out of China and we would all say, “Well done”. There is a real, strong possibility of these services leaving the UK, so we must be very careful.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

I just want to add to the exchange between the noble Baronesses, Lady Kidron and Lady Fox. The noble Baroness, Lady Fox, referred to WhatsApp’s position. Again, it is important for the public out there also to understand that if someone sends them illegal material—in particular child sexual abuse material; I agree with the noble Baroness, Lady Kidron, that this is a real problem—and they report it to WhatsApp, which has a reporting system, that material is no longer encrypted. It is sent in clear text and WhatsApp will give it to the police. One of the things I am suggesting is that, rather than driving WhatsApp out of the country, because it is at the more responsible end of the spectrum, we should work with it to improve these kinds of reporting systems and put the fear of God into people so that they know that this issue is not cost-free.

As a coda to that, if you ever receive something like that, you should report it to the police straightaway because, once it is on your phone, you are liable and you have a problem. The message from here should be: if you receive it, report it and, if it is reported, make sure that it gets to the police. We should be encouraging services to put those systems in place.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

The noble Lord has concluded with my conclusion, which was to say that those services will be driven out, but not because they are irresponsible around horrible, dangerous messages. They do not read our messages because they are private. However, if we ever receive anything that makes us feel uncomfortable, they should be put under pressure to act. Many of them already do and are actually very responsible, but that is different from demanding that they scan our messages and we breach that privacy.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

Yes, I think it is right. The Investigatory Powers Act is a tool for law enforcement and intelligence agencies, whereas the Bill is designed to regulate technology companies—an important high-level distinction. As such, the Bill does not grant investigatory powers to state bodies. It does not allow the Government or the regulator to access private messages. Instead, it requires companies to implement proportionate systems and processes to tackle illegal content on their platforms. I will come on to say a little about legal redress and the role of the courts in looking at Ofcom’s decisions so, if I may, I will respond to that in a moment.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

The Investigatory Powers Act includes a different form of technical notice, which is to put in place surveillance equipment. The noble Lord, Lord Stevenson, has a good point: we need to ensure that we do not have two regimes, both requiring companies to put in place technical equipment but with quite different standards applying.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I will certainly take that point away and I understand, of course, that different Acts require different duties of the same platforms. I will take that away and discuss it with colleagues in other departments who lead on investigatory powers.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I am about to talk about the safeguards for journalists in the context of the Bill and the questions posed by the noble Baroness, Lady Bennett. However, I take my noble friend’s point about the implications of other Acts that are already on the statute book in that context as well.

Just to finish the train of thought of what I was saying on Amendment 202, making a reference to encryption, as it suggests, would be out of step with the wider approach of the Bill, which is to remain technology-neutral.

I come to the safeguards for journalistic protections, as touched on by the noble Baroness, Lady Bennett. The Government are fully committed to protecting the integrity of journalistic sources, and there is no intention or expectation that the tools required to be used under this power would result in a compromising of those sources. Any tools required on private communications must be accredited by Ofcom as highly accurate only in detecting child sexual abuse and exploitation content. These minimum standards of accuracy will be approved and published by the Secretary of State, following advice from Ofcom. We therefore expect it to be very unlikely that journalistic content will be falsely detected by the tools being required.

Under Clause 59, companies are obliged to report child sexual abuse material which is detected on their service to the National Crime Agency; this echoes a point made by the noble Lord, Lord Allan, in an earlier contribution. That would include child sexual abuse and exploitation material identified through tools required by a notice and, even in this event, the appropriate protections in relation to journalistic sources would be applied by the National Crime Agency if it were necessary to identify individuals involved in sharing illegal material.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

I want to flag that in the context of terrorist content, this is quite high risk for journalists. It is quite common for them, for example, to be circulating a horrific ISIS video not because they support ISIS but because it is part of a news article they are putting together. We should flag that terrorist content in particular is commonly distributed by journalists and it could be picked up by any system that is not sufficiently sophisticated.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I see that my noble friend Lord Murray of Blidworth has joined the Front Bench in anticipation of the lunch-break business for the Home Office. That gives me the opportunity to say that I will discuss some of these points with him, my noble friend Lord Sharpe of Epsom and others at the Home Office.

Amendment 246 aims to ensure that there is no requirement for a provider to comply with a notice until the High Court has determined the appeal. The Government have ensured that, in addition to judicial review through the High Court, there is an accessible and relatively affordable alternative means of appealing Ofcom’s decisions via the Upper Tribunal. We cannot accept amendments such as this, which could unacceptably delay Ofcom’s ability to issue a notice, because that would leave children vulnerable.

To ensure that Ofcom’s use of its powers under Clause 110, and the technology that underpins it, are transparent, Ofcom will produce an annual report about the exercise of its functions using these powers. This must be submitted to the Secretary of State and laid before Parliament. The report must also provide the details of technology that has been assessed as meeting minimum standards of accuracy, and Ofcom may also consider other factors, including the impact of technologies on privacy. That will be separate to Ofcom’s annual report to allow for full scrutiny of this power.

The legislation also places a statutory requirement on Ofcom to publish guidance before its functions with regard to Clause 110 come into force. This will be after Royal Assent, given that the legislation is subject to change until that point. Before producing the guidance, Ofcom must consult the Information Commissioner. As I said, there are already strong safeguards regarding Ofcom’s use of these powers, so we think that this additional oversight is unnecessary.

Amendments 203 and 204, tabled by the noble Lord, Lord Clement-Jones, seek to probe the privacy implications of Ofcom’s powers to require technology under Clause 110. I reiterate that the Bill will not ban or weaken any design, including end-to-end encryption. But, given the scale of child sexual abuse and exploitation taking place on private communications, it is important that Ofcom has effective powers to require companies to tackle this abhorrent activity. Data from the Office for National Statistics show that in nearly three-quarters of cases where children are contacted online by someone they do not know, this takes place by private message. This highlights the scale of the threat and the importance of technology providers taking steps to safeguard children in private spaces online.

As I have already set out, there are strong safeguards regarding the use of this power, and these will prevent Ofcom from requiring the use of any technology that would undermine a platform’s security and put users’ privacy at risk. These safeguards will also ensure that platforms will not be required to conduct mass scanning of private communications by default.

Until the regime comes into force, it is of course not possible to say with certainty which tools would be accredited. However, some illustrative examples of the kinds of current tools we might expect to be used—provided that they are highly accurate and compatible with a service’s design—are machine learning or artificial intelligence tools, which assess content to determine whether it is illegal, and hashing technology, which works by assigning a unique number to an image that has been identified as illegal.
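
As a rough illustration of the hash-matching approach the Minister describes—a minimal sketch only, not any accredited tool; production systems such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas this example uses an exact cryptographic hash, and the hash list is a made-up placeholder:

```python
import hashlib

# Placeholder list: in practice, hashes of known illegal images are maintained
# by bodies such as the Internet Watch Foundation; this value is illustrative.
KNOWN_ILLEGAL_HASHES = {"0" * 64}

def image_fingerprint(image_bytes: bytes) -> str:
    """Assign a unique number (here, a SHA-256 digest) to an image."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_illegal(image_bytes: bytes) -> bool:
    """Compare the image's fingerprint against the known-illegal list."""
    return image_fingerprint(image_bytes) in KNOWN_ILLEGAL_HASHES
```

An exact hash flags only byte-identical copies, which is why the minimum standards of accuracy mentioned above matter so much for the perceptual variants used in practice.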

Given the particularly abhorrent nature of the crimes we are discussing, it is important that services giving rise to a risk of child sexual abuse and exploitation in the UK are covered, wherever they are based. The Bill, including Ofcom’s ability to issue notices in relation to this or to terrorism, will therefore have extraterritorial effect. The Bill will apply to any relevant service that is linked to the UK. A service is linked to the UK if it has a significant number of UK users, if UK users form a target market or if the service is capable of being used in the UK and there is a material risk of significant harm to individuals in the UK arising from the service. I hope that that reassures the noble Lord, on behalf of his noble friend, about why that amendment is not needed.

Amendments 209 to 214 seek to place additional requirements on Ofcom to consider the effect on user privacy when using its powers under Clause 110. I agree that tackling online harm needs to take place while protecting privacy and security online, which is why Ofcom already has to consider user privacy before issuing notices under Clause 110, among the other stringent safeguards I have set out. Amendment 202A would impose a duty on Ofcom to issue a notice under Clause 110 where it is satisfied that it is necessary and proportionate to do so—which will have involved ensuring that the safeguards have been met.

Ofcom will have access to a wide range of information and must have the discretion to decide the most appropriate course of action in any particular scenario, including where this action lies outside the powers and procedures conferred by Clause 110; for instance, an initial period of voluntary engagement. This is an in extremis power. It is essential that we balance users’ rights with the need to enable a strong response, so Ofcom must be able to assess whether any alternative, less intrusive measures would effectively reduce the level of child sexual exploitation and abuse or terrorist content occurring on a service before issuing a notice.

I hope that that provides reassurance to noble Lords on the amendments in this group, and I invite the noble Lord to withdraw Amendment 14.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

My Lords, this has been a very useful debate and serves as a good appetite builder for lunch, which I understand we will be able to take shortly.

I am grateful to the Minister for his response and to all noble Lords who have taken part in the debate. As always, the noble Baroness, Lady Kidron, gave us a balanced view of digital rights—the right to privacy and to security—and the fact that we should be trying to advance these two things simultaneously. She was right again to remind us that this is a real problem and there is a lot we can do. I know she has worked on this through things such as metadata—understanding who is communicating with whom—which might strike that nice balance where we are not infringing on people’s privacy too grossly but are still able to identify those who wish harm on our society and in particular on our children.
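
The metadata approach mentioned here can be made concrete. What follows is a purely hypothetical sketch—not a description of any platform’s actual system—showing how routing information alone, with no message content read, might surface accounts that contact unusually many minors; the record format and threshold are illustrative assumptions:

```python
from collections import defaultdict

def flag_suspicious_senders(events, threshold=10):
    """Flag senders who message many distinct minors, using metadata only.

    `events` is an iterable of (sender_id, recipient_id, recipient_is_minor)
    tuples; no message content is inspected.
    """
    minors_contacted = defaultdict(set)
    for sender, recipient, recipient_is_minor in events:
        if recipient_is_minor:
            minors_contacted[sender].add(recipient)
    return {s for s, minors in minors_contacted.items() if len(minors) >= threshold}
```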

The noble Baroness, Lady Bennett, was right to pick up this tension between everything, everywhere, all at once and targeted surveillance. Again, that is really interesting to tease out. I am personally quite comfortable with quite intrusive targeted surveillance. I do not know whether noble Lords have been reading the Pegasus spyware stories: I am not comfortable with some Governments placing such spyware on the phones of human rights defenders but I would be much more relaxed about the British authorities placing something similar on the phones of people who are going to plant bombs in Manchester. We need to be really honest about where we are drawing our red lines if we want to go in the direction of targeted surveillance.

The noble Lord, Lord Moylan, was right again to remind us about the importance of private conversations. I cited the example of police officers whose conversations have been exposed. Although it is hard, we should remember that if ordinary citizens want to exchange horrible racist jokes with each other in private groups, that is not a matter for the state; it becomes one when it is somebody in a position of public authority, where we have a right to intervene. Again, we have to remember that, as long as it is not illegal, people can say horrible things in private, and we should not encourage any suggestion that the state would interfere unless there are legitimate grounds—for example, that it is a police officer, or that somebody is doing something that crosses the line of legality.

The noble Baroness, Lady Fox, reminded us that it is either encrypted or it is not. That is really helpful, as things cannot be half encrypted. If a service provider makes a commitment, it is critical that it is truthful. That is what our privacy law tells us. If I say, “This service is encrypted between you and the person you send the message to”, and I know that there is somebody in between who could access it, I am lying. I cannot say it is a private service unless it is truly private. We have to bear that in mind. Historically, people might have been more comfortable with fudging it, but not in 2023, when we have this raft of privacy legislation.

The noble Baroness is also right to remind us that privacy can be safety. There is almost nothing more devastating than the leaking of intimate images. When services such as iCloud move to encrypted storage, that dramatically reduces the risk that somebody will get access to your intimate images if you store them there, which you are legally entitled to do. Privacy can be a critical part of an individual maintaining their own security and we should not lose that.

The noble Baroness, Lady Stowell, was right again to talk about general monitoring. I am pleased that she found the WhatsApp briefing useful. I was unable to attend but I know from previous contact that there are people doing good work and it is sad that that often does not come out. We end up with this very polarised debate, which my noble friend Lord McNally was right to remind us is unhelpful. The people south of the river are often working very closely in the public interest with people in tech companies. Public rhetoric tends to focus on why more is not being done; there are very few thanks for what is being done. I would like to see the debate move a little more in that direction.

The noble Lord, Lord Knight, opened up a whole new world of pain with VPNs, which I am sure we will come back to. I say simply that if we get the regulatory frameworks right, most people in Britain will continue to use mainstream services as long as they are allowed to be offered. If those services are regulated by the European Union under its Digital Services Act, and in a similar way in the UK and the US, they will in effect be subject to global standards, so it will not matter where you VPN from. The scenario the noble Lord painted, which I worry about, is where those mainstream services are not available and we drive people into small, new services that are not regulated by anyone. We would then end up inadvertently driving people back to the wild west that we complain about, when most of them would prefer to use mainstream services that are properly regulated by Ofcom, the European Commission and the US authorities.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I want to inject into the debate some counterarguments, which I hope will be received in the constructive spirit in which they are intended. Primarily, I want to argue that a level playing field is not the right solution here and that there is a strong logic for a graduated response. It is often tempting to dial everything up to 11 when you have a problem, and we clearly do have an issue around child access to pornography. But from a practical point of view, the tools we are giving our regulator are better served by being able to treat different kinds of services differently.

I think there are three classes of service that we are thinking about here. The first is a service with the primary purpose and explicit intent to provide pornography and nothing else. A regime dedicated to those sites is quite appropriate. Such a service might have not just the strongest levels of age verification but a whole other set of requirements, which I know we will debate later, around content verification and all sorts of other things that kick into play. The second category is made up of services that are primarily designed for social interaction which prohibit pornography and make quite strenuous efforts to keep it off. Facebook is such a service. I worked there, and we worked hard to try to keep pornography off. We could not guarantee that it was never present, but that was our intent: we explicitly wanted to be a non-pornographic site. Then there are—as the noble Lord, Lord Bethell, pointed out—other services, such as Twitter, where the primary purpose is social but a significant proportion of adult content is allowed.

I suggest that one of the reasons for having a graduated response is that, from our point of view, we would like services to move towards porn reduction, and for those general-purpose services to prohibit porn as far as possible. That is our intent. If we have a regulatory system that says, “Look, we’re just going to treat you all the same anyway”, we may provide a perverse incentive for services not to move up the stack, as it were, towards a regime where, by having less pornographic or sexualised content, they are able to see some benefit in terms of their relationship with the regulator. That is the primary concern I have around this: that by treating everybody the same, we do not create any incentive for people to deal with porn more effectively and thereby get some relief from the regulator.

From a practical point of view, the relationship that the regulator has is going to be critical to making all these things work. Look at what has been happening in continental Europe. There have been some real issues around enforcing laws that have been passed in places such as France and Germany because there has not been the kind of relationship that the regulator needs with the providers. I think we would all like to see Ofcom in a better position, and one of the ways it can do that is precisely by having different sets of rules. When it is talking to a pure pornography site, it is a different kind of conversation from the one it is going to have with a Twitter or a Facebook. Again, they need to have different rules and guidance that are applied separately.

The intent is right: we want to stop under-18s getting on to those pure porn sites, and we need one set of tools to do that. When under-18s get on to a social network that has porn on it, we want the under-18s, if they meet the age requirement, to have access—that is perfectly legitimate—but once they get there, we want them kept out of the section that is adult. For a general-purpose service that prohibits porn, I think we can be much more relaxed, at least in respect of pornography but not in respect of other forms of harmful content—but we want the regulator to be focused on that and not on imposing porn controls. That graduated response would be helpful to the regulator.

Some of the other amendments that the noble Lord, Lord Bethell, has proposed will help us to talk about those kinds of measures—what Twitter should do inside Twitter, and so on—but the amendments we have in front of us today are more about dialling it all up to 11 and not allowing for that graduation. That is the intent I heard from the amendments’ proposers. As I say, that is the bit that, respectfully, may end up being counterproductive.

Lord Bethell Portrait Lord Bethell (Con)
- Hansard - - - Excerpts

Could the noble Lord advise us on how he would categorise a site such as Twitter, on which it is estimated that 13% of the page deliveries are to do with pornography? Does it qualify as a pornography site? To me, it is ambiguous. Such a large amount of its financial revenue comes from pages connected with pornography that it seems it has a very big foot in the pornography industry. How would he stop sites gaming definitions to benefit from one schedule or another? Does he think that puts great pressure on the regulator to be constantly moving the goalposts in order to capture who it thinks might be gaming the system, instead of focusing on content definition, which has a 50-year pedigree, is very well defined in law and is an altogether easier status to analyse and be sure about?

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

The Twitter scenario, and other scenarios of mixed sites, are some of the most challenging that we have to deal with. But I would say, straightforwardly, “Look, 13% is a big chunk, but the primary purpose of Twitter is not the delivery of pornography”. I use Twitter on a daily basis and I have never seen pornography on it. I understand that it is there and that people can go for it, and that is an issue, but I think people out there would say that for most people, most of the time, the primary purpose of Twitter is not pornography.

What we want to do—in answer to the noble Lord’s second point—is create an incentive for people to be recategorised in the right direction. There is an assumption here that it is all going to be about gaming the system. I actually think that there is an opportunity here for genuine changes. There will be a conversation with Twitter. It will be interesting, given Twitter’s current management—apparently it is run by a dog, so there will be a conversation with the dog that runs Twitter. In that conversation, the regulator, Ofcom, on our behalf, will be saying, “You could change your terms of service and get rid of pornography”. Twitter will say yes or no. If it says no, Ofcom will say, “Well, here are all the things we expect you to do in order to wall off that part of the site”.

That is a really healthy and helpful conversation to have with Twitter. I expect it is listening now and already thinking about how it will respond. But it would expect that kind of treatment and conversation to be different; and I think the public would expect that conversation to be a different and better conversation than just saying “Twitter, you’re Pornhub. We’re just going to treat you like Pornhub”.

That is the distinction. As I say, we have an opportunity to get people to be more robust about either limiting or removing pornography, and I fear that the amendments we have in front of us would actually undermine rather than enhance that effort.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

At the centre of this is the question of whether we are trying to block the entire service or block at the level of porn content. It is the purpose of a set of amendments in the names of the noble Lord, Lord Bethell, myself and a number of other noble Lords to do exactly the latter. But I have to say to the noble Baroness that I am very much in sympathy with, first, putting porn behind an age gate; secondly, having a commencement clause; and, thirdly and very importantly—this has not quite come up in the conversation—saying that harms must be on the face of the Bill and that porn is not the only harm. I say, as a major supporter of the Bereaved Families for Online Safety, that “Porn is the only harm children face” would be a horrendous message to come from this House. But there is nothing in the noble Baroness’s amendments, apart from where the action happens, that I disagree with.

I also felt that the noble Baroness made an incredibly important point when she went into detail on Amendment 125A. I will have to read her speech in order to follow it, because it was so detailed, but the main point she made is salient and relates to an earlier conversation: the reason we have Part 5 is that the Government have insisted on this ridiculous thing about user-to-user and search, instead of doing it where harm is. The idea that you have Part 5, which is to stop the loophole of sites that do not have user-to-user, only to find that they can add user-to-user functionality and be another type of site, is quite ludicrous. I say to the Committee and the Minister, who I am sure does not want me to say it, “If you accept Amendment 2, you’d be out of that problem”—because, if a site was likely to be accessed by children and it had harm and we could see the harm, it would be in scope. That is the very common-sense approach. We are where we are, but let us be sensible about making sure the system cannot be gamed, because that would be ludicrous and would undermine everybody’s efforts—those of the Government and of all the campaigners here.

I just want to say one more thing because I see that the noble Lord, Lord Moylan, is back in his place. I want to put on the record that age assurance and identity are two very separate things. I hope that, when we come to debate the package of harms—unfortunately, we are not debating them all together; we are debating harms first, then AV—we get to the bottom of that issue because I am very much in the corner of the noble Lord and the noble Baroness, Lady Fox, on this. Identity and age assurance must not be considered the same thing by the House, and definitely not by the legislation.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, it falls to me to inject some grit into what has so far been a very harmonious debate, as I will raise some concerns about Amendments 2 and 22.

I again declare my interest: I spent 10 years working for Facebook, doing the kind of work that we will regulate in this Bill. At this point noble Lords are probably thinking, “So it’s his fault”. I want to stress that, if I raise concerns about the way the regulation is going, it is not that I hold those views because I used to work for the industry; rather, I felt comfortable working in the industry because I always had those views, back to 2003 when we set up Ofcom. I checked the record, and I said things then that are remarkably consistent with how I feel today about how we need to strike the balance between the power of the state and the power of the citizen to use the internet.

I also should declare an interest in respect of Amendment 2, in that I run a blog called regulate.tech. I am not sure how many children are queueing up to read my thoughts about regulation of the tech industry, but they would be welcome to do so. The blog’s strapline is:

“How to regulate the internet without breaking it”.

It is very much in that spirit that I raise concerns about these two amendments.

I certainly understand the challenges for content that is outside of the user-to-user or search spaces. I understand entirely why the noble Baroness, Lady Kidron, feels that something needs to be done about that content. However, I am not sure that this Bill is the right vehicle to address that kind of content. There are principled and practical reasons why it might be a mistake to extend the remit here.

The principle is that the Bill’s fundamental purpose is to restrict access to speech by people in the United Kingdom. That is what legislation such as this does: it restricts speech. We have a framework in the Human Rights Act, which tells us that when we restrict speech we have to pass a rigorous test to show that those restrictions are necessary and proportionate to the objective we are trying to achieve. Clearly, when dealing with children, we weight very heavily in that test whether something is necessary and proportionate in favour of the interest of the welfare of the children, but we cannot do away with the test altogether.

It is clear that the Government have applied that test over the years that they have been preparing this Bill and determined that there is a rationale for intervention in the context of user-to-user services and search services. At the same time, we see in the Bill that the Government’s decision is that intervention is not justified in all sorts of other contexts. Email and SMS are excluded. First-party publisher content is excluded, so none of the media houses will be included. We have a Bill that is very tightly and specifically framed around dealing with intermediaries, whether that is user-to-user intermediaries who intermediate in user-generated content, or search as an intermediary, which scoops up content from across the internet and presents it to you.

This Bill is about regulating the regulators; it is not about regulating first-party speakers. A whole world of issues will come into play if we move into that space. It does not mean that it is not important, just that it is different. There is a common saying that people are now bandying around, which is that freedom of speech is not freedom of reach. To apply a twist to that, restrictions on reach are not the same as restrictions on speech. When we talk about restricting intermediaries, we are talking about restricting reach. If I have something I want to say and Facebook or Twitter will not let me say it, that is a problem and I will get upset, but it is not the same as being told that I cannot say it anywhere on the internet.

My concern about Amendment 2 is that it could lead us into a space where we are restricting speech across the internet. If we are going to do that—there may be a rationale for doing it—we will need to go back and look at our necessity and proportionality test. It may play out differently in that context from user-to-user or intermediary-based services.

From a practical point of view, we have a Bill that, we are told, will give Ofcom the responsibility of regulating some 25,000 different entities. They will all be asked to pay money to Ofcom and will all be given a bunch of guidance and duties that they have to fulfil. Again, those duties, as set out at painful length in the Bill, are very specifically about the kind of things that an intermediary should do for its users. If we were to be regulating blogs or people’s first-party speech, or publishers, or the Daily Telegraph, or whoever else, I think we would come up with a very different set of duties from the duties laid out in the Bill. I worry that, however well-motivated, Amendment 2 leads us into a space for which this Bill is not prepared.

I have a lot of sympathy with the views of the noble Baroness, Lady Harding, around the app stores. They are absolutely more like intermediaries, or search, but again the tools in the Bill are not necessarily dedicated to how one would deal with app stores. I was interested in the comments of the noble Baroness, Lady Stowell, on what will be happening to our competition authorities; a lot will be happening in that space. On app stores, I worry about what is in Amendment 22: we do not want app stores to think that it is their job to police the content of third-party services. That is Ofcom’s job. We do not want the app stores to get in the middle, not least because of these commercial considerations. We do not want Apple, for instance, thinking that, to comply with UK legislation, it might determine that WhatsApp is unsafe while iMessage is safe. We do not want Google, which operates Play Store, to think that it would have a legal rationale for determining that TikTok is unsafe while YouTube is safe. Again, I know that this is not the noble Baroness’s intention or aim, but clearly there is a risk that we open that up.

There is something to be done about app stores but I do not think that we can roll over the powers in the Bill. When we talk about intermediaries such as user-to-user services and search, we absolutely want them to block bad content. The whole thrust of the Bill is about forcing them to restrict bad content. When it comes to app stores, the noble Baroness set out some of her concerns, but I think we want something quite different. I hesitate to say this, as I know that my noble friend is supportive of it, but I think that it is important as we debate these issues that we hear some of those concerns.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

Could it not be argued that the noble Lord is making a case for regulation of app stores? Let us take the example of Apple’s dispute with “Fortnite”, where Apple is deciding how it wants to police things. Perhaps if this became a more regulated space Ofcom could help make sure that there was freedom of access to some of those different products, regardless of the commercial interests of the people who own the app stores.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

The noble Lord makes a good point. I certainly think we are heading into a world where there will be more regulation of app stores. Google and Apple are commercial competitors with some of the people who are present in their stores. A lot of the people in their stores are in dispute with them over things such as the fees that they have to pay. It is precisely for that reason that I do not think we should be throwing online safety into the mix.

There is a role for regulating app stores, which primarily focuses on these commercial considerations and their position in the market. There may be something to be done around age-rating; the noble Baroness made a very good point about how age-rating works in app stores. However, if we look at the range of responsibilities that we are describing in this Bill and the tools that we are giving to intermediaries, we see that they are the wrong, or inappropriate, set of tools.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

Would the noble Lord acknowledge that app stores are already undertaking these age-rating and blocking decisions? Google has unilaterally decided that, if it assesses that you are under 18, it will not serve up over-18 apps. My concern is that this is already happening but it is happening indiscriminately. How would the noble Lord address that?

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

The noble Baroness makes a very good point; they are making efforts. There is a role for app stores to play but I hope she would accept that it is qualitatively different from that played by a search engine or a user-to-user service. If we were to decide, in both instances, that we want app stores to have a greater role in online safety and a framework that allows us to look at blogs and other forms of content, we should go ahead and do that. All I am arguing is that we have a Bill that is carefully constructed around two particular concepts, a user-to-user service and a search engine, and I am not sure it will stretch that far.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I want to reassure the noble Lord: I have his blog in front of me and he was quite right—there were not a lot of children on that site. It is a very good blog, which I read frequently.

I want to make two points. First, age-rating and age-gating are two different things, and I think the noble Lord has conflated them. There is a lot of age-rating going on, and it is false information. We need good information, and we have not managed to get it by asking nicely. Secondly, I slightly dispute his idea that we have a very structured Bill regarding user-to-user and so on. We have a very structured Bill from a harms perspective that describes the harms that must be prevented—and then we got to commercial porn, and we can also get to these other things.

I agree with the noble Lord’s point about freedom of speech, but we are talking about a fixed set of harms that will, I hope, be in the Bill by the end. We can then say that if a service meets the test of being likely to be accessed by children, and known harm is there, that is what we are looking at. We are certainly not looking at the noble Lord’s blog.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

I appreciate the intervention by the noble Baroness; I hope through this grit we may conjure up a pearl of some sort. The original concept of the Bill, as championed by the noble Baroness, would have been a generalised set of duties of care which could have stretched much more broadly. It has evolved in a particular direction and become ever more specific and tailored to those three services: user-to-user, search, and pornography services. Having arrived at that point, it is difficult to then open it back up and stretch it to reach other forms of service.

My intention in intervening in this debate is to raise some of those concerns because I think they are legitimate. I may be at the more sceptical end of the political world, but I am at the more regulation-friendly end of the tech community. This is said in a spirit of trying to create a Bill that will actually work. I have done the work, and I know how hard Ofcom’s job will be. That sums up what I am trying to say: my concern is that we should not give Ofcom an impossible job. We have defined something quite tight—many people still object to it, think it is too loose and do not agree with it—but I think we have something reasonably workable. I am concerned that, however tempting it is, by re-opening Pandora’s box we may end up creating something less workable.

That does not mean we should forget about app stores and non-user-to-user content, but we need to think of a way of dealing with those which does not necessarily just roll over the mechanism we have created in the Online Safety Bill to other forms of application.

Baroness Healy of Primrose Hill Portrait Baroness Healy of Primrose Hill (Lab)
- View Speech - Hansard - - - Excerpts

I strongly support the amendments in the name of the noble Baroness, Lady Kidron, because I want to see this Bill implemented but strengthened in order to fulfil the admirable intention that children must be safe wherever they are online. This will not be the case unless child safety duties are applicable in all digital environments likely to be accessed by children. This is not overly ambitious or unrealistic; the platforms need clarity as to these new responsibilities and Ofcom must be properly empowered to enforce the rules without worrying about endless legal challenges. These amendments will give that much-needed clarity in this complex area.

As the Joint Committee recommended, this regulatory alignment would simplify compliance for businesses while giving greater clarity to people who use the service and greater protection for children. It would give confidence to parents and children that they need not work out whether they are in a regulated or unregulated service while online. The Government promised that the onus for keeping young people safe online would sit squarely on the tech companies’ shoulders.

Without these amendments, there is a real danger that a loophole will remain whereby some services, even those that are known to cause harm, are exempt, leaving thousands of children exposed to harm. The amendments would also help to future-proof the Bill. For example, some parts of the metaverse as yet undeveloped may be out of scope, but already specialist police units have raised concerns that abuse rooms, limited to one user, are being used to practise violence and sexual violence against women and girls.

We can and must make this good Bill even better and support all the amendments in this group.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

Services already have to comply with their duties to keep children safe. If they do not comply, Ofcom has enforcement powers, set out in the Bill, which can require app stores to remove applications that are harmful to children. We think this already addresses the point, but I am happy to continue discussing it offline with the noble Lord, my noble friend and others who want to explore how. As I say, we think this is already covered. A more general duty here would risk distracting from Ofcom’s existing priorities.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

My Lords, on that point, my reading of Clauses 131 to 135, where the Bill sets out the business disruption measures, is that they could be used precisely in that way. It would be helpful for the Minister responding later to clarify that Ofcom would use those business disruption measures, as the Government explicitly anticipate, were an app store, in a rogue way, to continue to list a service that Ofcom has said should not be made available to people in the United Kingdom.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I will be very happy to set that out in more detail.

Amendments 33A and 217A in the name of the noble Lord, Lord Storey, would place a new duty on user-to-user services that predominantly enable online gaming. Specifically, they would require them to have a classification certificate stating the age group for which they are suitable. We do not think that is necessary, given that there is already widespread, voluntary uptake of approval classification systems in online gaming.

--- Later in debate ---
Lord Moylan Portrait Lord Moylan (Con)
- View Speech - Hansard - - - Excerpts

My Lords, before I speak to my Amendment 9, which I will be able to do fairly briefly because a great deal of the material on which my case rests has already been given to the Committee by the noble Baroness, Lady Fox of Buckley, I will make the more general and reflective point that there are two different views in the Committee that somehow need to be reconciled over the next few weeks. There is a group of noble Lords who are understandably and passionately concerned about child safety. In fact, we all share that concern. There are others of us who believe that this Bill, its approach and the measures being inserted into it will have massive ramifications outside the field of child safety, for adults, of course, but also for businesses, as the noble Baroness explained. The noble Baroness and I, and others like us, believe that these are not sufficiently taken into account either by the Bill or by those pressing for measures to be harsher and more restrictive.

Some sort of balance needs to be found. At Second Reading my noble friend the Minister said that the balance had been struck in the right place. It is quite clear that nobody really agrees with that, except on the principle, which I think is always a cop-out, that if everyone disagrees with you, you must be right, which I have never logically understood in any sense at all. I hope my noble friend will not resort to claiming that he has got it right simply because everyone disagrees with him in different ways.

My amendment is motivated by the considerations set out by the noble Baroness, which I therefore do not need to repeat. It is the Government’s own assessment that between 20,000 and 25,000 businesses will be affected by the measures in this Bill. A great number of those—some four-fifths—are small businesses or micro-businesses. The Government appear to think in their assessment that only 120 of those are high risk. The reason they think they are high risk is not that they are engaged in unpleasant activities but simply that they are engaged in livestreaming and contacting new people. That might be for nefarious purposes but equally, it might not, so the 120 we need to worry about could actually be a very small number. We handle this already through our own laws; all these businesses would still be subject to existing data protection laws and would still have to comply with the law generally on what they are allowed to publish and broadcast. It would not be a free-for-all or a wild west, even among that very small number of businesses.

My Amendment 9 takes a slightly different approach to dealing with this. I do not in any way disagree with or denigrate the approach taken by the noble Baroness, Lady Fox, but my approach would be to add two categories to the list of exemptions in the schedules. The first of these is services provided by small and medium-sized enterprises. We do not have to define those because there is already a law that helps define them for us: Section 33 of the Small Business, Enterprise and Employment Act 2015. My proposal is that we take that definition, and that those businesses that comply with it be outside the scope of the Bill.

The second area that I would propose exempting was also referred to by the noble Baroness, Lady Fox of Buckley: community-based services. The largest of these, and the one that frequently annoys us because it gets things wrong, is Wikipedia. I am a great user of Wikipedia but I acknowledge that it does make errors. Of course, most of the errors it makes, such as saying, “Lord Moylan has a wart on the end of his nose”, would not be covered by the Bill anyway. Nothing in the Bill will force people to correct factual statements that have been got wrong—my year of birth or country of birth, or whatever. That is not covered. Those are the things they usually get wrong and that normally annoy us when we see them.

However, I do think that these services are extremely valuable. Wikipedia is an immense achievement and a tremendous source of knowledge and information for people. The fact that it has been put together in this organic, community-led way over a number of years, in so many languages, is a tremendous advantage and a great human advance. Yet, under the proposed changes, Wikipedia would not be able to operate its existing model of people posting their comments.

Currently, you go on Wikipedia and you can edit it. Now, I know this would not apply to any noble Lords but, in the other place, it has been suggested that MPs have discovered how to do this. They illicitly and secretly go on to their own pages and edit them, usually in a flattering way, so it is possible to do this. There is no prior restraint, and no checking in advance. There are moderators at Wikipedia—I do not know whether they are employed—who review what has been done over a period, but they do not do what this Bill requires, which is checking in advance.

It is not simply about Wikipedia; there are other community sites. Is it sensible that Facebook should be responsible if a little old lady alters the information on a community Facebook page about what is happening in the local parish? Why should Facebook be held responsible for that? Why would we want it to be responsible for it—and how could it do it without effectively censoring ordinary activities that people want to carry out, using the advantages of the internet that have been so very great?

What I am asking is not dramatic. We have many laws in which we very sensibly create exemptions for small and medium-sized enterprises. I am simply asking that this law be considered under that heading as well, and similarly for Wikipedia and community-based sites. It is slightly unusual that we have had to consider that; it is not normal, but it is very relevant to this Bill and I very much hope the Government will agree to it.

The answer that I would not find satisfactory—I say this in advance for the benefit of my noble friend the Minister, in relation to this and a number of other amendments I shall be moving in Committee—is that it will all be dealt with by Ofcom. That would not be good enough. We are the legislators and we want to know how these issues will be dealt with, so that the legitimate objectives of the Bill can be achieved without causing massive disruption, cost and disbenefit to adults.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I rise to speak in support of Amendment 9, tabled by the noble Lord, Lord Moylan, and in particular the proposed new paragraph 10A to Schedule 1. I hope I will find myself more in tune with the mood of the Committee on this amendment than on previous ones. I would be interested to know whether any noble Lords believe that Ofcom should be spending its limited resources supervising a site like Wikipedia under the new regime, as it seems to me patently obvious that that is not what we intend; it is not the purpose of the legislation.

The noble Lord, Lord Moylan, is right to remind us that one of the joys of the internet is that you buy an internet connection, plug it in and there is a vast array of free-to-use services which are a community benefit, produced by the community for the community, with no harm within them. What we do not want to do is interfere with or somehow disrupt that ecosystem. The noble Baroness, Lady Fox, is right to remind us that there is a genuine risk of people withdrawing from the UK market. We should not sidestep that. People who try to be law-abiding will look at these requirements and ask themselves, “Can I meet them?” If the Wikimedia Foundation that runs Wikipedia does not think it can offer its service in a lawful way, it will have to withdraw from the UK market. That would be to the detriment of children in the United Kingdom, and certainly not to their benefit.

There are principle-based and practical reasons why we do not want Ofcom to be operating in this space. The principle-based one is that it makes me uncomfortable that a Government would effectively tell their regulator how to manage neutral information sites such as Wikipedia. There are Governments around the world who seek to do that; we do not want to be one of those.

The amendment attempts to define this public interest, neutral, informational service. It happens to be user-to-user but it is not like Facebook, Instagram or anything similar. I would feel much more comfortable making it clear in law that we are not asking Ofcom to interfere with those kinds of services. The practical reason is the limited time Ofcom will have available. We do not want it to be spending time on things that are not important.

Definitions are another example of how, with the internet, it can often be extremely hard to draw bright lines. Functionalities bleed into each other. That is not necessarily a problem, until you try to write something into law; then, you find that your definition unintentionally captures a service that you did not intend to capture, or unintentionally misses out a service that you did intend to be in scope. I am sure the Minister will reject the amendment because that is what Ministers do; but I hope that, if he is not willing to accept it, he will at least look at whether there is scope within the Bill to make it clear that Wikipedia is intended to be outside it.

Paragraph 4 of Schedule 1 refers to “limited functionality services”. That is a rich vein to mine. It is clear that the intention is to exclude mainstream media, for example. It refers to “provider content”. In this context, Encyclopaedia Britannica is not in scope but Wikipedia is, the difference being that Wikipedia is constructed by users, while Encyclopaedia Britannica is regarded as being constructed by a provider. The Daily Mail is outside scope; indeed, all mainstream media are outside scope. Anyone who declares themselves to be media—we will debate this later on—is likely to be outside scope.

Such provider exemption should be offered to other, similar services, even if they happen to be constructed from the good will of users as opposed to a single professional author. I hope the Minister will be able to indicate that the political intent is not that we should ask Ofcom to spend time and energy regulating Wikipedia-like services. If so, can he point to where in the legislation we might get that helpful interpretation, in order to ensure that Ofcom is focused on what we want it to be focused on and not on much lower priority issues?

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

I will speak to a couple of the amendments in this group. First, small is not safe, and you cannot necessarily see these platforms in isolation. For example, there is an incel group that has only 4,000 active users, but it posts a great deal on YouTube and has 24.2 million users in that context. So we have to be clear that small and safe are not the same thing.

However, I am sympathetic to the risk-based approach. I should probably have declared an interest as someone who has given money to Wikipedia on several occasions to keep it going. I ask the Minister for some clarity on the systems and processes of the Bill, and whether the risk profile of Wikipedia—which does not entice you in and then follow you for the next six months once you have looked at something—is far lower than something very small that gets hold of you and keeps on going. I say that particularly in relation to children, but I feel it for myself also.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, it is a pleasure to follow the noble Baroness, Lady Harding. If one is permitted to say this in the digital age, I am on exactly the same page as she is.

There are two elements to the debate on this group. It is partly about compliance, and I absolutely understand the point about the costs of that, but I also take comfort from some of the things that the noble Lord, Lord Vaizey, said about the way that Ofcom is going to deliver the regulation and the very fact that this is going to be largely not a question of interpretation of the Act, when it comes down to it, but is going to be about working with the codes of practice. That will be a lot more user-friendly than simply having to go to expensive expert lawyers, as the noble Baroness, Lady Fox, said—not that I have anything against expensive expert lawyers.

I am absolutely in agreement with the noble Baroness, Lady Kidron, that small is not safe. As the noble Baroness, Lady Harding, described, small can become big. We looked at this in our Joint Committee and recommended to the Government that they should take a more nuanced approach to regulation, based not just on size and high-level functionality but on factors such as risk, reach, user base, safety performance and business model. All those are extremely relevant but risk is the key, right at the beginning. The noble Baroness, Lady Fox, also said that Reddit should potentially be outside, but Reddit has had its own problems, as we know. On that front, I am on absolutely the same page as those who have spoken about keeping us where we are.

The noble Lord, Lord Moylan, has been very cunning in the way that he has drawn up his Amendment 9. I am delighted to be on the same page as my noble friend—we are making progress—but I agree only with the first half of the amendment because, like the noble Baroness, Lady Kidron, I am a financial contributor to Wikipedia. A lot of us depend on Wikipedia; we look up the ages of various Members of this House when we see them in full flight and think, “Good heavens!” Biographies are an important part of this area. We have all had Jimmy Wales saying, as soon as we get on to Wikipedia, “You’ve already looked at Wikipedia 50 times this month. Make a contribution”, and that is irresistible. There is quite a strong case there. It is risk-based, so it is not inconsistent with the line taken by a number of noble Lords in all this. I very much hope that we can get something out of the Minister—maybe some sort of sympathetic noises for a change—at this stage so that we can work up something.

I must admit that the briefing from Wikimedia, which many of us have had, was quite alarming. If the Bill means that we do not have users in high-risk places then we will find that adults get their information from other sources that are not as accurate as Wikipedia—maybe from ChatGPT or GPT-4, which the noble Lord, Lord Knight, is clearly very much an expert in—and that marginalised websites are shut down.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

For me, one of the features of the schedule’s list of exempted sites is foreign state entities. Therefore, we could end up in the absurd situation where you could not read about the Ukraine war on Wikipedia, but you would be able to read about the Ukraine war on the Russian Government website.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, if we needed an example of something that gave us cause for concern, that would be it; but a very good case has been made, certainly for the first half of the amendment in the name of the noble Lord, Lord Moylan, and we on these Benches support it.

--- Later in debate ---
Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

My Lords, I have to start with a slightly unprofessional confession. I accepted the Bill team’s suggestion on how my amendments might be grouped after I had grouped them rather differently. The result is that I am not entirely clear why some of these groupings are quite as they are. As my noble friend the Minister said, my original idea of having Amendments 9, 10 and 11 together would perhaps have been better, as it would have allowed him to give a single response on Wikipedia. Amendments 10 and 11 in this group relate to Wikipedia and services like it.

I am, I hope, going to cause the Committee some relief as I do not intend to repeat remarks made in the previous group. The extent to which my noble friend wishes to amplify his comments in response to the previous group is entirely a matter for him, since he said he was reserving matter that he would like to bring forward but did not when commenting on the previous group. If I do not speak further on Amendments 10 and 11, it is not because I am not interested in what my noble friend the Minister might have to say on the topic of Wikipedia.

To keep this fairly brief, I turn to Amendment 26 on age verification. I think we have all agreed in the Chamber that we are united in wanting to see children kept safe. On page 10 of the Bill, in Clause 11(3), it states that there will be a duty to

“prevent children of any age from encountering”

this content—“prevent” them “encountering” is extremely strong. We do not prevent children encountering the possibility of buying cigarettes or encountering the possibility of being injured crossing the road, but we are to prevent children from these encounters. It is strongly urged in the clause—it is given as an example—that age verification will be required for that purpose.

Of course, age verification works only if it applies to everybody: one does not ask just the children to prove their age; one has to ask everybody online. Unlike when I go to the bar in a pub, my grey hair cannot be seen online. So this provision will almost certainly have to extend to the entire population. In Clause 11(3)(b), we have an obligation to protect. Clearly, the Government intend a difference between “prevent” and “protect”, or they would not have used two different verbs, so can my noble friend the Minister explain what is meant by the distinction between “prevent” and “protect”?

My amendment would remove Clause 11(3) completely. But it is, in essence, a probing amendment and what I want to hear from the Government, apart from how they interpret the difference between “prevent” and “protect”, is how they expect this duty to be carried out without having astonishingly annoying and deterring features built into every user-to-user platform and website, so that every time one goes on Wikipedia—in addition to dealing with the GDPR, accepting cookies and all the other nonsense we have to go through quite pointlessly—we then have to provide age verification of some sort.

What mechanism that might be, I do not know. I am sure that there are many mechanisms available for age verification. I do not wish to get into a technical discussion about what particular techniques might be used—I accept that there will be a range and that they will respond and adapt in the light of demand and technological advance—but I would like to know what my noble friend the Minister expects and how wide he thinks the obligation will be. Will it be on the entire population, as I suspect? Focusing on that amendment—and leaving the others to my noble friend the Minister to respond to as he sees fit—and raising those questions, I think that the Committee would like to know how the Government imagine that this provision will work. I beg to move.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I will speak to the amendments in the name of the noble Lord, Lord Moylan, on moderation, which I think are more important than he has given himself credit for—they go more broadly than just Wikipedia.

There is a lot of emphasis on platform moderation, but the reality is that most moderation of online content is done by users, either individually or in groups, acting as groups in the space where they operate. The typical example, which many Members of this House have experienced, is when you post something and somebody asks, “Did you mean to post that?”, and you say, “Oh gosh, no”, and then delete it. A Member in the other place has recently experienced a rather high-profile example of that through the medium of the newspaper. On a much smaller scale, it is absolutely typical that people take down content every day, either because they regret it or, quite often, because their friends, families or communities tell them that it was unwise. That is the most effective form of moderation, because it is the way that people learn to change their behaviour online, as opposed to the experience of a platform removing content, which is often experienced as the big bad hand of the platform. The person does not learn to change their behaviour, so, in some cases, it can reinforce bad behaviour.

Community moderation, not just on Wikipedia but across the internet, is an enormous public good, and the last thing that we want to do in this legislation is to discourage people from doing it. In online spaces, that is often a volunteer activity: people give up their time to try to keep a space safe and within the guidelines they have set for that space. The noble Lord, Lord Moylan, has touched on a really important area: in the Bill, we must be absolutely clear to those volunteers that we will not create all kinds of new legal operations and liabilities on them. These are responsible people, so, if they are advised that they will incur all kinds of legal risk when trying to comply with the Online Safety Bill, they will stop doing the moderation—and then we will all suffer.

On age-gating, we will move to a series of amendments where we will discuss age assurance, but I will say at the outset, as a teaser to those longer debates, that I have sympathy with the points made by the noble Lord, Lord Moylan. He mentioned pubs—we often talk about real-world analogies. In most of the public spaces we enter in the real world, nobody does any ID checking or age checking; we take it on trust, unless and until you carry out an action, such as buying alcohol, which requires an age check.

It is legitimate to raise this question, because where we fall in this debate will depend on how we see public spaces. I see a general-purpose social network as equivalent to walking into a pub or a town square, so I do not expect to have my age and ID checked at the point at which I enter that public space. I might accept that my ID is checked at a certain point where I carry out various actions. Others will disagree and will say that the space should be checked as soon as you go into it—that is the boundary of the debate we will have across a few groups. As a liberal, I am certainly on the side that says that it is incumbent on the person wanting to impose the extra checks to justify them. We should not just assume that extra checks are cost-free and beneficial; they have a cost for us all, and it should be imposed only where there is a reasonable justification.

Baroness Kidron (CB)

Far be it from me to suggest that all the amendments tabled by the noble Lord, Lord Moylan, are in the wrong place, but I think that Amendment 26 might have been better debated with the other amendments on age assurance.

On community moderation, I underscore the point that Ofcom must have a risk profile as part of its operations. When we get to that subject, let us understand what Ofcom intends to do with it; maybe we should instruct Ofcom a little about what we would like it to do with it for community moderation. I have a lot of sympathy, though I do not think it is a get-out clause, with seeing some spaces as less risky or, at least, with determining what risky looks like in online spaces, which is a different question. This issue belongs in the risk profile: it is not about taking things out of the Bill; we have to build it into the Bill we have.

On age assurance and AV, I do not think that today is the day to discuss it in full. I disagree with the point that, because we are checking children, we all have to be checked; that is not where the technology is. Without descending into technical arguments, as the noble Lord, Lord Moylan, asked us not to, we will bring some of those issues forward.

The noble Lords, Lord Bethell and Lord Stevenson, and the right reverend Prelate the Bishop of Oxford have a package of amendments which are very widely supported across the Committee. They have put forward a schedule of age assurance that says what the rules of the road are. We must stop pretending that age assurance is something that is being invented now in this Bill. If you log into a website with your Facebook login, it shares your age, and that login is used by 42% of people online. However, if you use an Apple login, it does not share your age, so I recommend using Apple; interestingly, though, that option is harder to find on websites, because websites want to know your age.

So, first, we must not treat age assurance as if it has just been invented. Secondly, we need to start to have rules of the road, and ask what is acceptable, what is proportionate, and when we will have zero tolerance. Watching faces around the Committee, I say that I will accept zero tolerance for pornography and some other major subjects, but, for the most part, age assurance is something that we need to have regulated. Currently, it is being done to us rather than in any way that is transparent or agreed, and that is very problematic.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

My Lords, this group of government amendments relates to risk assessments; it may be helpful if I speak to them now as the final group before the dinner break.

Risk management is at the heart of the Bill’s regulatory framework. Ofcom and services’ risk assessments will form the foundation for protecting users from illegal content and content which is harmful to children. They will ensure that providers thoroughly identify the risks on their own websites, enabling them to manage and mitigate the potential harms arising from them. Ofcom will set out the risks across the sector and issue guidance to companies on how to conduct their assessments effectively. All providers will be required to carry out risk assessments, keep them up to date and update them before making a significant change to the design or operation of their service which could put their users at risk. Providers will then need to put in place measures to manage and mitigate the risks they identify in their risk assessments, including any emerging risks.

Given how crucial the risk assessments are to this framework, it is essential that we enable them to be properly scrutinised by the public. The government amendments in this group will place new duties on providers of the largest services—that is, category 1 and 2A services—to publish summaries of their illegal and child safety risk assessments. Through these amendments, providers of these services will also have a new duty to send full records of their risk assessments to Ofcom. This will increase transparency about the risk of harm on the largest platforms, clearly showing how risk is affected by factors such as the design, user base or functionality of their services. These amendments will further ensure that the risk assessments can be properly assessed by internet users, including by children and their parents and guardians, by ensuring that summaries of the assessments are publicly available. This will empower users to make informed decisions when choosing whether and how to use these services.

It is also important that Ofcom is fully apprised of the risks identified by service providers. That is why these amendments introduce duties for both category 1 and 2A services to send their records of these risk assessments, in full, to Ofcom. This will make it easier for Ofcom to supervise compliance with the risk assessment duties, as well as other duties linked to the findings of the risk assessments, rather than having to request the assessments from companies under its information-gathering powers.

These amendments also clarify that companies must keep a record of all aspects of their risk assessments, which strengthens the existing record-keeping duties on services. I hope that noble Lords will welcome these amendments. I beg to move.

Lord Allan of Hallam (LD)

My Lords, it is risky to stand between people and their dinner, but I rise very briefly to welcome these amendments. We should celebrate the good stuff that happens in Committee as well as the challenging stuff. The risk assessments are, I think, the single most positive part of this legislation. Online platforms already do a lot of work to understand the risks arising on their services, but that work never sees the light of day except when it is leaked by a whistleblower, at which point we have a very imperfect debate around it.

The fact that platforms will have to do a formal risk assessment and share it with a third-party regulator is huge progress; it will create a very positive dynamic. The fact that the public will be able to see those risk assessments and make their own judgments about which services to use, according to how well those services have carried them out, is, again, a massive public benefit. We should welcome the fact that risk assessments are there and the improvements that this group of amendments makes to them. I hope that was short enough.

Baroness Kidron (CB)

I also welcome these amendments, but I have two very brief questions for the Minister. First, in Amendment 27A, it seems that the child risk assessment is limited to category 1 services and will be published only in the terms of service. As he probably knows, 98% of people do not read terms of service, so I wondered where else we might find this, or whether there is a better way of dealing with it.

My second question is to do with Amendments 64A and 88A. It seems to me, forgive me if I am wrong, that the Bill previously stipulated that all regulated search and user-to-user services had to make and keep a written record of any measure taken in compliance with a relevant duty, but it now appears to have rowed back to only category 1 and 2A services. I may be wrong on that, but I would like to check it for the record.