Lord Allan of Hallam debates involving the Department for Digital, Culture, Media & Sport during the 2019 Parliament

Wed 6th Sep 2023
Wed 19th Jul 2023
Mon 17th Jul 2023
Wed 12th Jul 2023
Mon 10th Jul 2023 - Online Safety Bill, Lords Chamber, Report stage: Part 1
Mon 10th Jul 2023 - Online Safety Bill, Lords Chamber, Report stage: Part 2
Thu 6th Jul 2023 - Online Safety Bill, Lords Chamber, Report stage: Part 2
Thu 6th Jul 2023 - Online Safety Bill, Lords Chamber, Report stage: Part 1 & Report stage: Minutes of Proceedings
Thu 6th Jul 2023 - Online Safety Bill, Lords Chamber, Report stage: Part 3
Thu 22nd Jun 2023 - Online Safety Bill, Lords Chamber, Committee stage: Part 1

Online Safety Bill

Lord Allan of Hallam Excerpts
Baroness Ritchie of Downpatrick (Lab)

My Lords, I, too, thank the Minister for his engagement and for the amendments he has tabled at various stages throughout the passage of the Bill.

Amendment 15 provides a definition:

““age assurance” means age verification or age estimation”.

When the Minister winds up, could he provide details of the framework or timetable for its implementation? While we all respect that implementation must be delivered quickly, age verification provisions will be worthless unless there is swift enforcement action against those who transgress the Bill’s provisions. Will the Minister comment on enforcement and an implementation framework with direct reference to Amendment 15?

Lord Allan of Hallam (LD)

My Lords, as this is a new stage of the Bill, I need to refer again to my entry in the register of interests. I have no current financial interest in any of the regulated companies for which I used to work, in one of which I held a senior role for a decade.

I welcome Amendment 7 and those following from it which change the remote access provision. The change from “remote access” to “view remotely” is quite significant. I appreciate the Minister’s willingness to consider it and particularly the Bill team’s creativity in coming up with this new phrasing. It is much simpler and clearer than the phrasing we had before. We all understand what “view remotely” means. “Access” could have been argued over endlessly. I congratulate the Minister and the team for simplifying the Bill. It again demonstrates the value of some of the scrutiny we carried out on Report.

It is certainly rational to enable some form of viewing in some circumstances, not least where the operations of the regulated entities are outside the United Kingdom and where Ofcom has a legitimate interest in observing tests that are being carried out. The remote access, or the remote viewing facility as it now is, will mean it can do this without necessarily sending teams overseas. This is more efficient, as the Minister said. As this entire regime is going to be paid for by the regulated entities, they have an interest in finding cheaper and more efficient methods of carrying out the supervision than teams going from London to potentially lots of overseas destinations. Agreement between the provider and Ofcom that this form of remote viewing is the most efficient will be welcomed by everybody. It is certainly better than the other option of taking data off-site. I am glad to see that, through the provisions we have in place, we will minimise the instances where Ofcom feels it needs data from providers to be taken off-site to some other facility, which is where a lot of the privacy risks come from.

Can the Minister give some additional assurances at some stage either in his closing remarks or through any follow-up correspondence? First, the notion of proportionality is implicit, but it would help for it to be made explicit. Whenever Ofcom is using the information notices, it should always use the least intrusive method. Yes, it may need to view some tests remotely, but only where the information could not have been provided in written form, for example, or sent as a document. We should not immediately escalate to remote viewing if we have not tried less intrusive methods. I hope that notion of proportionality and least intrusion is implicit within it.

Secondly, concerns remain around live user data. I heard the Minister say that the intention is to use test data sets. That needs to be really clear. It is natural for people to be concerned that their live user data might be exposed to anyone, be it a regulator or otherwise. Of course, we expect Ofcom staff to behave with propriety, but there have sadly been instances where individuals have taken data that they have observed, whether they were working for the police, the NHS or any other entity, and abused it. The safest safeguard is for there to be no access to live user data. I hope the Minister will go as far as he can in saying that that is not the intention.

--- Later in debate ---
Baroness Stowell of Beeston (Con)

My Lords, I shall ask my noble friend the Minister a question about encryption but, before I do, I will briefly make a couple of other points. First, I echo all the tributes paid around the House to those involved in this legislation. It is no secret that I would have preferred the Bill to be about only child safety, so I particularly congratulate the Government, and the various Members who focused their efforts in that area, on what has been achieved via the Bill.

That said, the Government should still consider other non-legislative measures, such as banning smartphones in schools and government guidance for parents on things such as the best age at which to allow their children to have their own smartphones. These may not be points for DCMS, but they are worth highlighting at this point, as the Bill leaves us, soon to become legislation.

As I said on Report, I remain concerned about the reintroduction of some protections for adults, in lieu of “legal but harmful”, without any corresponding amendments to reinforce to Ofcom that freedom of expression must be the top priority for adults. We now have to leave it to Ofcom and see what happens. I know that the current leadership is deeply conscious of its responsibilities.

On encryption, I was pleased to hear what my noble friend said when he responded to the debate at Third Reading. If he is saying that the technology not existing means that Clause 122 cannot be deployed, as it were, by Ofcom, does that mean that the oversight measures that currently exist would not be deployed? As my noble friend will recall, one of the areas that we were still concerned about in the context of encryption was that what was in the Bill did not mirror what exists for RIPA. I am not sure whether that means that, because Clause 122 has been parked, our oversight concerns have been parked too. It would be helpful if the Minister could clarify that.

In the meantime, in the absence of Clause 122, it is worth us all reinforcing again that we want the tech firms to co-operate fully with law enforcement, either because a user has alerted them to illegal activity or when law enforcement suspects criminal behaviour and seeks their help. In that latter context, it would be helpful to understand what the Minister has said and to know what oversight that might involve. I congratulate my noble friend on this marathon Bill, and I am sorry to have delayed its passing.

Lord Allan of Hallam (LD)

My Lords, I will make a short contribution so that I do not disappoint the noble Lord, Lord Moylan; I will make a few direct and crunchy comments. First, I thank colleagues who participated in the debate for giving me a hearing, especially when I raised concerns about their proposals. It has been a constructive process, where we have been, as the Minister said, kicking the tyres, which is healthy in a legislature. It is better to do it now than to find faults when something has already become law.

I am in the unusual position of having worked on problems comparable to those we are now placing on Ofcom’s desk. I have enormous empathy for it and for the hard work we are giving it. I do not think we should underestimate just how difficult this job is.

I want to thank the Minister for the additional clarification of how Ofcom will give orders to services that provide private communications. Following on from what the noble Baroness, Lady Stowell, said, I think this is a challenging area. We want Ofcom to give orders where this is easy—for example, to an unencrypted service hosting child sexual abuse material. The technology can be deployed today and is uncontroversial, so it is important that we do not forget that.

I heard the Minister say that we do not want Ofcom to move so fast that it breaks encryption. It should be moving but it should be careful. Those are the fears that have been expressed outside: on the day that this becomes law, Ofcom will issue orders to services providing encrypted communications that they will not be able to accept and therefore they will leave the UK. I think I heard from the Minister today that this is not what we want Ofcom to do. At the same time, as the noble Baroness, Lady Stowell, said, we are not expecting Ofcom to ease off; any online service should be doing everything technically possible and feasible to deal with abhorrent material.

I humbly offer three pieces of advice to Ofcom as we pass the baton to it. This is based on having made a lot of mistakes in the past. If I had been given this advice, I might have done a better job in my previous incarnation. First, you cannot overconsult; Ofcom should engage with all interested parties, including those who have talked to us throughout the process of the Bill. It should engage with them until it is sick of engaging with them and then it should engage some more. In particular, Ofcom should try to bring together diverse groups, so I hope it gets into a room the kind of organisations that would be cheering on the noble Lord, Lord Moylan, as well as those that would be cheering on the noble Baroness, Lady Kidron. If Ofcom can bring them into the room, it has a chance of making some progress with its regulations.

Secondly, be transparent. The more information that Ofcom provides about what it is doing, the less space it will leave for people to make up things about what it is doing. I said this in the previous debate about the access request but it applies across the piece. We are starting to see some of this in the press. We are here saying that it is great that we now have a government regulator—independent but part of the UK state—overseeing online services. As soon as that happens, we will start to see the counterreaction of people being incredibly suspicious that part of the UK state is now overseeing their activity online. The best way to combat that is for Ofcom to be as transparent as possible.

Thirdly, explain the trade-offs you are making. This legislation necessarily involves trade-offs. I heard it again in the Minister’s opening remarks: we have indulged in a certain amount of cakeism. We love freedom of expression but we want the platforms to get rid of all the bad stuff. The rubber is going to hit the road once Ofcom has the powers and, in many cases, it will have to decide between one person’s freedom of expression and another’s harm. My advice is not to pretend that you can make both sides happy; you are going to disappoint someone. Be honest and frank about the trade-offs you have made. The legislation has lots of unresolved trade-offs in it because we are giving lots of conflicting instructions. As politicians, we can ride that out, but when Ofcom gets this and has to make real decisions, my advice would be to explain the trade-offs and be comfortable with the fact that some people will be unhappy. That is the only way it will manage to maintain confidence in the system. With that, I am pleased that the Bill has got to this stage and I have a huge amount of confidence in Ofcom to take this and make a success of it.

Lord Bethell (Con)

I rise briefly to raise the question of access to data by academics and research organisations. Before I do so, I want to express profound thanks to noble Lords who have worked so collaboratively to create a terrific Bill that will completely transform and hold to account those involved in the internet, and make it a safer place. That was our mission and we should be very proud of that. I cannot single out noble Peers, with the exception of the noble Baroness, Lady Kidron, with whom I worked collaboratively both on age assurance and on harms. It was a partnership I valued enormously and hope to take forward. Others from all four corners of the House contributed to the parts of the Bill that I was particularly interested in. As I look around, I see so many friends who stuck their necks out and spoke so movingly, for which I am enormously grateful.

The question of data access is one of the loose ends that did not quite make it into the Bill. I appreciate the efforts of my noble friend the Minister, the Secretary of State and the Bill team in this matter, and their attempts to wangle it in; I accept that it did not quite make it. I would like to hear reassurance from my noble friend that this is something that the Government are prepared to look at in future legislation. If he could provide any detail on how and in which legislation it could be revisited, I would be enormously grateful.

Online Safety Bill

Lord Allan of Hallam Excerpts
Lord Allan of Hallam (LD)

My Lords, I am mindful of the comments of the noble Lord, Lord Stevenson, urging us to be brief. I add a note of welcome to the mechanism that has been set out.

In this legislation, we are initiating a fundamental change to the way in which category 1 providers will run their reporting systems, in that prior to this they have not had any external oversight. Ofcom’s intervention will be material, given that online service providers will have to explain to Ofcom what they are doing and why.

We should note that we are also asking providers to do some novel prioritisation. The critical thing with all these reporting systems is that they operate at such huge volumes. I will not labour the points, but if noble Lords are interested they can look at the Meta and YouTube transparency reports, where it is explained that they are actioning tens of millions of pieces of content each month, on the basis of hundreds of millions of reports. If you get even 1% of 10 million reports wrong, that is 100,000 errors. We should have in mind the scale we are operating at. Ofcom will not be able to look at each one of those, but I think it will be able to produce a valuable system and make sure that quality control is improved across those systems, working with the providers. Having additional powers to create an alternative dispute resolution mechanism, where one does not exist and where one would prove useful, is helpful. However, the slow and steady approach of seeing what will happen with those systems under Ofcom supervision before jumping into the next stage is right.

I also note that we are asking platforms to do some prioritisation in the rest of the Online Safety Bill. For example, we are saying that we wish journalistic and politician content to be treated differently from ordinary user content. All of those systems need to be bedded in, so it makes sense to do it at a reasonable pace.

I know that the noble Baroness, Lady Newlove, who cannot be here today, was also very interested in this area and wanted to make sure we made the point that the fact there is a reasonable timescale for the review does not mean that we should take our foot off the pedal now for our expectations for category 1 service providers. I think I heard that from the Minister, but it would be helpful for him to repeat it. We will be asking Ofcom to keep the pressure on to get these systems right now, and not just wait until it has done the report and then seek improvements at that stage. With that—having been about as brief as I can be— I will sit down.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

My Lords, Amendments 238A and 238D seek to change the parliamentary process for laying—oh, I am skipping ahead with final day of Report enthusiasm.

As noble Lords know, companies will fund the costs of Ofcom’s online safety functions through annual fees. This means that the regime which the Bill ushers in will be cost neutral to the taxpayer. Once the fee regime is operational, regulated providers with revenue at or above a set threshold will be required to notify Ofcom and to pay a proportionate fee. Ofcom will calculate fees with reference to the provider’s qualifying worldwide revenue.

The Delegated Powers and Regulatory Reform Committee of your Lordships’ House has made two recommendations relating to the fee regime which we have accepted, and the amendments we are discussing in this group reflect this. In addition, we are making a further change to definitions to ensure that Ofcom can collect proportionate fees.

A number of the amendments in my name relate to qualifying worldwide revenue. Presently, the Bill outlines that this should be defined in a published statement laid before Parliament. Your Lordships’ committee advised that it should be defined through regulations subject to the affirmative procedure. We have agreed with this and are proposing changes to Clause 76 so that Ofcom can make provisions about qualifying worldwide revenue by regulations which, as per the committee’s recommendations, will be subject to the affirmative procedure.

Secondly, the committee recommended that we change the method by which the revenue threshold is defined. Presently, as set out in the Bill, it is set by the Secretary of State in a published statement laid before Parliament. The committee recommended that the threshold be set through regulations subject to the negative procedure and we are amending Clause 77 to make the recommended change.

Other amendments seek to make a further change to enable Ofcom to collect proportionate fees from providers. A provider of a regulated service the qualifying worldwide revenue of which is equal to, or greater than, the financial threshold will be required to notify Ofcom and pay an annual fee, calculated by reference to its qualifying worldwide revenue. Currently, this means that that fee calculation can be based only on the revenue of the regulated provider. The structure of some technology companies, however, means that how they accrue revenue is not always straightforward. The entity which meets the definition of a provider may therefore not be the entity which generates revenue referable to the regulated service.

Regulations to be made by Ofcom about the qualifying worldwide revenue will therefore be able to provide that the revenue accruing to certain entities in the same group as a provider of a regulated service can be taken into account for the purposes of determining qualifying worldwide revenue. This will enable Ofcom, when making such regulations, to make provisions, if necessary, to account for instances where a provider has a complex group structure; for example, where the regulated provider might accrue only a portion of the revenue referrable to the regulated service, the rest of which might be accrued by other entities in the group’s structure. These amendments to Clause 76 address these issues by allowing Ofcom to make regulations which provide that the revenue from certain other entities within the provider’s group structure can be taken into account. I beg to move.

Lord Allan of Hallam (LD)

My Lords, we have not talked much about fees in our consideration of the Bill, and I will not talk much about them today, but there are some important questions. We should not skip too lightly over the fact that we will be levying revenues from online providers. That might have a significant impact on the markets. I have some specific questions about this proposed worldwide revenue method but I welcome these amendments and that we will now be getting a better procedure. This will also allow the Minister to say, “All these detailed points can be addressed when these instruments come before Parliament”. That is a good development. However, there are three questions that are worth putting on the record now so that we have time to think about them.

First, what consideration will be given to the impact on services that do not follow a classic revenue model but instead rely on donations and other sorts of support? I know that we will come back to this question in a later group but there are some very large internet service providers that are not the classic advertising-funded model, instead relying on foundations and other things. They will have significant questions about what we would judge their qualifying worldwide revenue to be, given that they operate to these very different models.

The second question concerns the impact on services that may have a very large footprint outside the UK, and significant worldwide revenues, but which do very little business within the UK. The amendment that the Minister has tabled about group revenues is also relevant here. You can imagine an entity which may be part of a very large worldwide group making very significant revenues around the world. It has a relatively small subsidiary that is offering a service in the UK, with relatively low revenues. There are some important questions there around the potential impact of the fees on decision-making within that group. We have discussed how we do not want to end up with less choice for consumers of services in the UK. There is an interesting question there as to whether getting the fee level wrong might lead to worldwide entities saying, “If you’re going to ask me to pay a fee based on my qualifying worldwide revenue, the UK market is just not worth it”. That may be particularly true if, for example, the European Union and other markets are also levying a fee. You can see a rational business choice of, “We’re happy to pay the fee to the EU but not to Ofcom if it is levied at a rate that is disproportionate to the business that we do here”.

The third and very topical question is about the Government’s thinking about services with declining revenues but whose safety needs are not reducing and may even be increasing. I hope as I say this that people have Twitter in mind, which has very publicly told us that its revenue is going down significantly. It has also very publicly fired most of its trust and safety staff. You can imagine a model within which, because its revenue is declining, it is paying less to Ofcom precisely when Ofcom needs to do more supervision of it.

I hope that we can get some clarity around the Government’s intentions in these circumstances. I have referenced three areas where the qualifying worldwide revenue calculation may go a little awry. The first is where the revenue is not classic commercial income but comes from other sources. The second is where the footprint in the UK is very small but it is otherwise a large global company which we might worry will withdraw from the market. The third, and perhaps most important, is what the Government’s intention is where a company’s revenue is declining while it is managing its platform less well and its need for Ofcom supervision is increasing, and what we would expect to happen to the fee level in those circumstances.

--- Later in debate ---
The amendment does not remove the sites completely. Those sites promoting suicide, serious self-harm and other activities across society will still continue, but because they will potentially be able to be captured and required to look at their risk assessment, their activities will perhaps at least be curtailed and, to a certain extent, regulated. It seems that the amendment simply provides a level playing field in the core issue of safety, which has been a theme we have addressed right through the Bill. I hope the Minister will accept the amendment as it is; one change of wording could allow Ofcom to do its job so much better. If he does not, I hope the amendment will be strongly supported by all sides of the House.
Lord Allan of Hallam (LD)

My Lords, I am pleased to follow the noble Baroness, Lady Morgan of Cotes, and her amendment, which tries to help parliamentary counsel draft better regulations later on. I am really struggling to see why the Government want to resist something that will make their life easier if they are going to do what we want them to do, which is to catch those high-risk services—as the noble Baroness, Lady Finlay, set out—but also, as we have discussed in Committee and on Report, exclude the low-risk services that have been named, such as Wikipedia and OpenStreetMap.

I asked the Minister on Report how that might happen, and he confirmed that such services are not automatically exempt from the user-to-user services regulations, but he also confirmed that they might be under the subsequent regulations drafted under Schedule 11. That is precisely why we are coming back to this today; we want to make sure that they can be exempt under the regulations drafted under Schedule 11. The test should be: would that be easier under the amended version proposed by the noble Baroness, Lady Morgan, or under the original version? I think it would be easier under the amended version. If the political intent is there to exclude the kind of services that I have talked about—the low-risk services—and I think it should be, because Ofcom should not be wasting time, in effect, supervising services that do not present a risk and, not just that, creating a supervisory model that may end up driving those services out of the UK market because they cannot legally say that they will make the kind of commitments Ofcom would expect them to make, then having two different thresholds, size and functionality, gives the draftspeople the widest possible choice. By saying “or”, we are not saying they cannot set a condition that is “and” or excludes “and”, but “and” does exclude “or”, if I can put it that way. They can come back with a schedule that says, “You must be of this size and have this kind of functionality”, or they could say “this functionality on its own”—to the point made by the two noble Baronesses about some sites. They might say, “Look, there is functionality which is always so high-risk that we do not care what size you are; if you’ve got this functionality, you’re always going to be in”. Again, the rules as drafted at the moment would not allow them to do that; they would have to say, “You need to have this functionality and be of this size. Oh, whoops, by saying that you have to be of this size, we’ve now accidentally caught somebody else whom we did not intend to catch”.

I look forward to the Minister’s response, but it seems entirely sensible that we have the widest possible choice. When we come to consider this categorisation under Schedule 11 later on, the draftspeople should be able to say either “You must be this size and have this functionality” or “If you’ve got this functionality, you’re always in” or “If you’re of this size, you’re always in”, and have the widest possible menu of choices. That will achieve the twin objectives which I think everyone who has taken part in the debate wants: the inclusion of high-risk services, no matter their size, and the exclusion of low-risk services, no matter their size—if they are genuinely low risk. That is particularly in respect of the services we have discussed and which the noble Lord, Lord Moylan, has been a very strong advocate for. In trying to do good, we should not end up inadvertently shutting down important information services that people in this country rely on. Frankly, people would not understand it if we said, “In the name of online safety, we’ve now made it so that you cannot access an online encyclopaedia or a map”.

It is going to be much harder for the draftspeople to draft categorisation under Schedule 11, as it is currently worded, that has the effect of being able to exclude low-risk services. The risk of their inadvertently including them and causing that problem is that much higher. The noble Baroness is giving us a way out and I hope the Minister will stand up and grab the lifeline. I suspect he will not.

Lord Clement-Jones (LD)

My Lords, I welcome the Minister’s Amendment 238A, which I think was in response to the DPRRC report. The sentiment around the House is absolutely clear about the noble Baroness’s Amendment 245. Indeed, she made the case conclusively for the risk basis of categorisation. She highlighted Zoe’s experience and I struggle to understand why the Secretary of State is resisting the argument. She knocked down the ninepins of legal uncertainty, and of how it was broader than children and illegal content, by reference to Clause 12. The noble Baroness, Lady Finlay, added to the knocking down of those ninepins.

Smaller social media platforms will, on the current basis of the Bill, fall outside category 1. The Royal College of Psychiatrists made it pretty clear that the smaller platforms might be less well moderated and more permissive of dangerous content. It is particularly concerned about the sharing of information about methods of suicide or dangerous eating disorder content. Those are very good examples that it has put forward.

I return to the scrutiny committee again. It said that

“a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”

should be adopted. It seems that many small, high-harm services will be excluded unless we go forward on the basis set out by the noble Baroness, Lady Morgan. The kind of breadcrumbing services we have talked about during the passage of the Bill will escape while, on the other hand, sites such as Wikipedia, as mentioned by my noble friend, will be swept into the net despite being low risk.

I have read the letter from the Secretary of State which the noble Baroness, Lady Morgan, kindly circulated. I cannot see any argument in it why Amendment 245 should not proceed. If the noble Baroness decides to test the opinion of the House, on these Benches we will support her.

--- Later in debate ---
Lord Moylan (Con)

My Lords, I am conscious of the imprecation earlier from the noble Lord, Lord Stevenson of Balmacara, that we keep our contributions short, but I intend to take no notice of it. That is for the very good reason that I do not think the public would understand why we disposed of such a momentous matter as bringing to an end end-to-end encryption on private messaging services as a mere technicality and a brief debate at the end of Report.

It is my view that end-to-end encryption is assumed nowadays by the vast majority of people using private messaging services such as WhatsApp, iMessage and Signal. They are unaware, I think, of the fact that it is about to be taken from them by Clause 111 of the Bill. My amendment would prevent that. It is fairly plain; it says that

“A notice under subsection (1)”


of Clause 111

“may not impose a requirement relating to a service if the effect of that requirement would be to require the provider of the service to weaken or remove end-to-end encryption applied in relation to the service”.

My noble friend says that there is no threat of ending end-to-end encryption in his proposal, but he achieves that by conflating two things—which I admit my own amendment conflates, but I will come back to that towards the end. They are the encryption of platforms and the encryption of private messaging services. I am much less concerned about the former. I am concerned about private messaging services. If my noble friend was serious in meaning that there was no threat to end-to-end encryption, then I cannot see why he would not embrace my amendment, but the fact that he does not is eloquent proof that it is in fact under threat, as is the fact that the NSPCC and the Internet Watch Foundation are so heavily lobbying against my amendment. They would not be doing that if they did not think it had a serious effect.

I shall not repeat at any length the technical arguments we had in Committee, but the simple fact is that if you open a hole into end-to-end encryption, as would be required by this provision, then other people can get through that hole, and the security of the system is compromised. Those other people may not be very nice; they could be hostile state actors—we know hostile state actors who are well enough resourced to do this—but they could also be our own security services and others, from whom we expect protection. Normally, we do get a degree of protection from those services, because they are required to have some form of warrant or prior approval but, as I have explained previously in debate on this, these powers being given to Ofcom require no warrant or prior approval in order to be exercised. So there is a vulnerability, but there is also a major assault on privacy. That is the point on which I intend to start my conclusion.

If we reflect for a moment, the evolution of this Bill in your Lordships’ House has been characterised and shaped, to a large extent, by the offer made by the noble Lord, Lord Stevenson of Balmacara, when he spoke at Second Reading, to take a collaborative approach. But that collaborative approach has barely extended to those noble Lords concerned about privacy and freedom of expression. As a result, in my view, those noble Lords rightly promoting child protection have been reckless to the point of overreaching themselves.

If we stood back and had to explain to outsiders that we were taking steps today that took end-to-end encryption and the privacy they expect on their private messaging services away from them, together with the security and protection it gives, of course, in relation to scams and frauds and all the other things where it has a public benefit, then I think they would be truly outraged. I do not entirely understand how the Government think they could withstand that outrage, were it expressed publicly. I actually believe that the battle for this Bill—this part of this Bill, certainly—is only just starting. We may be coming to the end here, but I do not think that this Bill is settled, because this issue is such a sensitive one.

Given the manifest and widespread lack of support for my views on this question in your Lordships’ House in Committee, I will not be testing the opinion of the House today. I think I know what the opinion of the House is, but it is wrong, and it will have to be revised. My noble friend simply cannot stand there and claim that what he is proposing is proportionate and necessary, because it blatantly and manifestly is not.

Lord Allan of Hallam (LD)

My Lords, the powers in Clause 111 are perhaps the most controversial outstanding issue in the Bill. I certainly agree with the noble Lord, Lord Moylan, that they deserve some continued scrutiny. I suspect that Members of another place are being lobbied on this extensively right now. Again, it is one of the few issues that cuts through: they may not have heard of the Online Safety Bill, but they will have done in the context of this particular measure.

We debated the rights and wrongs of encryption at some length in Committee, and I will not repeat those points today, not least because the noble Lord, Lord Moylan, has made some of the arguments as to why encryption is important. I will instead today focus on the future process, assuming that the Clause 111 powers will be available to Ofcom as drafted and that we are not going to accept the amendment from the noble Lord, Lord Moylan.

Amendments 258 and 258ZA, in my name and that of my noble friend Lord Clement-Jones, both aim to improve the process of issuing a Clause 111 order by adding in some necessary checks and balances.

As we debate this group, we should remember that the Clause 111 powers are not specific to encrypted services—I think the Minister made this point—and we should have the broader context in mind. I often try to bring some concrete scenarios to our discussions, and it may be helpful to consider three different scenarios in which Ofcom might reach for a Clause 111 notice.

The first is where a provider has no particular objections to using technology to identify and remove child sexual exploitation and abuse material or terrorist material but is just being slow to do this. There are mature systems out there. PhotoDNA is very well known in the industry and effectively has a database with digital signatures of known child sexual exploitation material. All the services we use on a daily basis such as Facebook, Instagram and others will check uploaded photos against that database and, where it is child sexual exploitation material, they will make sure that it does not get shown and that those people are reported to the authorities.

I can imagine scenarios where Ofcom is dealing with a service which has not yet implemented the technology—but does not have a problem doing it—and the material is unencrypted so there is no technical barrier; it is just being a bit slow. In those scenarios, Ofcom will tell the service to get on with it or it will get a Clause 111 notice. In those circumstances, in most cases the service will just get on with it, so Ofcom will be using the threat of the notice as a way to encourage the slow coaches. That is pretty unexceptional; it will work in a pretty straightforward way. I think the most common use of these notices may be to bring outliers into the pack of those who are following best practice. Ofcom may not even need to issue any kind of warning notice at all and will not get past the warning notice period. Waving a warning notice in front of a provider may be sufficient to get it to move.

The second scenario is one where the provider equally does not object to the use of the technology but would prefer to have a notice before it implements it. Outside the world of tech companies, it may seem a little strange that a provider would want to be ordered to do something rather than doing the right thing voluntarily, but we have to remember that the use of this kind of technology is legally fraught in many jurisdictions. There have been court cases in a number of places, not least the European Union, where there are people who will challenge whether you should use this technology on unencrypted services, never mind encrypted ones. In those cases, you can imagine there will be providers, particularly those established outside the United Kingdom, which may say, “Look, we are fine implementing this technology, but Ofcom please can you give us a notice? Then when someone challenges it in court, we can say that the UK regulator made us do it”. That would be helpful to them. This second group will want a notice and here we will get to the point of the notice being issued. They are not going to contest it; they want to have the notice because it gives them some kind of legal protection.

I think those two groups are relatively straightforward: we are dealing with companies which are being slow or are looking for legal cover but do not fundamentally object. The third scenario, though, is the most challenging and it is where I think the Government could get into real trouble. My amendments seek to help the Government in situations where a provider fundamentally objects to being ordered to deploy a particular technology because it believes that that technology will create real privacy threats and risks to the service that it offers. I do not think the provider is being awkward in these circumstances; it has genuine concerns about the implications of the technology being developed or which it is being instructed to deploy.

In these circumstances, Ofcom may have all the reasons in the world to argue why it thinks that what it is asking for is reasonable. However, the affected provider may not accept those reasons, taking quite a strong counterview and having all sorts of other arguments as to why what it is being asked to do is unacceptable and too high-risk. This debate is swirling around at the moment as we think about current models of end-to-end encryption and client-side scanning technology, but we need to recognise that this Bill is going to be around for a while and there may be all sorts of other technologies ordered to be deployed that we do not even know about and that have not even been developed yet. At any point, we may hit this impasse where Ofcom is saying it thinks it is perfectly reasonable to order a company to do it and the service provider is saying, “No, as we look at this, our experts and our lawyers are telling us that this is fundamentally problematic from a privacy point of view”.

--- Later in debate ---
I will touch on the question raised by my noble friend Lady Harding of Winscombe—
Lord Allan of Hallam (LD)

I appreciate the tone of the Minister’s comments very much, but they are not entirely reassuring me. There is a debate going on out there: there are people saying, “We’ve got these fabulous technologies that we would like Ofcom to order companies to install” and there are companies saying, “That would be disastrous and break encryption if we had to install them”. That is a dualistic situation where there is a contest going on. My amendment seeks to make sure the conflict can be properly resolved. I do not think Ofcom on its own can ever do that, because Ofcom will always be defending what it is doing and saying “This is fine”. So, there has to be some other mechanism whereby people can say it is not fine and contest that. As I say, in this debate we are ignoring the fact that they are already out there: people saying “We think you should deploy this” and companies saying “It would be disastrous if we did”. We cannot resolve that by just saying “Trust Ofcom”.

Lord Parkinson of Whitley Bay (Con)

To meet the expectation the noble Lord voiced earlier, I will indeed point out that Ofcom can consult the ICO as a skilled person if it wishes to. It is important that we square the circle and look at these issues. The ICO will be able to be involved in the way I have set out as a skilled person.

Before I conclude, I want to address my noble friend Lady Harding’s questions on skilled persons. Given that notices will be issued on a case-by-case basis, and Ofcom will need to look at the specific service design and existing systems of a provider to work out how a particular technology would interact with that design and those systems, a skilled person’s report better fits this process by requiring Ofcom to obtain tailored advice rather than general technical advice from an advisory board. The skilled person’s report will be largely focused on the technical side of Ofcom’s assessment: that is to say, how the technology would interact with the service’s design and existing systems. In this way, it offers something similar to but more tailored than a technical advisory board. Ofcom already has a large and expert technology group, whose role it is to advise policy teams on new and existing technologies, to anticipate the impact of technologies and so on. It already has strong links with academia and with external researchers. A technical advisory board would duplicate that function. I hope that reassures my noble friend that the points she raised have been taken into account.

So I hope the noble Lord, Lord Allan, will not feel the need to divide—

--- Later in debate ---
Moved by
258ZA: After Clause 114, insert the following new Clause—
“Review by the Information Commissioner of notices under Section 111(1)
(1) Where a provider believes that a notice it has been given under section 111(1) will have a material impact on the private communications of its users, it may request a review by the Information Commissioner.
(2) The review must consider the compatibility of the notice with—
(a) the Human Rights Act 1998,
(b) the Data Protection Act 2018,
(c) the Privacy and Electronic Communications (EC Directive) Regulations 2003, and
(d) any other legislation the Information Commissioner considers relevant.
(3) In carrying out the review, the Information Commissioner must consult—
(a) OFCOM,
(b) the provider,
(c) UK users of the provider’s service, and
(d) such other persons as the Information Commissioner considers appropriate.
(4) Following a review under subsection (1) the Information Commissioner must publish a report including—
(a) their determination of the compatibility of the notice with relevant legislation,
(b) their reasons for making such a determination, and
(c) their advice to OFCOM in respect of the drafting and implementation of the notice.”
Member’s explanatory statement
This amendment would give providers a right to request an assessment by the ICO of the compatibility of a section 111 order with UK privacy legislation.
Lord Allan of Hallam (LD)

I wish to test the opinion of the House.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

My Lords, in moving Amendment 262A, I will speak also to the other government amendments in the group. These amendments address the Bill’s enforcement powers. Government Amendments 262A, 262B, 262C, 264A and 266A, Amendments 265, 266 and 267, tabled by my noble friend Lord Bethell, and Amendment 268 tabled by the noble Lord, Lord Stevenson of Balmacara, relate to senior management liability. Amendment 268C from the noble Lord, Lord Weir of Ballyholme, addresses interim service restriction orders.

In Committee, we amended the Bill to create an offence of non-compliance with steps set out in confirmation decisions that relate to specific children’s online safety duties, to ensure that providers and individuals can be held to account where their non-compliance risks serious harm to children. Since then, we have listened to concerns raised by noble Lords and others, in particular that the confirmation decision offence would not tackle child sexual exploitation and abuse. That is why the government amendments in this group will create a new offence of a failure to comply with a child sexual exploitation and abuse requirement imposed by a confirmation decision. This will mean that providers and senior managers can be held liable if they fail to comply with requirements to take specific steps as set out in Ofcom’s confirmation decision in relation to child sexual exploitation and abuse on their service.

Ofcom must designate a step in a confirmation decision as a child sexual exploitation and abuse requirement, where that step relates, whether or not exclusively, to a failure to comply with specific safety duties in respect of child sexual exploitation and abuse content. Failure to comply with such a requirement will be an offence. This approach is necessary, given that steps may relate to multiple or specific kinds of illegal content, or systems and process failures more generally. This approach will ensure that services know from the confirmation decision when they risk criminal liability, while providing sufficient legal certainty via the specified steps to ensure that the offence can be prosecuted effectively.

The penalty for this offence is up to two years in prison, a fine or both. Through Clause 182, where an offence is committed with the consent or connivance of a senior manager, or is attributable to his or her neglect, the senior manager, as well as the entity, will have committed the offence and can face up to two years in prison, a fine or both.

I thank my noble friend Lord Bethell, as well as our honourable friends Miriam Cates and Sir William Cash in another place, for their important work in raising this issue and their collaborative approach as we have worked to strengthen the Bill in this area. I am glad that we have reached a position that will help to keep children safe online and drive a change in culture in technology companies. I hope this amendment reassures them and noble Lords that the confirmation decision offence will tackle harms to children effectively by ensuring that technology executives take the necessary steps to keep children safe online. I beg to move.

Lord Allan of Hallam (LD)

My Lords, I will briefly comment positively on the Minister’s explanation of how these offences might work, particularly the association of the liability with the failure to enforce a confirmation decision, which seems entirely sensible. In an earlier stage of the debate, there was a sense that we might associate liability with more general failures to enforce a duty of care. That would have been problematic, because the duty of care is very broad and requires a lot of pieces to be put in place. Associating the offences with the confirmation decision makes absolute sense. Having been in that position, if, as an executive in a tech company, I received a confirmation decision that said, “You must do these things”, and I chose wilfully to ignore that decision, it would be entirely reasonable for me to be held potentially criminally liable for that. That association is a good step forward.

Lord Weir of Ballyholme (DUP)

My Lords, I will speak to Amendment 268C, which is in my name and that of the noble Baroness, Lady Benjamin, who has been so proactive in this area. The amendment seeks to clarify the threshold for Ofcom to take immediate enforcement action when children are exposed to suicide, self-harm, eating disorders and pornographic materials. It would require the regulator to either take that action or at least provide an explanation to the Secretary of State within a reasonable timeframe as to why it has chosen not to.

When we pass the Bill, the public will judge it not simply on its contents but on its implementation, its enforcement and the speed of that enforcement. Regulatory regimes as a whole work only if the companies providing the material believe the regulator to be sufficiently muscular in its approach. Therefore, the test is not simply what is there but how long it will take for a notice, whenever it is issued, to lead to direct change.

I will give two scenarios to illustrate the point. Let us take the example of a video encouraging the so-called blackout challenge, or choking challenge, which went viral on social media about two years ago. For those who are unaware, it challenged children to choke themselves to the point at which they lost consciousness and to see how long they could do that. This resulted in the deaths of about 15 children. If a similar situation arises and a video is not removed because it is not against the terms and conditions of the service, does Ofcom allow the video to circulate for a period of, say, six months while giving a grace period for the platform to introduce age gating? What if the platform fails to implement that highly effective age verification? How long will it take to get through warnings, a provisional notice of contravention, a representation period, a confirmation decision and the implementation of required measures before the site is finally blocked? As I indicated, this is not hypothetical; it draws from a real-life example. We know that this is not simply a matter of direct harm to children; it can lead to a risk of death, and has done in the past.

What about, for example, a pornographic site that simply has a banner where a person can self-declare that they are over 18 in order to access it? I will not rehearse, since they have been gone through a number of times, the dangers for children of early exposure to violent pornography and the impact that will have on respectful relationships, as we know from government reports, and particularly the risk it creates of viewing women as sex objects. It risks additional sexual aggression towards women and perpetuates that aggression. Given that we are aware that large numbers of children have access to this material, surely it would be irresponsible to sacrifice another generation of children to a three-year implementation process.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

My Lords, I am grateful to the noble Baroness, Lady Newlove, and the noble Lord, Lord Clement-Jones, for adding their names to Amendment 270A, and to the NSPCC for its assistance in tabling this amendment and helping me to think about it.

The Online Safety Bill has the ambition, as we have heard many times, of making the UK the safest place for a child to be online. Yet, as drafted, it could pass into legislation without a system to ensure that children’s voices themselves can be heard. This is a huge gap. Children are experts in their own lives, with a first-hand understanding of the risks that they face online. It is by speaking to, and hearing from, children directly that we can best understand the harms they face online—what needs to change and how the regulation is working in practice.

User advocates are commonplace in most regulated environments and are proven to be effective. Leading children’s charities such as 5Rights, Barnardo’s and YoungMinds, as well as organisations set up by bereaved parents campaigning for child safety online, such as the Molly Rose Foundation and the Breck Foundation, have joined the NSPCC in calling for the introduction of this advocacy body for children, as set out in the amendment.

I do not wish to detain anyone. The Minister’s response when this was raised in Committee was, in essence, that this should go to the Children’s Commissioner for England. I am grateful to her for tracking me down in a Pret A Manger in Russell Square on Monday and having a chat. She reasonably pointed out that much of the amendment reads a bit like her job description, but she also could see that it is desirable to have an organisation such as the NSPCC set up a UK-wide helpline. There are children’s commissioners for Scotland, Wales and Northern Ireland who are supportive of a national advocacy body for children. She was suggesting—if the Minister agrees that this seems like a good solution—that they could commission a national helpline that works across the United Kingdom, and then advises a group that she could convene, including the children’s commissioners from the other nations of the United Kingdom. If that seems a good solution to the Minister, I do not need to press the amendment; we are all happy and we can get on with the next group. I beg to move.

Lord Allan of Hallam (LD)

My Lords, I just want to make some brief comments in support of the principle of what the noble Lord, Lord Knight, is aiming at in this amendment.

The Bill is going to have a profound impact on children in the United Kingdom. We hope that the most profound impact will be that it will significantly advance their interests in terms of safety online. But it will also potentially have a significant impact on what they can access online and the functionality of different services. They are going to experience new forms of age assurance, about which they may have very strong views. For example, the use of their biometric data to estimate their age will be there to protect them, but they may still have strong views about that.

I have said many times that there may be some measures in the Bill that will encourage services to become 18-plus only. That is not adult in the sense of adult content. Ordinary user-to-user social media services may look at the obligations and say, “Frankly, we would much rather restrict ourselves to users from the UK who identify as being 18-plus, rather than have to take on board all the associated liabilities in respect of children”—not because they are irresponsible, but precisely because they are responsible, and they can see that there is a lot of work to do in order to be legally and safely available to those under 18. For all those reasons, it is really important that the child advocacy body looks at things such as the United Nations Convention on the Rights of the Child and the rights of children to access information, and that it is able to take a view on them.

The reason I think that is important—as will any politician who has been out and spoken in schools—is that very often children are surprising in terms of what they see as their priorities. We make assumptions about their priorities, which can often be entirely wrong. There has been some really good work done on this. There was a project called EU Kids Online, back in the days of the EU, which used to look at children right across the European Union and ask them what their experience of being online was like and what was important to them. There are groups such as Childnet International, which for years has been convening groups of children and taking them to places such as the Internet Governance Forum. That always generates a lot of information that we here would not have thought of, about what children feel is really important to them about their online experience.

For all those reasons, it really would be helpful to institutionalise this in the new regime as some kind of body that looks in the round at children’s interests—their interest in staying safe, but also their interest in being able to access a wide variety of online services and to use the internet as they want to use it. I hope that that strengthens the case the noble Lord, Lord Knight, has made for such a body to exist in some kind of coalition-like format.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- View Speech - Hansard - - - Excerpts

My Lords, I am afraid that I have some reservations about this amendment. I was trying not to, but I have. As the noble Lord, Lord Allan of Hallam, explained, listening to young people is essential—not, in general, being dictated to by them, but understanding the particular ways that they live their lives; the lived experience, to use the jargon. Particularly in relation to a Bill that spends its whole time saying it is designed to protect young people from harm, it might be worth having a word with them and seeing what they say. I mean in an ongoing way—I am not being glib. That seems very sensible.

I suppose my concern is that this becomes a quango. We have to ask who is on it, whether it becomes just another NGO of some kind. I am always concerned about these kinds of organisations when they speak “on behalf of”. If you have an advocacy body for children that says, “We speak on behalf of children”, that makes me very anxious. You can see that that can be a politically very powerful role, because it seems to have the authority of representing the young, whereas actually it can be entirely fictitious and certainly not democratic or accountable.

The key thing we discussed in Committee, which the noble Lord, Lord Knight of Weymouth, is very keen on—and I am too—is that we do not inadvertently deny young people important access rights to the internet in our attempt to protect them. That is why some of these points are here. The noble Baroness, Lady Kidron, was very keen on that. She wants to protect them but does not want to end up with them being denied access to important parts of the internet. That is all good, but I just think this body is wrong.

The only other thing to draw noble Lords’ attention to—I am not trying to be controversial, but it is worth noting—is that child advocacy is currently in a very toxic state because of some of the issues around who represents children. As we speak, there is a debate about, for example, whether the NSPCC has been captured by Stonewall. I make no comment because I do not know; I am just noting it. We have had situations where a child advocacy group such as Mermaids is now discredited because it is seen to have been promoting chest binders for young people, to have gone down the gender ideology route, which some people would argue is child abuse of a sort, advocating that young women remove their breasts—have double mastectomies. This is all online, by the way.

I know that some people would say, “Oh, you’re always going on about that”, but I raise it because it is a very real and current discussion. I know a lot of people who work in education, with young people or in children’s rights organisations, and they keep telling me that they are tearing themselves apart. I just wondered whether the noble Lord, Lord Knight, might note that there is a danger of walking into a minefield here—which I know he does not mean to walk into—by setting up an organisation that could end up being the subject of major culture wars rows or, even worse, one of those dreaded quangos that pretends it is representing people but does not.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

Yes, even if the content is not harmful. We keep saying “content” because it is the way the content is disseminated, as the Bill sets out, but the features and functionalities can increase the risks of harm as well. We have addressed this through looking at the cumulative effects and in other ways.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

This is the key question. For example, let us take a feature that is pushing something at you constantly; if it was pushing poison at you then it would obviously be harmful, but if it was pushing marshmallows then each one would be individually harmless yet cumulatively harmful. Is the Minister saying that the second scenario is still a problem and that the surfeit of marshmallows is problematic and will still be captured, even if each individual marshmallow is not harmful?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

Yes, because the cumulative harm—the accumulation of marshmallows in that example—has been addressed.

Noble Lords should also be aware that the drafting of Amendment 281FA has the effect of saying that harm can arise from proposed new paragraphs (a), (b) and (c)—for example, from the

“age or characteristics of the likely user group”.

In effect, being a child or possessing a particular characteristic may be harmful. This may not be the intention of the noble Baronesses who tabled the amendment, but it highlights the important distinction between something being a risk factor that influences the risk of harm occurring and something being harmful.

The Government are clear that these aspects should properly be treated as risk factors. Other parts of the Bill already make it clear that the ways in which a service is designed and used may impact on the risk of harm suffered by users. I point again to paragraphs (f) to (h) of Clause 10(6); paragraph (f) talks about the level of risk of functionalities of the service, paragraph (g) talks about the different ways in which the service is used, and so on.

We have addressed these points in the Bill, though clearly not to the satisfaction of my noble friend, the noble Baroness, Lady Kidron, and others. As we conclude Report, I recognise that we have not yet convinced everyone that our approach achieves what we all seek, though I am grateful for my noble friend’s recognition that we all share the same aim in this endeavour. As I explained to the noble Baroness, Lady Kidron, on her Amendment 35, I was asking her not to press it because, if she did, the matter would have been dealt with on Report and we would not be able to return to it at Third Reading.

As the Bill heads towards another place with this philosophical disagreement still bubbling away, I am very happy to commit to continuing to talk to your Lordships—particularly when the Bill is in another place, so that noble Lords can follow the debates there. I am conscious that my right honourable friend Michelle Donelan, who has had a busy maternity leave and has spoken to a number of your Lordships while on leave, returns tomorrow in preparation for the Bill heading to her House. I am sure she will be very happy to speak even more when she is back fully at work, but we will both be happy to continue to do so.

I think it is appropriate, in some ways, that we end on this issue, which remains an area of difference. With that promise to continue these discussions as the Bill moves towards another place, I hope that my noble friend will be content not to press these amendments, recognising particularly that the noble Baroness, Lady Kidron, has already inserted this thinking into the Bill for consideration in the other House.

Online Safety Bill

Lord Allan of Hallam Excerpts
I do not think we have time to wait for the report that my noble friend seeks. This is the long-awaited Online Safety Bill. We have been warned by the inventors of neural networks and leaders in AI and alternate realities that we are at a crossroads between human and machine. It is incumbent on the Government to ensure that the Bill is fit not only for the past but for the future. In order to do that, they need to look at the definitions—as they did so admirably in Part 5—but also at some of the exceptions they have carved out so that they can say that the Bill truly ends the era of exceptionality in which harms online are treated differently from those offline. My view is that the amendment in the name of my noble friend Lady Finlay should not be necessary at this stage. But, if the Minister cannot confirm that it is already covered, perhaps he will indicate his willingness to accept the amendment.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I will make some arguments in favour of Amendment 191A, in the name of the noble Baroness, Lady Kidron, and inject some notes of caution around Amendment 186A.

On Amendment 191A, it has been my experience that people who frequently investigate things that have happened on online services do it well, and well-formed requests are critical to making this work effectively. This was the case with law enforcement: when an individual police officer is investigating something online for the first time, they often ask the wrong questions. They do not understand what they can get and what they cannot get. It is like everything in life: the more you do it, the better you get at it.

Fortunately, in a sense, most coroners will only very occasionally have to deal with these awful circumstances where they need data related to the death of a child. At that point, they are going to be very dependent on Ofcom—which will be dealing with the companies day in and day out across a range of issues—for its expertise. Therefore, it makes absolute sense that Ofcom’s expertise should be distributed widely and that coroners—at the point where they need to access this information—should be able to rely on that. So Amendment 191A is very well intended and, from a practical point of view, very necessary if we are going to make this new system work as I know the noble Baroness, Lady Kidron, and I would like to see it work.

On Amendment 186A around consumer law, I can see the attraction of this, as well as some of the read-across from the United States. A lot of the enforcement against online platforms in the US takes place through the Federal Trade Commission precisely in this area of consumer law and looking at unfair and deceptive practices. I can see the attraction of seeking to align with European Union law, as the noble Lord, Lord Moylan, argued we should be doing with respect to consumer law. However, I think this would be much better dealt with in the context of the digital markets Bill and it would be a mistake to squeeze it in here. My reasons for this are about both process and substance.

In terms of process, we have not done the impact assessment on this. It is quite a major change, for two reasons. First, it could potentially have a huge impact in terms of legal costs and the way businesses will have to deal with that—although I know nobody is going to get too upset if the impact assessment says there will be a significant increase in legal costs for category 1 companies. However, we should at least flesh these things out when we are making regulations and have them in an impact assessment before going ahead and doing something that would have a material impact.

Secondly, in process terms, there are some really interesting questions about the way this might affect the market. The consumer law we have does exclude services that are offered for free, because so much of consumer law is about saying, “If the goods are not delivered correctly, you get your money back”. With free services, we are clearly dealing with a different model, so the notion that we have a law that is geared towards making sure you either get the goods or you get the money may not be the best fit. To try to shoehorn in these free-at-the-point-of-use services may not be the best way to do it, even from a markets and consumer point of view. Taking our time to think about how to get this right would make sense.

More fundamentally, in terms of the substance, we need to recognise that, as a result of the Online Safety Bill, Ofcom will be requiring regulated services to rewrite their terms of service in quite a lot of detail. We see this throughout the Bill. We are going to have to do all sorts of things—we will debate other amendments in this area today—to make sure that their terms of service are conformant with what we want from them in this Bill. They are going to have to redo their complaints and redress mechanisms. All of this is going to have to change and Ofcom is going to be the regulator that tells them how to do it; that is what we are asking Ofcom to tell them to do.

My fundamental concern here, if we introduce another element, is that there is a whole different structure under consumer law where you might go to local trading standards or the CMA, or you might launch a private action. In many cases, this may overlap. The overlap is where consumer law states that goods must be provided with reasonable care and skill and in a reasonable time. That sounds great, but it is also what the Online Safety Bill is going to be doing. We do not want consumer law saying, “You need to write your terms of service this way and handle complaints this way”, and then Ofcom coming along and saying, “No, you must write your terms of service that way and handle complaints that way”. We will end up in a mess. So I just think that, from a practical point of view, we should be very focused in this Bill on getting all of this right from an Online Safety Bill point of view, and very cautious about introducing another element.

Perhaps one of the attractions of the consumer law point for those who support the amendment is that it says, “Your terms must be fair”. It is the US model; you cannot have unfair terms. Again, I can imagine a scenario in which somebody goes to court and tries to get the terms struck down because they are unfair but the platform says, “They’re the terms Ofcom told me to write. Sort this out, please, because Ofcom is saying I need to do this but the courts are now saying the thing I did was unfair because somebody feels that they were badly treated”.

Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

Does the noble Lord accept that that is already a possibility? You can bring an action in contract law against them on the grounds that it is an unfair contract. This could happen already. It is as if the noble Lord is not aware that the possibility of individual action for breach of contract is already built into Clause 65. This measure simply supplements it.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

I am certainly aware that it is there but, again, the noble Lord has just made the point himself: this supplements it. The intent of the amendment is to give consumers more rights under this additional piece of legislation; otherwise, why bother with the amendment at all? The noble Lord may be arguing against himself in saying that this is unnecessary and, at the same time, that we need to make the change. If we make the change, it is, in a sense, a material change to open the door to more claims being made under consumer law that terms are unfair. As I say, we may want this outcome to happen eventually, but I find it potentially conflicting to do it precisely at a time when we are getting Ofcom to intervene much more closely in setting those terms. I am simply arguing, “Let’s let that regime settle down”.

The net result and rational outcome—again, I am speaking to my noble friend’s Amendment 253 here—may be that other regulators end up deferring to Ofcom. If Ofcom is the primary regulator and we have told it, under the terms of the Online Safety Bill, “You must require platforms to operate in this way, handle complaints in this way and have terms that do these things, such as excluding particular forms of language and in effect outlawing them on platforms”, the other regulators will eventually end up deferring to it. All I am arguing is that, at this stage, it is premature to try to introduce a second, parallel route for people to seek changes to terms or different forms of redress, however tempting that may be. So I am suggesting a note of caution. It is not that we are starting from Ground Zero—people have routes to go forward today—but I worry about introducing something that I think people will see as material at this late stage, having not looked at the full impact of it and potentially running in conflict with everything else that we are trying to do in this legislation.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

My Lords, transparency and accountability are at the heart of the regulatory framework that the Bill seeks to establish. It is vital that Ofcom has the powers it needs to require companies to publish online safety information and to scrutinise their systems and processes, particularly their algorithms. The Government agree about the importance of improving data sharing with independent researchers while recognising the nascent evidence base and the complexities of this issue, which we explored in Committee. We are pleased to be bringing forward a number of amendments to strengthen platforms’ transparency, which confer on Ofcom new powers to assess how providers’ algorithms work, which accelerate the development of the evidence base regarding researchers’ access to information and which require Ofcom to produce guidance on this issue.

Amendment 187 in my name makes changes to Clause 65 on category 1 providers’ duties to create clear and accessible terms of service and apply them consistently and transparently. The amendment tightens the clause to ensure that all the terms through which a provider might indicate that a certain kind of content is not allowed on its service are captured by these duties.

Amendment 252G is a drafting change, removing a redundant paragraph from the Bill in relation to exceptions to the legislative definition of an enforceable requirement in Schedule 12.

In relation to transparency, government Amendments 195, 196, 198 and 199 expand the types of information that Ofcom can require category 1, 2A and 2B providers to publish in their transparency reports. With thanks to the noble Lord, Lord Stevenson of Balmacara, for his engagement on this issue, we are pleased to table these amendments, which will allow Ofcom to require providers to publish information relating to the formulation, development and scope of user-to-user service providers’ terms of service and search service providers’ public statements of policies and procedures. This is in addition to the existing transparency provision regarding their application.

Amendments 196 and 199 would enable Ofcom to require providers to publish more information in relation to algorithms, specifically information about the design and operation of algorithms that affect the display, promotion, restriction, discovery or recommendation of content subject to the duties in the Bill. These changes will enable greater public scrutiny of providers’ terms of service and their algorithms, providing valuable information to users about the platforms that they are using.

As well as publicly holding platforms to account, the regulator must be able to get under the bonnet and scrutinise the algorithms’ functionalities and the other systems and processes that they use. Empirical tests are a standard method for understanding the performance of an algorithmic system. They involve taking a test data set, running it through an algorithmic system and observing the output. These tests may be relevant for assessing the efficacy and wider impacts of content moderation technology, age-verification systems and recommender systems.

Government Amendments 247A, 250A, 252A, 252B, 252C, 252D, 252E and 252F will ensure that Ofcom has the powers to enable it to direct and observe such tests remotely. This will significantly bolster Ofcom’s ability to assess how a provider’s algorithms work, and therefore to assess its compliance with the duties in the Bill. I understand that certain technology companies have voiced some concerns about these powers, but I reassure your Lordships that they are necessary and proportionate.

The powers will be subject to a number of safeguards. First, they are limited to viewing information. Ofcom will be unable to remotely access or interfere with the service for any other purpose when exercising the power. These tests would be performed offline, meaning that they would not affect the services’ provision or the experience of users. Assessing systems, processes, features and functionalities is the focus of the powers. As such, individual user data and content are unlikely to be the focus of any remote access to view information.

Additionally, the power can be used only where it is proportionate to do so in the exercise of Ofcom’s functions—for example, when investigating whether a regulated service has complied with relevant safety duties. A provider would have a right to bring a legal challenge against Ofcom if it considered that a particular exercise of the power was unlawful. Furthermore, Ofcom will be under a legal obligation to ensure that the information gathered from services is protected from disclosure, unless clearly defined exemptions apply.

The Bill contains no restriction on services making the existence and detail of the information notice public. Should a regulated service wish to challenge an information notice served to it by Ofcom, it would be able to do so through judicial review. In addition, the amendments create no restrictions on the use of this power being made viewable by members of the public through requests, such as those under the Freedom of Information Act—noting that, under Section 393 of the Communications Act, Ofcom will not be able to disclose information it has obtained through its exercise of these powers without the provider’s consent, unless permitted for specific, defined purposes. These powers are necessary and proportionate and will ensure that Ofcom has the tools to understand features and functionalities and the risks associated with them, and therefore the tools to assess companies’ compliance with the Bill.

Finally, I turn to researchers’ access to data. We recognise the valuable work of researchers in improving our collective understanding of the issues we have debated throughout our scrutiny of the Bill. However, we are also aware that we need to develop the evidence base to ensure that any sharing of sensitive information between companies and researchers can be done safely and securely. To this end, we are pleased to table government Amendments 272B, 272C and 272D.

Government Amendment 272B would require Ofcom to publish its report into researcher access to information within 18 months, rather than two years. This report will provide the evidence base for government Amendments 272C and 272D, which would require Ofcom to publish guidance on this issue. This will provide valuable, evidence-based guidance on how to improve access for researchers safely and securely.

That said, we understand the calls for further action in this area. The Government will explore this issue further and report back to your Lordships’ House on whether further measures to support researchers’ access to data are required—and if so, whether they could be implemented through other legislation, such as the Data Protection and Digital Information Bill. I beg to move.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, Amendment 247B in my name was triggered by government Amendment 247A, which the Minister just introduced. I want to explain it, because the government amendment is quite late—it has arrived on Report—so we need to look in some detail at what the Government have proposed. The phrasing that has caused so much concern, which the Minister has acknowledged, is that Ofcom will be able to

“remotely access the service provided by the person”.

It is those words—“remotely access”—which are trigger words for anyone who lived through the Snowden disclosures, where everyone was so concerned about remote access by government agencies to precisely the same services we are talking about today: social media services.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I am grateful to noble Lords for their contributions in this group. On the point made by the noble Lord, Lord Knight of Weymouth, on why we are bringing in some of these powers now, I say that the power to direct and observe algorithms was previously implicit within Ofcom’s information powers and, where a provider has UK premises, under powers of entry, inspection and audit under Schedule 12. However, the Digital Markets, Competition and Consumers Bill, which is set to confer similar powers on the Competition and Markets Authority and its digital markets unit, makes these powers explicit. We wanted to ensure that there was no ambiguity over whether Ofcom had equivalent powers in the light of that. Furthermore, the changes we are making ensure that Ofcom can direct and observe algorithmic assessments even if a provider does not have relevant premises or equipment in the UK.

I am grateful to the noble Lord, Lord Allan of Hallam, for inviting me to re-emphasise points and allay the concerns that have been triggered, as his noble friend Lord Clement-Jones put it. I am happy to set out again a bit of what I said in opening this debate. The powers will be subject to a number of safeguards. First, they are limited to “viewing information”. They can be used only where they are proportionate in the exercise of Ofcom’s functions, and a provider would have the right to bring a legal challenge against Ofcom if it considered that a particular exercise of the power was done unlawfully. Furthermore, Ofcom will be under a legal obligation to ensure that the information gathered from services is protected from disclosure, unless clearly defined exemptions apply.

These are not secret powers, as the noble Lord rightly noted. The Bill contains no restriction on services making the existence and detail of the information notice public. If a regulated service wished to challenge an information notice served to it by Ofcom, it would be able to do so through judicial review. I also mentioned the recourse that people have through existing legislation, such as the Freedom of Information Act, to give them safeguards, noting that, under Section 393 of the Communications Act, Ofcom will not be able to disclose information that it has obtained through its exercise of these powers without the provider’s consent unless that is permitted for specific, defined purposes.

The noble Lord’s Amendment 247B seeks to place further safeguards on Ofcom’s use of its new power to access providers’ systems remotely to observe tests. While I largely agree with the intention behind it, there are already a number of safeguards in place for the use of that power, including in relation to data protection, legally privileged material and the disclosure of information, as I have outlined. Ofcom will not be able to gain remote access simply for exploratory or fishing purposes, and indeed Ofcom expects to have conversations with services about how to provide the information requested.

Furthermore, before exercising the power, Ofcom will be required to issue an information notice specifying the information to be provided, setting out the parameters of access and why Ofcom requires the information, among other things. Following the receipt of an information notice, a notice requiring an inspection or an audit notice, if a company has identified that there is an obvious security risk in Ofcom exercising the power as set out in the notice, it may not be proportionate to do so. As set out in Ofcom’s duties, Ofcom must have regard to the principles under which regulatory activities should be proportionate and targeted only at cases where action is needed.

In line with current practice, we anticipate Ofcom will issue information notice requests in draft form to identify and address any issues, including in relation to security, before the information notice is issued formally. Ofcom will have a legal duty to exercise its remote access powers in a way that is proportionate, ensuring that undue burdens are not placed on businesses. In assessing proportionality in line with this requirement, Ofcom would need to consider the size and resource capacity of a service when choosing the most appropriate way of gathering information, and whether there was a less onerous method of obtaining the necessary information to ensure that the use of this power is proportionate. As I said, the remote access power is limited to “viewing information”. Under this power, Ofcom will be unable to interfere with or access the service for any other purpose.

In practice, Ofcom will work with services during the process. It is required to specify, among other things, the information to be provided, which will set the parameters of its access, and why it requires the information, which will explain the link between the information it seeks and the online safety function that it is exercising or deciding whether to exercise.

As noble Lords know, Ofcom must comply with the UK’s data protection law. As we have discussed in relation to other issues, it is required to act compatibly with the European Convention on Human Rights, including Article 8 privacy rights. In addition, under Clause 91(7), Ofcom is explicitly prohibited from requiring the provision of legally privileged information. It will also be under a legal obligation to ensure that the information gathered from services is protected from disclosure unless clearly defined exemptions apply, such as those under Section 393(2) of the Communications Act 2003—for example, the carrying out of any of Ofcom’s functions. I hope that provides reassurance to the noble Lord, Lord Allan, and the noble Baroness, Lady Fox, who raised these questions.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

I am grateful to the Minister. That was helpful, particularly the description of the process and the fact that drafts have to be issued early on. However, it still leaves open a couple of questions, one of which was very helpfully raised by the noble Lord, Lord Knight. We have in Schedule 12 this other set of protections that could be applied. There is a genuine question as to why this has been put in this place and not there.

The second question is to dig a little more into the question of what happens when there is a dispute. The noble Lord, Lord Moylan, pointed out that if you have created a backdoor then you have created a backdoor, and it is dangerous. If we end up in a situation where a company believes that what it is being asked to do by Ofcom is fundamentally problematic and would create a security risk, it will not be good enough to open up the backdoor and then have a judicial review. It needs to be able to say no at that stage, yet the Bill says that it could be committing a serious criminal offence by failing to comply with an information notice. We want some more assurances, in some form, about what would happen in a scenario where a company genuinely and sincerely believes that what Ofcom is asking for is inappropriate and/or dangerous and it wants not to have to offer it unless and until its challenge has been looked at, rather than having to offer it and then later judicially review a decision. The damage would already have been done by opening up an inappropriate backdoor.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

A provider would have a right to bring a legal challenge against Ofcom if it considered that a particular exercise of the remote access power was unlawful. Given the serious nature of the issues under consideration, I am sure that would be looked at swiftly, but I will write to the noble Lord on the anticipated timelines while such a judicial review was pending.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

We do not think that six weeks is enough time for the evidence base to develop sufficiently, our assessment being that to endow the Secretary of State with that power at this point is premature.

Amendment 262AA would require Ofcom to consider whether it is appropriate to require providers to take steps to comply with Ofcom’s researcher access guidance when including a requirement to take steps in a confirmation decision. This would be inappropriate because the researcher access provisions are not enforceable requirements; as such, compliance with them should not be subject to enforcement by the regulator. Furthermore, enforcement action may relate to a wide variety of very important issues, and the steps needed should be sufficient to address a failure to comply with an enforceable requirement. Singling out compliance with researcher access guidance alone risks implying that this will be adequate to address core failures.

Amendment 272AB would require Ofcom to give consideration to whether greater access to data could be achieved through legal requirements or incentives for regulated services. I reassure noble Lords that the scope of Ofcom’s report will already cover how greater access to data could be achieved, including through enforceable requirements on providers.

Amendment 272E would require Ofcom to take a provider’s compliance with Ofcom’s guidance on researcher access to data into account when assessing risks from regulated services and determining whether to take enforcement action and what enforcement action to take. However, we do not believe that this is a relevant factor for consideration of these issues. I hope noble Lords will agree that whether or not a company has enabled researcher access to its data should not be a mitigating factor against Ofcom requiring companies to deal with terrorism or child sexual exploitation or abuse content, for example.

On my noble friend Lord Bethell’s remaining Amendments 272BA, 273A and 273B, the first of these would require Ofcom to publish its report on researchers’ access to information within six months. While six months would not be deliverable given other priorities and the complexity of this issue, the government amendment to which I have spoken would reduce the timelines from two years to 18 months. That recognises the importance of the issue while ensuring that Ofcom can deliver the key priorities in establishing the core parts of the regulatory framework; for example, the illegal content and child safety duties.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

Just on the timescale, one of the issues that we talked about in Committee was the fact that there needs to be some kind of mechanism created, with a code of practice with reference to data protection law and an approving body to approve researchers as suitable to take information; the noble Baroness, Lady Kidron, referred to the DSA process, which the European Union has been working on. I hope the Minister can confirm that Ofcom might get moving on establishing that. It is not dependent on there being a report in 18 months; in fact, you need to have it in place when you report in 18 months, which means you need to start building it now. I hope the Minister would want Ofcom, within its existing framework, to be encouraging the creation of that researcher approval body and code of practice, not waiting to start that process in 18 months’ time.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I will continue my train of thought on my noble friend’s amendments, which I hope will cover that and more.

My noble friend’s Amendment 273A would allow Ofcom to appoint approved independent researchers to access information. Again, given the nascent evidence base here, it is important to focus on understanding these issues before we commit to a researcher access framework.

Under the skilled persons provisions, Ofcom will already have the powers to appoint a skilled person to assess compliance with the regulatory framework; that includes the ability to leverage the expertise of independent researchers. My noble friend’s Amendment 273B would require Ofcom to produce a code of practice on access to data by researchers. The government amendments I spoke to earlier will require Ofcom to produce guidance on that issue, which will help to promote information sharing in a safe and secure way.

To the question asked by the noble Lord, Lord Allan: yes, Ofcom can start the process and do it quickly. The question here is really about the timeframe in which it does so. As I said in opening, we understand the calls for further action in this area.

I am happy to say to my noble friend Lord Bethell, to whom we are grateful for his work on this and the conversations we have had, that we will explore the issue further and report back on whether further measures to support researchers’ access to data are required and, if so, whether they can be implemented through other legislation, such as the Data Protection and Digital Information (No.2) Bill.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, Clause 158 is one of the more mysterious clauses in the Bill and it would greatly benefit from a clear elucidation by the Minister of how it is intended to work to reduce harm. I thank him for having sent me an email this afternoon as we started on the Bill, for which I am grateful; I had only a short time to consider it but I very much hope that he will put its content on the record.

My amendment is designed to ask how the Minister envisages using the power to direct if, say, there is a new contagious disease or riots, and social media is a major factor in the spread of the problem. I am trying to erect some kind of hypothetical situation through which the Minister can say how the power will be used. Is the intention, for example, to set Ofcom the objective of preventing the spread of information on regulated services injurious to public health or safety on a particular network for six months? The direction then forces the regulator and the social media companies to confront the issue and perhaps publicly shame an individual company into using their tools to slow the spread of disinformation. The direction might give Ofcom powers to gather sufficient information from the company to make directions to the company to tackle the problem.

If that is envisaged, which of Ofcom’s media literacy powers does the Minister envisage being used? Might it be Section 11(1)(e) of the Communications Act 2003, which talks about encouraging

“the development and use of technologies and systems for regulating access to such material, and for facilitating control over what material is received, that are both effective and easy to use”.

By this means, Ofcom might encourage a social media company to regulate access to and control over the material that is a threat.

Perhaps the Minister could set out clearly how he intends all this to work, because on a straight reading of Clause 158, we on these Benches have considerable concerns. The threshold for direction is low—merely having

“reasonable grounds for believing that circumstances exist”—

and there is no sense here of the emergency that the then Minister, Mr Philp, cited in the Commons Public Bill Committee on 26 May 2022, nor even of the exceptional circumstances in Amendment 138 to Clause 39, which the Minister tabled recently. The Minister is not compelled by the clause to consult experts in public health, safety or national security. The Minister can set any objectives for Ofcom, it seems. There is no time limit for the effect of the direction and it seems that the direction can be repeatedly extended with no limit. If the Minister directs because they believe there is a threat to national security, we will have the curious situation of a public process being initiated for reasons the Minister is not obliged to explain.

Against this background, there does not seem to be a case for breaching the international convention of the Government not directing a media regulator. Independence of media regulators is the norm in developed democracies, and the UK has signed many international statements in this vein. As recently as April 2022, the Council of Europe stated:

“Media and communication governance should be independent and impartial to avoid undue influence on policymaking or”


the discriminatory and

“preferential treatment of powerful groups”,

including those with significant political or economic power. The Secretary of State, by contrast, has no powers over Ofcom regarding the content of broadcast regulation and has limited powers to direct over radio spectrum and wireless, but not content. Ofcom’s independence in day-to-day decision-making is paramount to preserving freedom of expression. There are insufficient safeguards in this clause, which is why I argue that it should not stand part of the Bill.

I will be brief about Clause 159 because, by and large, we went through it in our debate on a previous group. Now that we can see the final shape of the Bill, it really does behove us to stand back and see where the balance has settled on Ofcom’s independence and whether this clause needs to stand part of the Bill. The Secretary of State has extensive powers under various other provisions in the Bill. The Minister has tabled welcome amendments to Clause 39, which have been incorporated into the Bill, but Clause 155 still allows the Secretary of State to issue a “statement of strategic priorities”, including specific outcomes, every five years.

Clause 159 is in addition to this comprehensive list, but the approach in the clause is incredibly broad. We have discussed this, and the noble Lord, Lord Moylan, has tabled an amendment that would require parliamentary scrutiny. The Secretary of State can issue guidance to Ofcom on more or less anything encompassed by the exercise of its functions under this Act, with no consultation of the public or Parliament prior to making such guidance. The time limit for producing strategic guidance is three years rather than five. Even if it is merely “have regard” guidance, it represents an unwelcome intervention in Ofcom going about its business. If the Minister responds that the guidance is merely “to have regard”, I will ask him to consider this: why have it all, then, when there are so many other opportunities for the Government to intervene? For the regulated companies, it represents a regulatory hazard of interference in independent regulation and a lack of stability. As the noble Lord, Lord Bethell, said in Committee, a clear benefit of regulatory independence is that it reduces lobbying of the Minister by powerful corporate interests.

Now that we can see it in context, I very much hope that the Minister will agree that Clause 159 is a set of guidance too many that compromises Ofcom’s independence and should not stand part of the Bill.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I will add to my noble friend’s call for us to consider whether Clause 158 should be struck from the Bill as an unnecessary power for the Secretary of State to take. We have discussed powers for the Secretary of State throughout the Bill, with some helpful improvements led by the noble Baroness, Lady Stowell. This one jars in particular because it is about media literacy; some of the other powers related to whether the Secretary of State could intervene on the codes of practice that Ofcom would issue. The core question is whether we trust Ofcom’s discretion in delivering media literacy and whether we need the Secretary of State to have any kind of power to intervene.

I single out media literacy because the clue is in the name: literacy is a generic skill that you acquire about dealing with the online world; it is not about any specific text. Literacy is a broader set of skills, yet Clause 158 suggests that, in response to specific forms of content or a specific crisis happening in the world, the Secretary of State would want to take this power to direct the media literacy efforts. To take something specific and immediate to direct something that is generic and long-term jars and seems inappropriate.

I have a series of questions for the Minister to elucidate why this power should exist at all. It would be helpful to have an example of what kind of “public statement notice”—to use the language in the clause—the Government might want to issue that Ofcom would not come up with on its own. Part of the argument we have been presented with is that, somehow, the Government might have additional information, but it seems quite a stretch that they could come up with that. In an area such as national security, my experience has been that companies often have a better idea of what is going on than anybody in government.

Thousands of people out there in the industry are familiar with APT 28 and APT 29, which, as I am sure all noble Lords know, are better known by their names Fancy Bear and Cozy Bear. These are agents of the Russian state that put out misinformation. There is nothing that UK agencies or the Secretary of State might know about them that is not already widely known. I remember talking about the famous troll factory run by Prigozhin, the Internet Research Agency, with people in government in the context of Russian interference—they would say “Who?” and have to go off and find out. In dealing with threats such as that, between the people in the companies and Ofcom you certainly want a media literacy campaign which tells you about these troll agencies and how they operate and gives warnings to the public, but I struggle to see why you need the Secretary of State to intervene as opposed to allowing Ofcom’s experts to work with company experts and come up with a strategy to deal with those kinds of threat.

The other example cited of an area where the Secretary of State might want to intervene is public health and safety. It would be helpful to be specific; had they had it, how would the Government have used this power during the pandemic in 2020 and 2021? Does the Minister have examples of what they were frustrated about and would have done with these powers that Ofcom would not do anyway in working with the companies directly? I do not see that they would have had secret information which would have meant that they had to intervene rather than trusting Ofcom and the companies to do it.

Perhaps there has been an interdepartmental workshop between DHSC, DCMS and others to cook up this provision. I assume that Clause 158 did not come from nowhere. Someone must have thought, “We need these powers in Clause 158 because we were missing them previously”. Are there specific examples of media literacy campaigns that could not be run, where people in government were frustrated and therefore wanted a power to offer it in future? It would be really helpful to hear about them so that we can understand exactly how the Clause 158 powers will be used before we allow this additional power on to the statute book.

In the view of most people in this Chamber, the Bill as a whole quite rightly grants the Government and Ofcom, the independent regulator, a wide range of powers. Here we are looking specifically at where the Government will, in a sense, overrule the independent regulator by giving it orders to do something it had not thought of doing itself. It is incumbent on the Government to flesh that out with some concrete examples so that we can understand why they need this power. At the moment, as noble Lords may be able to tell, these Benches are not convinced that they do.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- View Speech - Hansard - - - Excerpts

My Lords, I will be very brief. The danger with Clause 158 is that it discredits media literacy as something benign or anodyne; it will become a political plaything. I am already sceptical, but if ever there was anything to add to this debate then it is that.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I am grateful for the opportunity to set out the need for Clauses 158 and 159. The amendments in this group consider the role of government in two specific areas: the power for the Secretary of State to direct Ofcom about its media literacy functions in special circumstances and the power for the Secretary of State to issue non-binding guidance to Ofcom. I will take each in turn.

Amendment 219 relates to Clause 158, on the Secretary of State’s power to direct Ofcom in special circumstances. These include where there is a significant threat to public safety, public health or national security. This is a limited power to enable the Secretary of State to set specific objectives for Ofcom’s media literacy activity in such circumstances. It allows the Secretary of State to direct Ofcom to issue public statement notices to regulated service providers, requiring providers to set out the steps they are taking to address the threat. The regulator and online platforms are thereby compelled to take essential and transparent actions to keep the public sufficiently informed during crises. The powers ensure that the regulatory framework is future-proofed and well equipped to respond in such circumstances.

As the noble Lord, Lord Clement-Jones, outlined, I corresponded with him very shortly before today’s debate and am happy to set out a bit more detail for the benefit of the rest of the House. As I said to him by email, we expect the media literacy powers to be used only in exceptional circumstances, where it is right that the Secretary of State should have the power to direct Ofcom. The Government see the need for an agile response to risk in times of acute crisis, such as we saw during the Covid-19 pandemic or in relation to the war in Ukraine. There may be a situation in which the Government have access to information, through the work of the security services or otherwise, which Ofcom does not. This power enables the Secretary of State to make quick decisions when the public are at risk.

Our expectation is that, in exceptional circumstances, Ofcom would already be taking steps to address harm arising from the provision of regulated services through its existing media literacy functions. However, these powers will allow the Secretary of State to step in if necessary to ensure that the regulator is responding effectively to these sudden threats. It is important to note that, for transparency, the Secretary of State will be required to publish the reasons for issuing a direction to Ofcom in these circumstances. This requirement does not apply should the circumstances relate to national security, to protect sensitive information.

The noble Lord asked why we have the powers under Clause 158 when they do not exist in relation to broadcast media. We believe that these powers are needed with respect to social media because, as we have seen during international crises such as the Covid-19 pandemic, social media platforms can sadly serve as hubs for low-quality, user-generated information that is not required to meet journalistic standards, and that can pose a direct threat to public health. By contrast, Ofcom’s Broadcasting Code ensures that broadcast news, in whatever form, is reported with due accuracy and presented with due impartiality. Ofcom can fine, or ultimately revoke a licence to broadcast in the most extreme cases, if that code is breached. This means that regulated broadcasters can be trusted to strive to communicate credible, authoritative information to their audiences in a way that social media cannot.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

We established in our last debate that the notion of a recognised news publisher will go much broader than a broadcaster. I put it to the Minister that we could end up in an interesting situation where one bit of the Bill says, “You have to protect content from these people because they are recognised news publishers”. Another bit, however, will be a direction from the Secretary of State saying that, to deal with this crisis, we are going to give a media literacy direction that says, “Please get rid of all the content from this same news publisher”. That is an anomaly that we risk setting up with these different provisions.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

On the previous group, I raised the issue of legal speech that was labelled as misinformation and removed in the extreme situation of a public health panic. This was seemingly because the Government were keen that particular public health information was made available. Subsequently, we discovered that those things were not necessarily untrue and should not have been removed. Is the Minister arguing that this power is necessary for the Government to direct that certain things are removed on the basis that they are misinformation—in which case, that is a direct attempt at censorship? After we have had a public health emergency in which “facts” have been contested and shown to not be as black and white or true as the Government claimed, saying that the power will be used only in extreme circumstances does not fill me with great confidence.

--- Later in debate ---
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, the noble Lord, Lord Moylan, and the noble Baroness, Lady Fox, have a very strong point to make with this amendment. I have tried in our discussions to bring some colour to the debate from my own experience so I will tell your Lordships that in my former professional life I received representations from many Ministers in many countries about the content we should allow or disallow on the Facebook platform that I worked for.

That was a frequent occurrence in the United Kingdom and extended to Governments of all parties. Almost as soon as I moved into the job, we had a Labour Home Secretary come in and suggest that we should deal with particular forms of content. It happened through the coalition years. Indeed, I remember meeting the Minister’s former boss at No. 10 in Davos, of all places, to receive some lobbying about what the UK Government thought should be on or off the platform at that time. In that case it was to do with terrorist content; there was nothing between us in terms of wanting to see that content gone. I recognise that this amendment is about misinformation and disinformation, which is perhaps a more contentious area.

As we have discussed throughout the debate, transparency is good. It keeps everybody on the straight and narrow. I do not see any reason why the Government should not be forthcoming. My experience was that the Government would often want to go to the Daily Telegraph, the Daily Mail or some other upright publication and tell it how they had been leaning on the internet companies—it was part of their communications strategy and they were extremely proud of it—but there will be other circumstances where they are doing it more behind the scenes. Those are the ones we should be worried about.

If those in government have good reason to lean on an internet company, fine—but knowing that they have to be transparent about it, as in this amendment, will instil a certain level of discipline that would be quite healthy.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, clearly, there is a limited number of speakers in this debate. We should thank the noble Lord, Lord Moylan, for tabling this amendment because it raises a very interesting point about the transparency—or not—of the Counter Disinformation Unit. Of course, it is subject to an Oral Question tomorrow as well, which I am sure the noble Viscount will be answering.

There is some concern about the transparency of the activities of the Counter Disinformation Unit. In its report, Ministry of Truth, which deals at some length with the activities of the Counter Disinformation Unit, Big Brother Watch says:

“Giving officials an unaccountable hotline to flag lawful speech for removal from the digital public square is a worrying threat to free speech”.


Its complaint is not only about the lack of oversight; it is about the unit’s activities themselves. Others such as Full Fact have stressed that there is little or no parliamentary scrutiny. For instance, freedom of information requests have been turned down, and Written Questions which try to probe what the activities of the Counter Disinformation Unit are have had very little response. As it says, when the Government

“lobby internet companies about content on their platforms … this is a threat to freedom of expression”.

We need proper oversight, so I am interested to hear the Minister’s response.

--- Later in debate ---
Moved by
228: Clause 173, page 151, leave out lines 1 and 2
Member’s explanatory statement
This amendment removes a requirement on providers which could encourage excessive content removal in borderline cases of illegality.
--- Later in debate ---
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

My Lords, we are coming to some critical amendments on a very important issue relatively late in the Bill, having had relatively little discussion on it. It is not often that committees of this House sit around and say, “We need more lawyers”, but this is one of those areas where that was true.

Notwithstanding the blushes of my noble friend on the Front Bench here, interestingly we have not had in our debate significant input from people who understand the law of freedom of expression and wish to contribute to our discussions on how online platforms should deal with questions of the legality of content. These questions are crucial to the Bill, which, if it does nothing else, tells online platforms that they have to be really robust in taking action against content that is deemed to be illegal under a broad swathe of law in the United Kingdom that criminalises certain forms of speech.

We are bearing down heavily on providers, and we are saying to them, “If you fail at this, you’re in big trouble”. The pressure to deal with illegal content will be huge, yet illegality itself covers a broad spectrum. At one end is child sexual exploitation and abuse material, where in many cases it is obvious from the material that it is illegal and there is strict liability—there is never any excuse for distributing that material—and pretty much everyone everywhere in the world would agree that it should be criminalised and removed from the internet. At the other end are things that we discussed in Committee, such as public order offences, where, under some interpretations of Section 5 of the Public Order Act, swearing at somebody or looking at them in a funny way in the street could be deemed alarming and harassing. There are people who interpret public order offences in this very broad sense, where there would be a lot less agreement about whether a specific action is or is not illegal and whether the law is correctly calibrated or being used oppressively. So we have this broad spectrum of illegality.

The question we need to consider is where we want providers to draw the line. They will be making judgments on a daily basis. I said previously that I had to make those judgments in my job. I would write to lawyers and they would send back an expensive piece of paper that said, “This is likely to be illegal”, or, “This is likely not to be illegal”. It never said that it was definitely illegal or definitely not illegal, apart from the content I have described, such as child sexual abuse. You would not need to send that, but you would send the bulk of the issues that we are dealing with to a lawyer. If you sent it to a second lawyer, you would get another “likely” or “not likely”, and you would have to come to some kind of consensus view as to the level of risk you wished to take on that particular form of speech or piece of content.

This is really challenging in areas such as hate speech, where exactly the same language has a completely different meaning in different contexts, and may or may not be illegal. Again, to give a concrete example, we would often deal with anti-Semitic content being shared by anti-anti-Semitic groups—people trying to raise awareness of anti-Semitic speech. Our reviewers would quite commonly remove the speech: they would see it and it would look like grossly violating anti-Semitic speech. Only later would they realise that the person was sharing it for awareness. The N-word is a gross term of racial abuse, but if you are an online platform you permit it a lot of the time, because if people use it self-referentially they expect to be able to use it. If you start removing it they would naturally get very upset. People expect to use it if it is in song lyrics and they are sharing music. I could give thousands of examples of speech that may or may not be illegal depending entirely on the context in which it is being used.

We will be asking platforms to make those judgments on our behalf. They will have to take it seriously, because if they let something through that is illegal they will be in serious trouble. If they misjudged it and thought the anti-Semitic hate speech was being circulated by Jewish groups to promote awareness but it turned out it was being circulated by a Nazi group to attack people and that fell foul of UK law, they would be in trouble. These judgments are critical.

We have the test in Clause 173, which says that platforms should decide whether they have “reasonable grounds to infer” that something is illegal. In Committee, we debated changing that to a higher bar, and said that we wanted a stronger evidential basis. That did not find favour with the Government. We hoped they might raise the bar themselves unilaterally, but they have not. However, we come back again in a different way to try to be helpful, because I do not think that the Government want excessive censorship. They have said throughout the Bill’s passage that they are not looking for platforms to be overly censorious. We looked at the wording again and thought about how we could ensure that the bar does not operate in a way that I do not think the Government intend. We certainly would not want that to happen.

We look at the current wording in Clause 173 and see that the test there has two elements. One is: “Do you have reasonable grounds to infer?” and then a clause in brackets after that says, “If you do have reasonable grounds to infer, you must treat the content as illegal”. In this amendment we seek to remove the second part of that phrasing because it seems problematic. If we say to the platform, “Reasonable grounds to infer, not certainty”—and it is weird to put “inference”, which is by definition mushy, with “must”, which is very certain, into the same clause—we are saying, “If you have this mushy inference, you must treat it as illegal”, which seems quite problematic. Certainly, if I were working at a platform, the way I would interpret that is: “If in doubt, take it out”. That is the only way you can interpret that “must”, and that is really problematic. Again, I know that that is not the Government’s intention, and if it were child sexual exploitation material, of course you “must”. However, if it is the kind of abusive content that you have reasonable grounds to infer may be an offence under the Public Order Act, “must” you always treat that as illegal? As I read the rest of the Bill, if you are treating it as illegal, the sense is that you should remove it.

That is what we are trying to get at. There is a clear understanding from the Government that their intention is “must” when it comes to that hard end of very bad, very clearly bad content. However, we need something else—a different kind of behaviour where we are dealing with content where it is much more marginal. Otherwise, the price we will pay will be in freedom of expression.

People in the United Kingdom post quite robust, sweary language online. I sometimes think that some of the rules we apply penalise the vernacular. People who use sweary, robust language may be doing so entirely legally—the United Kingdom does not generally restrict people from using that kind of language. However, we risk heading towards a scenario where people post such content in future and find that the platform takes it down. They will complain to the platform, saying, “Why the hell did you take my content down?”—in fact, they will probably use stronger words than that to register their complaint. When they do, the platform will say, “We had reasonable grounds to infer that that was in breach of the Public Order Act, for example, because somebody might feel alarmed, harassed or distressed by it. Oh, and look—in this clause, it says we ‘must’ treat it as illegal. Sorry—there is nothing else we can do. We would have loved to have been able to exercise the benefit of the doubt and to allow you to carry on using that kind of language, because we think there is some margin where you have not behaved in an illegal way. But unfortunately, because of the way that Clause 173 has been drafted, our lawyers tell us we cannot afford to take the risk”.

In the amendment we are trying to—I think—help the Government to get out of a situation which, as I say, I do not think they want. However, I fear that the totality of the wording of Clause 173, this low bar for the test and the “must treat as” language, will lead to that outcome where platforms will take the attitude: “Safety first; if in doubt, take it out”, and I do not think that that is the regime we want. I beg to move.

Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - - - Excerpts

My Lords, I regret I was unable to be present in Committee to deliver my speech about the chilling effect that the present definition of illegality in the Bill will have on free speech on the internet.

I am still concerned about Clause 173, which directs platforms how to come to the judgment on what is illegal. My concern is that the criterion for illegality, “reasonable grounds to infer” that elements of the content are illegal, will encourage the tech companies to take down content which is not necessarily illegal but which they infer could be. Indeed, the noble Lord, Lord Allan, gave us a whole list of examples of where that might happen. Unfortunately, in Committee there was little support for a higher bar when asking the platforms to judge what illegal content is. However, I have added my name to Amendment 228, put forward by the noble Lord, Lord Allan, because, as he has just said, it is a much less radical way of enhancing free speech when platforms are not certain whether to take down content which they infer is illegal.

The deletion of part of Clause 173(5) is a moderate proposal. It still leaves intact the definition for the platforms of how they are to make the judgment on the illegality of content, but it takes out the compulsory element in this judgment. I believe that it will have the biggest impact on moderation systems. Some of those systems are run by machines, but many moderation processes, such as those at Meta’s Facebook, involve thousands of human beings. The deletion of the second part of Clause 173(5), which demands that they take down content that they infer is illegal, will give them more leeway to err on the side of freedom of speech. I hope that this extra leeway to encourage free speech will also be included in the way that algorithms moderate our content.

--- Later in debate ---
I hope that this allays the concerns raised by the noble Baroness and the noble Lord, and that the noble Lord will be content to withdraw his amendment.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I remain concerned that people who use more choice words of Anglo-Saxon origin will find their speech more restricted than those who use more Latinate words, such as “inference” and “reasonable”, but the Minister has given some important clarifications.

The first is that no single decision could result in a problem for a platform, so it will know that it is about a pattern of bad decision-making rather than a single decision; that will be helpful in terms of taking a bit of the pressure off. The Minister also gave an important clarification around—I hate this language, but we have to say it—priority versus primary priority. If everything is a priority, nothing is a priority but, in this Bill, some things are more of a priority than others. The public order offences are priority offences; therefore, platforms have a little bit more leeway over those offences than they do over primary priority offences, which include the really bad stuff that we all agree we want to get rid of.

As I say, I do not think that we are going to get much further in our debates today although those were important clarifications. The Minister is trying to give us reasonable grounds to infer that the guidance from Ofcom will result in a gentle landing rather than a cliff edge, which the noble Baroness, Lady Kidron, rightly suggested is what we want. With that, I beg leave to withdraw the amendment.

Amendment 228 withdrawn.
--- Later in debate ---
Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I pay tribute to the noble Baroness, Lady Harding, for her role in bringing this issue forward. I too welcome the government amendments. It is important to underline that adding the potential role of app stores to the Bill is neither an opportunity for other companies to fail to comply and wait for the gatekeepers to do the job nor a one-stop shop in itself. It is worth reminding ourselves that digital journeys rarely start and finish in one place. In spite of the incredible war for our attention, in which products and services attempt to keep us rapt on a single platform, it is quite important for everyone in the ecosystem to play their part.

I have two minor points. First, I was not entirely sure why the government amendment requires the Secretary of State to consult as opposed to Ofcom. Can the Minister reassure me that, whoever undertakes the consultation, it will include children and children’s organisations as well as tech companies? Secondly, like the noble Baroness, Lady Harding, I was a little surprised that the amendment does not define an app store but relies on “the ordinary meaning of” the term. That seems to leave open the possibility of change. If there is a good reason for that—I am sure there is—then it must be made clear that app stores cannot suddenly rebrand as something else and that that gatekeeper function will be kept absolutely front and centre.

Notwithstanding those comments, and associating myself with the idea that nothing should wait until 2025-26, I am very grateful to the Government for bringing this forward.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I will make a brief contribution because I was the misery guts when this was proposed first time round. I congratulate the noble Baroness, Lady Harding, not just on working with colleagues to come up with a really good solution but on seeking me out. If I heard someone be as miserable as I was, I might try to avoid them. She did not; she came and asked me, “Why are you miserable? What is the problem here?”, and took steps to address it. Through her work with the Government, their amendments address my main concerns.

My first concern, as we discussed in Committee, was that we would be asking large companies to regulate their competitors, because the app stores are run by large tech companies. She certainly understood that concern. The second was that I felt we had not necessarily yet clearly defined the problem. There are lots of problems. Before you can come up with a solution, you need a real consensus on what problem you are trying to address. The government amendment will very much help in saying, “Let’s get really crunchy about the actual problem that we need app stores to address”.

Finally, I am a glass-half-full kind of guy as well as a misery guts—there is a contradiction there—and so I genuinely think that these large tech businesses will start to change their behaviour and address some of the concerns, such as getting age ratings correct, just by virtue of our having this regulatory framework in place. Even if today the app stores are technically outside, the fact that the sector is inside and that this amendment tells them that they are on notice will, I think and hope, have a hugely positive effect and we will get the benefits much more quickly than the timescale envisaged in the Bill. That feels like a true backstop. I sincerely hope that the people in those companies, who I am sure will be glued to our debate, will be thinking that they need to get their act together much more quickly. It is better for them to do it themselves than wait for someone to do it to them.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I add my congratulations to the noble Baroness, Lady Harding, on her tenacity, and to the Minister on his flexibility. I believe that where we have reached is pretty much the right balance. There are the questions that the noble Baroness, Lady Harding, and others have asked of the Minister, and I hope he will answer those, but this is a game-changer, quite frankly. Rightly, the noble Baroness has paid tribute to the companies which have put their head above the parapet. That was not that easy for them to do when you consider that those are the platforms they have to depend on for their services to reach the public.

Unlike with the research report, there are reserve powers here that the Secretary of State can use if the report is positive, which I hope it will be. I believe this could be a turning point. The digital markets and consumers Bill is coming down the track this autumn, and that is going to give greater powers to make sure that the app stores can be tackled—after all, there are only two of them and they are an oligopoly. They are the essence of big tech, and they need to function in a much more competitive way.

The noble Baroness talked about timing, and it needs to be digital timing, not analogue. Four years does seem a heck of a long time. I hope the Minister will address that.

Then there is the really important aspect of harmful content. In the last group, the Minister reassured us about systems and processes and the illegality threshold. Throughout, he has tried to reassure us that this is all about systems and processes and not so much about content. However, every time we look, we see that content is there almost by default, unless the subject is raised. We do not yet have a Bill that is actually fit for purpose in that sense. I hope the Minister will use his summer break wisely and read through the Bill to make sure that it meets its purpose, and then come back at Third Reading with a whole bunch of amendments that add functionalities. How about that for a suggestion? It is said in the spirit of good will and summer friendship.

The noble Baroness raised a point about transparency when it comes to Ofcom publishing its review. I hope the Minister can give that assurance as well.

The noble Baroness, Lady Kidron, asked about the definition of app store. That is the gatekeeper function, and we need to be sure that that is what we are talking about.

I end by congratulating once again the noble Baroness and the Minister on where we have got to so far.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I want to congratulate the noble Baroness, Lady Stowell, on her amendments and to raise some concerns, in particular about Amendment 138. I do this as somebody who has had the perhaps unique experience of being leaned on by Governments around the world who sought to give us, as a platform, directions about how to handle content. The risk is real: when there is a huge public outcry and you are an elected politician, you must be seen to be doing something, and the thing that you have been doing to date is to go directly to the platforms and seek to lean on them to make the change that you want.

In future, as the noble Baroness, Lady Stowell, has pointed out quite a few times, we are moving the accountability from the platforms to our independent regulator, Ofcom—and I agree with the noble Baroness, Lady Harding, that that is the right model, as it is an independent regulator. In these amendments we are considering a mechanism whereby that political outrage can still find an outlet, and that outlet will be a direction from the Secretary of State to the regulator asking it to change the guidance that it would otherwise have issued. It is really important that we dig into that and make sure that it does not prevent legitimate political activity but, at the same time, does not replicate the problem that we have had—the lack of transparency about decision-making inside companies, which has been resolved and addressed through leaks and whistleblowers. We do not want to be in a position in which understanding what has been happening in that decision-making process, now inside government, depends on leaks and whistleblowers. Having these directions published seems critical, and I worry a lot about Amendment 138 and how it will potentially mean that the directions are not published.

I have a couple of specific questions around that process to which I hope the Minister can respond. I understand how this will work: Ofcom will send its draft code of practice to the department and, inside the department, if the Secretary of State believes that there is an issue related to national security or there is another more limited set of conditions, they will be able to issue a direction. The direction may or may not have reasons with it; if the Secretary of State trusts Ofcom, they might give their reasons, but if the Secretary of State does not trust Ofcom with the information, they will give it the bare direction with no reasons. Clause 39 gives the Secretary of State the power to either give or withhold reasons, for reasons of national security. Ofcom will then come up with an amended version of the code of practice, reflecting the direction that it has been given.

The bit that I am really interested in is what happens from a Freedom of Information Act point of view. I hope that the Minister can clarify whether it would be possible for an individual exercising their Freedom of Information Act powers to seek the original draft code of practice as it went to the department. The final code of practice will be public, because it will come to us. It may be that we are in a situation in which you can see the original—Ofcom’s draft—and the final draft as it came to Parliament, and the only bit you cannot see under Amendment 138 is the actual direction itself, if the Secretary of State chooses to withhold it. That is quite critical, because we can anticipate that in these circumstances there will be Freedom of Information Act requests and a significant public interest in understanding any direction that was given that affected the speech of people in the United Kingdom. I would expect the ICO, unless there was some compelling reason, to want that original draft from Ofcom to be made public. That is one question around the interaction of the Freedom of Information Act and the process that we are setting out here, assuming that the Secretary of State has withheld their direction.

The other question is whether the Minister can enlighten us as to the circumstances in which he thinks the Secretary of State would be happy to publish the direction. We have said that this is now related only to very narrow national security interests and we have given them that get-out, so I am curious as to whether there are any examples of the kind of direction, in legislating for a power for the Secretary of State, that would meet the narrow criteria of being those exceptional circumstances, yet not be so sensitive—to use the double negative—that the Secretary of State would want to withhold it. If there were some examples of that, it might help assure us that the withholding of publication will be exceptional rather than routine.

My fear is that Amendment 138 says you can withhold in some circumstances. Actually, if we read it all together and say that, by definition, the direction comes from the fact that there is a national security concern, we end up with a situation in which the lack of publication has to be on national security grounds. Those two mirror each other, and therefore the norm may be that directions are never published. The Minister might allay our concerns if he could, at least in general terms, describe the kind of directions that would meet the gateway criteria for being permissible and yet not be so sensitive that the Secretary of State would not be comfortable with them being published.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

My Lords, this is indeed an apposite day to be discussing ongoing ping-pong. I am very happy to speak enthusiastically and more slowly about my noble friend Lady Stowell of Beeston’s Amendments 139 and 140. We are happy to support those, subject to some tidying up at Third Reading. We agree with the points that she has made and are keen to bring something forward which would mean broadly that a statement would be laid before Parliament when the power to direct had been used. My noble friend Lady Harding characterised them as the infinite ping-pong question and the secretive ping-pong question; I hope that deals with the secretive ping-pong point.

My noble friend Lady Stowell’s other amendments focus on the infinite ping-pong question, and the power to direct Ofcom to modify a code. Her Amendments 139, 140, 144 and 145 seek to address those concerns: that the Secretary of State could enter into a private form of ping-pong with Ofcom, making an unlimited number of directions on a code to prevent it from ever coming before Parliament. Let me first be clear that we do not foresee that happening. As the amendments I have spoken to today show, the power can be used only when specific exceptional reasons apply. In that sense, we agree with the intent of the amendments tabled by my noble friend Lady Stowell. However, we cannot accept them as drafted because they rely on concepts— such as the “objective” of a direction—which are not consistent with the procedure for making a direction set out in the Bill.

The amendments I have brought forward mean that private ping-pong between the Secretary of State and Ofcom on a code is very unlikely to happen. Let me set out for my noble friend and other noble Lords why that is. The Secretary of State would need exceptional reasons for making any direction, and the Bill then requires that the code be laid before Parliament as soon as is reasonably practicable once the Secretary of State is satisfied that no further modifications to the draft are required. That does not leave room for the power to be used inappropriately. A code could be delayed in this way and in the way that noble Lords have set out only if the Secretary of State could show that there remained exceptional reasons once a code had been modified. This test, which is a very high bar, would need to be met each time. Under the amendments in my name, Parliament would also be made aware straightaway each time a direction was made, and when the modified code came before Parliament, it would now come under greater scrutiny using the affirmative procedure.

I certainly agree with the points that the noble Lord, Lord Allan, and others made that any directions should be made in as transparent a way as possible, which is why we have tabled these amendments. There may be some circumstances where the Secretary of State has access to information—for example, from the security services—the disclosure of which would have an adverse effect on national security. In our amendments, we have sought to retain the existing provisions in the Bill to make sure that we strike the right balance between transparency and protecting national security.

As the noble Lord mentioned, the Freedom of Information Act provides an additional route to transparency while also containing existing safeguards in relation to national security and other important areas. He asked me to think of an example of something that would be exceptional but not require that level of secrecy. Economic policy and burden to business—the grounds we have now dropped—are the sorts of areas to which I would point him, although a concrete example evades me this afternoon. Those are the areas to which I would turn his attention.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

Can the Minister confirm that the fact that a direction has been made will always be known to the public, even if the substance of it is not because it is withheld under the secrecy provision? In other words, will the public always have a before and after knowledge of the fact of the direction, even if its substance is absent?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

Yes; that is right.

I hope noble Lords will agree that the changes we have made and that I have outlined today as a package mean that we have reached the right balance in this area. I am very grateful to my noble friend Lady Stowell—who I see wants to come in—for the time that she too has given this issue, along with members of her committee.

--- Later in debate ---
Lord Faulks Portrait Lord Faulks (Non-Afl)
- Hansard - - - Excerpts

The report must be read as a whole. I do not accept at all what the noble Lord has said. It is worth visiting the IPSO website, because he was very disparaging about the number of complaints that were upheld. IPSO is very transparent; its website shows all the decisions that were reached and the way in which they were reached. I invite those who doubt its independence to look at the constituent elements of those who are on the complaints committee and the board, and all the published decisions, in order to decide whether IPSO is indeed in the pockets of the press, which seemed to be the suggestion made by both noble Lords.

Of course, the approved regulator, Impress, has very little work to do. I am sure it does its work highly conscientiously. The code by which it regulates is remarkably similar to the editors’ code, which is produced by the industry, it is true, with contributions from all sorts of people. It varies from year to year. There is very little criticism of the editors’ code. It provides a very sensible and balanced framework for holding the press accountable, allowing the complaints committee to decide whether there has been a violation of the code.

The noble Lord, Lord Lipsey, said that at last it has found the press to be in breach of that code in the recent complaint. It was interesting that the complaints body which I chair, which is alleged not to be independent of the press, was roundly criticised by the press—by the Times, the Telegraph and the Daily Mail—for coming to that decision. It is of course independent and will continue to be so.

As for Section 40, before I had anything to do with press regulation, I did not like it. As a lawyer, I find the idea of somebody having a free hit against anybody unattractive. Whatever you think of press regulation, I do not think that Section 40 should commend itself to anybody. As they have promised for some time, the Government are quite right to include its repeal in the media Bill, which is to come before your Lordships’ House in due course. Section 40 has been a sword of Damocles hanging over the industry. It is not helpful, and I hope that it is repealed. I understand that the Labour Party and perhaps the Liberal Democrats will bring back something of that sort, and that they may oppose the repeal when it comes in the media Bill, but that is a matter for them in due course.

Of course, the press should be accountable. Of course, it should be properly regulated. The idea of an independent regulator is to provide reassurance that it is being regulated—unlike social media which, until this Bill becomes law, is not regulated and which provides a source of news considerably less reliable than all those newspapers which are subject to regulation.

This is not the occasion to go into further debates about Leveson, but it is perhaps worth rereading the Leveson report and the conclusions that Sir Brian reached—which I have done recently. It must be seen, like all reports, as very much of its time. It is particularly interesting to see the extent to which he promoted and advanced the cause of arbitration. Alternative dispute resolution is very much at the centre of what the legal profession as a whole, and Sir Brian Leveson and his committee in particular, advance as a much better way to resolve disputes. There is an arbitration scheme provided by IPSO, as noble Lords and the House may know. Of course, that is an option which we would encourage people to use—consistent with what Sir Brian and his committee recommended. It is not a substitute for going to court, and if people want to, they should be allowed to do so. However, I think there is a case for courts considering directions whereby somebody seeking relief in the court should first show that they have exhausted alternative remedies, including alternative dispute resolution. I am in favour of that.

On the idea of being Leveson-compliant—I do not think Sir Brian Leveson particularly likes that expression. He made various recommendations, many of which are reflected in what IPSO does now. I understand there is a great deal of history in this debate. I remember the debates myself. No doubt, we will return to them in due course, but I think we should fight today’s battles, and not the battles of 10 years ago or longer. I think the press is much more accountable and responsible than it was. Of course, as parliamentarians, we will carefully watch what the press do and consider carefully whether this exemption is merited. However, I do not think that this amendment is justified and I hope that the Government do not support it.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I want to bring the tone of the debate down somewhat to talk about government Amendments 158 and 161 in a rather nerdier fashion. I hope that the House will be patient with me as I do that.

The Minister said that these two amendments introduce some “minor changes” that would make the Bill work as intended. I want to explore whether they are rather more significant than the Minister has given them credit for, and whether they may have unintended consequences. As I understand it, the purpose of the amendments is to ensure that all forms of video and audio content, in long form or short form, whether originally broadcast or made exclusively for social media, will now benefit from the news publisher exemptions.

Particularly thinking about this from a social media point of view—the noble Lord, Lord Faulks, just made the point about news publishers such as newspapers—when we have been looking at the Bill and the news publisher exemption, we have been thinking of the BBC and newspapers. We have been thinking a lot less about people who regard themselves to be news publishers but produce things exclusively for social media—often in a clickbait fashion, using a lot of short-form material. As I read these amendments, they are saying very clearly that this kind of material will benefit from the news publisher exemption. That opens up a whole series of questions we must ask ourselves about whether that will have unintended consequences.

Looking at this in the context of what it takes to be registered as a news publisher in Clause 50, the noble Viscount, Lord Colville, referred to the fact that there is an intention and a view that Clause 50 should be kept broad so that people can register as news publishers. Clearly, that is good for media diversity, but if we look at those tests, they are tests that I think a lot of organisations could pass. We must ask ourselves who might try to establish themselves as a recognised news publisher. They would need to have an office in the United Kingdom. They would also need to apply a standards code, but Clause 50(6)(b) says that the standards code can be their own standards code—it does not have to be anyone else’s.

I am not going to get into a debate about who should be the press regulator; that is for other noble Lords. As I read it, these internet services could pass the Clause 50(2) test by establishing the office and meeting a few basic requirements, then under Clause 50(6)(b) say, “I’ve got a standards code. It’s my standards code. I’ve written it—on the back of an envelope, but it’s a standards code”. Then we need to think about who might want to take advantage of that. My reading of the Bill, thinking about intention, is that services such as Breitbart News—which is not my cup of tea, but is a recognised news publisher—would pass the test and would be able to establish themselves as a news publisher in the UK, benefiting from the exemptions. Whether or not I agree with it, I can see that that is a reasonable, if unintended, outcome.

My concern is about other services, such as Infowars, which I am sure everybody is familiar with. It is a service that has caused untold harm—so much harm that it has found itself on the wrong end of defamation lawsuits in the US courts, which is a pretty high bar. I do not think it should in any way be our intention that a service such as Infowars should be able to benefit from the special privileges granted to news publishers under the legislation. I know that it is hard to draw lines, and I am not expecting the Minister to say at the Dispatch Box exactly where the line should be drawn. However, I think that without citing examples such as that, we risk not testing the legislation to destruction—which is precisely what we should be doing here—and ending up in a scenario where we have created a news publisher exemption that could be taken advantage of by the wrong organisations. Someone has to draw a line and make a classification.

As we create this news publisher exemption, it is incumbent on us to describe it to people out there in vernacular terms they would understand. My understanding is that the BBC, the Daily Mail, Breitbart News—all those are in. We expect them to be able to pass the Clause 50 test and we have no problem with that. Russia Today, Infowars and a whole host of other services that brand themselves news but are incredibly harmful and destructive to society and individuals—we would want them to fail the Clause 50 test.

I hope the Minister will at least acknowledge that there is going to be a challenge around bad services run by bad people claiming to be news publishers under Clause 50. I hope he will agree that it is not our intention to give publisher privileges to services such as Infowars that cause so much harm to society.

--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I was unfortunately unable to attend round 1 of this debate—I had to leave. My noble friend Lord Knight has absented himself from hearing what I am going to say about his remarks, so he must fear that he had got his lines wrong. I apologised to him for leaving him a bit exposed, because we had not quite anticipated how the conversation would go, but I think he did as well as he could, and I repeat the lines he said: this is not the right Bill to rerun the arguments about the Leveson report. I still believe that. The noble Lord, Lord Clement-Jones, does not think the media Bill is; maybe it is not, but at least we can make sure that the debate is properly argued.

It is interesting that, although we clearly have well-defined positions and antipathies present in the debate, a number of things have been said today that will be helpful, if we manage to get a little closer, in trying to resolve some of the issues outstanding. If I am still around and involved in it, I will approach this by trying to see what we can do together rather than the rights and wrongs of positions we have adopted before. It has worked for this Bill: we have achieved huge changes to the Bill because we decided from the start that we would try to see what was the best that could come out of it. That is the instinct I have as we go forward to any future debate and discussion, whether or not it is on the media Bill.

The puzzling thing here is why this is such a continuing concern that it needs to be brought into any opportunity we have to discuss these areas. The sense we had in the pre-legislative scrutiny committee, which discussed this to some extent but not in quite the same range as we have tonight, or even in Committee, was that the issues raised in this Bill were really about protecting freedom of expression. At that stage, the Bill still had the legal but harmful clauses in it, so those issues had perhaps had less exposure in the debate we had. I still think it is primarily about that. I still have real concerns about it, as have been raised by one or two people already in our discussion. I do not think the recognised news publisher definition is a good one; I do not think the definition of a journalist is a good one. The pre-legislative scrutiny committee wanted an objective test of material based around public interest, but the Government would not accept that, so we are where we are. We must try to ensure that what we have in the Bill works in relation to the topics before it.

The primary purpose must be to ensure the availability of material that will inform and enhance our knowledge about democracy, current affairs and issues that need to be debated in the public space. It is clearly right that that which is published by recognised journalists—quality journalists is another phrase that has been used—should be protected, perhaps more than other material, but at the fringes there are still doubts as to whether the Bill does that.

I had taken it that in the amendments I signed up to, government Amendments 158 and 161, the material we were talking about was from recognised news publishers, not material self-generated in social media. I am looking hard at the Minister hoping he will be able to come to my aid when he comes to respond. The issue here is about making sure that material that was not originally broadcast but is still provided by a recognised news publisher is protected from being taken down, and it would not have been if those amendments were not made. I hope that is the right interpretation. That was the basis on which I signed up for them; I do not know quite where it leaves me if that is wrong.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

As I opened up that question, just to be clear, I was saying that it is exactly right that an individual user would not be covered, but I was trying to suggest that a social media-only news service that does not exist as a publication or a broadcaster outside social media, if it meets the Clause 50 test to be a recognised news publisher, would be given extra scope under the amendments.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

I hope they do not, and I think the Minister has to answer that question quite directly. The issue here is about quality material that would otherwise be taken down being kept in place so that we can all as a society be informed by that. That does not mean it needs to be from particular sources that we know to be egregious or running material which is certainly not in the public interest. Again, I make the point that that would have been a better way of approaching this in the legislation, but I take the point made by the noble Lord, Lord Allan, who knows his stuff—I often think we ought to bottle him and carry it around so we can take a whiff of his expertise and knowledge every time we get stuck on a problem, but I am not quite sure how we manage that.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

My Lords, child sexual exploitation or abuse is an abhorrent crime. Reporting allows victims to be identified and offenders apprehended. It is vital that in-scope companies retain the data included in reports made to the National Crime Agency. This will enable effective prosecutions and ensure that children can be protected.

The amendments in my name in this group will enable the Secretary of State to include in the regulations about the reporting of child sexual exploitation or abuse content a requirement for providers to retain data. This requirement will be triggered only by a provider making a report of suspected child sexual exploitation or abuse to the National Crime Agency. The provider will need to retain the data included in the report, along with any associated account data. This is vital to enabling prosecutions and to ensuring that children can be protected, because data in reports cannot be used as evidence. Law enforcement agencies request this data only when they have determined that the content is in fact illegal and that it is necessary to progress investigations.

Details such as the types of data and the period of time for which providers must retain this data will be specified in regulations. This will ensure that the requirement is future-proofed against new types of data and will prevent companies retaining types of data that may have become obsolete. The amendments will also enable regulations to include any necessary safeguards in relation to data protection. However, providers will be expected to store, process and share this personal data within the UK GDPR framework.

Regulations about child sexual exploitation or abuse reporting will undergo a robust consultation with relevant parties and will be subject to parliamentary scrutiny. This process will ensure that the regulations about retaining data will be well-informed, effective and fit for purpose. These amendments bring the child sexual exploitation and abuse reporting requirements into line with international standards. I beg to move.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, these seem very sensible amendments. I am curious about why they have arrived only at this stage, given this was a known problem and that the Bill has been drafted over a long period. I am genuinely curious as to why this issue has been raised only now.

On the substance of the amendments, given that we are now going to have 20,000 to 25,000 regulated entities in scope, some of which will never have encountered child sexual exploitation or abuse material or understood that they have a legal duty in relation to it, it seems entirely sensible and helpful for them to have a clear set of regulations that tell them how to treat that material.

Child sexual exploitation or abuse material is toxic in both a moral and a legal sense. It needs to be treated almost literally as toxic material inside a company, and sometimes that is not well understood. People feel that they can forward material to someone else, not understanding that in doing so they will break the law. I have had experiences where well-meaning people acting in a vigilante capacity sent material to me, and at that point you have to report them to the police. There are no ifs or buts. They have committed an offence in doing so. If you work inside a company, your computer has to be quarantined, taken away and cleaned, just as it would be for any other toxic material, because we framed the law, quite correctly, to say that we do not want to offer people the defence of saying “I was forwarding this material because I’m a good guy”. Forwarding the material is a strict liability offence, so to have regulations that explain, particularly to organisations that have never dealt with this material, exactly how they have to deal with it in order to be legally compliant will be extremely helpful.

One thing I want to flag is that there are going to be some really fundamental cross-border issues that have to be addressed. In many instances of child sexual exploitation or abuse material, the material has been shared between people in different jurisdictions. The provider may not be in a UK jurisdiction, and we have got to avoid any conflicts of laws. I am sure the Government are thinking about this, but in drafting those regulations, what we cannot do, for example, is order a provider to retain data in a way that would be illegal in the jurisdiction from which it originates or in which it has its headquarters. The same would apply vice versa. We would not expect a foreign Government to order a UK company to act in a way that was against UK law in dealing with child sexual exploitation or abuse material. This all has to be worked out. I hope the Government are conscious of that.

I think the public interest is best served if the United Kingdom, the United States and the European Union, in particular, adopt common standards around this. I do not think there is anything between us in terms of how we would want to approach child sexual exploitation or abuse material, so the extent to which we end up having common legal standards will be extraordinarily helpful.

As a general matter, having regulations that help companies with their compliance is going to be very helpful. I am simply curious as to why this amendment has arrived only at this very late stage.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I rise to make a slightly lesser point, but I also welcome these amendments. I want to ask the Minister where the consultation piece of this will lie and to check that all the people who have been in this space for many years will be consulted.

Online Safety Bill

Lord Allan of Hallam Excerpts
Finally, last week, at the invitation of the right reverend Prelate the Bishop of Gloucester, the Minister and I attended an event at which we were addressed by children about the pressures they felt from social media. I thank all the young people present for the powerful and eloquent way in which they expressed the need for politicians and religious, civic and business leaders to do more to detoxify the digital world. If they are listening, as they said they would, I want to assure them that all of us in this Chamber hear their concerns. Importantly, when I asked Oliver, aged 12, and Arthur, aged 13, what one thing we could and should do to make their online world better, they said, “Make age checking meaningful”. Today, we are doing just that.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I shall follow on directly from some of the comments of the noble Baroness, Lady Kidron, around privacy. I shall not repeat the arguments around children and pornography but touch on something else, which is the impact of these amendments on the vast majority of internet users, the 85%-plus who are 18 or older. Of course, when we introduce age assurance measures, they will affect everyone: we should not kid ourselves that it is only about children, because everyone will have to pass through these gateways.

I shall speak particularly to Amendments 184 and 217 on privacy. I am sure that most adults will support extra safety measures for children, but they also want to be able to access a wide range of online services with the least possible friction and the lowest risk to their own personal data. We can explore how this might work in practice by looking at something that we might all do in this Chamber. Looking round, I believe that we are all at least 18 years old, and we might find ourselves idly passing the time creating an account on a new user-to-user or search service that has been recommended. We should consider this group of amendments by thinking about how that might play out. In future, the services will have to check that we are in the United Kingdom—there is a range of ways in which they can do that. Having confirmed that, they will need to understand whether we are 18-plus or a child user so that they can tailor their service appropriately.

I hope we all agree that the services should not be asking us for passports or driving licences, for example, as that would be entirely contrary to the thrust of privacy regulations and would be a huge gateway to fraud and other problems. The most efficient way would be for them to ask us for some sort of digital certificate—a certificate that we have on our devices where we have proven to a trusted third party that we are 18-plus. The certificate does not need to contain any personal data but simply confirms that we are of age. That is very similar to the way in which secure websites work: they send a digital certificate to your browser and you verify that certificate with a trusted third party—a certificate authority—and then you can open an encrypted connection. We are reversing the flow: the service will ask the user for a certificate and then verify that before granting access. A user may have a setting on their device in future where they confirm whether they are happy for their 18-plus certificate to be given to anybody or whether they would like to be asked every time—there will be a new set of privacy controls.

Building the infrastructure for this is non-trivial. Many things could go wrong, but at least the kind of model I am describing has some hope of achieving widespread adoption. It is very good for adult users, as they can continue to have a frictionless experience as long as they are happy for their device to send a certificate to new services. It is good for the market of internet services if new services can bring users on easily. It is good for privacy by avoiding lots of services each collecting personal data, as most people access a multiplicity of services. Perhaps most importantly in terms of the Bill’s objectives, it is good for children if services can separate out the vast majority of their users who are 18-plus and then focus their real efforts on tailoring the services for the minority of users who will be children, for whom the Bill will introduce a whole set of new obligations.

We should not underestimate the scale of the challenge in practice; it will work only if major internet companies are willing to play the game and get into the market of offering 18-plus certificates. Companies such as Google, Meta, Amazon, Apple and Microsoft—the ones we normally love to hate—will potentially provide the solution, as well as not-for-profits. There will be foundations for those who object to the big internet companies, but it is those big internet companies which will have the reach; they each have millions of users in the United Kingdom. This is not to fly the flag for those companies; it is simply a question of efficiency. I suspect that everyone in the Chamber uses a combination of services from those big providers. We already share with them the personal data necessary for age assurance, and there would be no additional sharing of data. If they were willing to provide a certificate, they could do so at the kind of scale necessary for the 50 million or so adult internet users in the United Kingdom to be able to get one easily and then pass it to services when they choose to access them.

There may be some discomfort with big tech playing this role, but I cannot see the kind of aggressive targets that we are setting in the amendments working unless we take advantage of those existing platforms and use them to make this work. Amendment 230 tells us that we have about 18 months, which is very soon in terms of trying to build something. We should be clear that if we are to deliver this package it will depend on persuading some of those big names in tech to create age certification schemes for UK users.

For this to have widespread adoption and a competitive market, we need it to be free of direct financial cost to individual users and to the services choosing to age-verify, as we have asked them to do. We need to think very carefully about that, as it raises a whole series of competition questions that I am sure Ofcom and the Competition and Markets Authority will have to address, not least because we will be asking companies to provide, free of charge, age certification that will be used by their existing and future competitors to meet their compliance requirements.

There may be some listening who think that we can rely on small age-assurance start-ups. Some of them have a really important role to play and we should be proud of our homegrown industry, but we should be realistic that they will reach scale only if they work with and through the large service providers. Many of them are already seeking those kinds of relationship.

As a test case, we might think of an application such as Signal, a messaging app that prides itself on being privacy-first. It does not want to collect any additional information from its users, which is perfectly reasonable, given where it is coming from. It will be really interesting to see how comfortable such a service will be with working with certification schemes, under which it can prove that users are over 18 by taking advantage of the data held by other services which collect significant amounts of data and have a very good idea of how old we are.

I have not focused on under-18s but, once this system is in place, application providers will be thinking very carefully about the pros and cons of allowing under-18s on at all. I know that the noble Baroness, Lady Kidron, is also concerned about this. There will be services that will think very carefully, if they find that the vast majority of their users are 18-plus, about the extent to which they want to put time and effort into tailoring them for users under 18. We do not intend that outcome from the Bill, but we need realistically to consider it.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

Just to be clear, I say that the purpose of my question to the Minister was to get at the fact that, for low-risk situations, there can be age assurance that is a lot less effective or intrusive, for that very reason.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

I agree; that is very helpful. I think Amendments 74, 93 and 99 also talk about the exclusion, as the noble Baroness raised, of services from the child safety duties if they can show that they are only 18-plus. It will be quite material and critical to know at what level they can demonstrate that.

I have avoided talking about pornography services directly, but there are interesting questions around what will happen if this model develops, as it likely will. If big tech is now starting to provide age certification for the kinds of mainstream services we may all want to access, they may be much less comfortable providing that same certification to pornography providers, for reputational reasons. A mainstream provider would not want to enter that market. Ofcom will need to take a view on this. We have talked about interoperability in the framework we have created, but it is a big question for Ofcom whether it wants to steer all age certification providers also to provide 18-plus certification for pornography providers or, effectively, to allow two markets to develop—one for mainstream certification and one for certification for pornography.

I have taken a few minutes because this is a very high-risk area for the Bill. There are material risks in willing into existence a model that depends on technical infrastructure that has not yet been built. The noble Lord, Lord Bethell, referred to prior experience; one of the reasons why we have not delivered age assurance before is that the infrastructure was not there. We now want it built, so we must recognise that it is quite a high-risk endeavour. That does not mean it is not worth attempting, but we must recognise the risks and work on them.

If the implementation is poor, it will frustrate adult users, which may bring the Bill into disrepute. We need to recognise that as a genuine risk. There are people out there already saying that the Bill means that every internet service in the world will ask you for your passport. If that is not the case, we need to stress that we do not expect that to happen. There are also potentially significant impacts on the market for online services available to both adults and children in the UK, depending on the design of this system.

The purpose of thinking about some of these risks today is not to create a doom-laden scenario and say that it will not work. It is entirely the opposite: to say that, if we are to move ahead into a world in which children are protected from harmful content—for which very good reasons have been articulated and a huge amount of work has gone ahead—and in which services can tailor and gear access to the age of the child, we have to be able to take the 18-plus users out of that, put them into a separate box and do so in a really easy, straightforward manner. If not, the 18-plus side will end up dragging down what we want to do for those under age.

I hope that explanation helps in the context of these amendments. We will need to test them against it as implementation happens over the next few months.

--- Later in debate ---
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

I just realised that I forgot to thank the Government for Amendment 271, which reflected something I raised in Committee. I will reflect back to the Minister that, as is reinforced by his response now, it goes precisely where I wanted it to. That is to make sure—I have raised this many times—that we are not implementing another cookie banner, but are implementing something and then going back to ask, “Did it work as we intended? Were the costs proportionate to what we achieved?” I want to put on the record that I appreciate Amendment 271.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I appreciate the noble Lord’s interjection and, indeed, his engagement on this issue, which has informed the amendments that we have tabled.

In relation to the amendment of the noble Baroness, Lady Fox, as I set out, there are already robust safeguards for user privacy in the Bill. I have already mentioned Amendment 124, which puts age-assurance principles in the Bill. These require Ofcom to have regard, when producing its codes of practice on the use of age assurance, to the principle of protecting the privacy of users, including data protection. We think that the noble Baroness’s amendment is also unnecessary. I hope that she and the noble Baroness, Lady Kidron, will be willing to not move their amendments and to support the government amendments in the group.

--- Later in debate ---
I simply ask the Minister to reflect and look carefully at this and, frankly, the illogicality of the Government’s current approach to see whether we can yet again improve the Bill—as he has on so many occasions.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I follow the noble Lord, Lord Russell, particularly in talking about Amendments 43, 87 and 242, which raise some interesting and quite profound questions on what we are expecting from the market of internet services once the Online Safety Bill is in place.

It is worth taking a moment to remind ourselves of what we do and do not want from the Bill. We want services that are causing harm and are unwilling to take reasonable steps to address that to leave the UK market. That is clear. As a result of this legislation, it is likely that some services will leave the UK market, because we have asked them to do reasonable things and they have said no; they are not willing to comply with the law and therefore they need to be out. There is a whole series of measures in the Bill that will lead to that.

Equally, we want services that are willing to take reasonable steps to stay in the UK market, do the risk assessments, work at improvements and have the risks under control. They may not all be resolved on day one—otherwise, we would not need the legislation—but they should be on a path to address the risks that have been identified. We want those people to be in the market, for two reasons.

The first is that we want choice for people; we do not take pleasure in shutting people who are providing services out of the market. Also, from a child safety point of view, there is a genuine concern that, if you limit choice too far, you will end up creating more of a demand for completely unregulated services that sit outside the UK and will fill the gap. There is a balance in making sure that there is a range of attractive services, so that teenagers in particular feel that their needs are being met. We want those services to be regulated and committed to improvement.

Something that is in between will be a hard decision for Ofcom—something that is not great today, but not so bad that we want it out tomorrow. Ofcom will have to exercise considerable judgment in how it deals with those services. This is my interpretation of where proportionality and capacity come in. If you are running a very large internet service, something such as PhotoDNA, which is the technology that allows you to scan photos and detect child abuse images, is relatively straightforward to implement. All the major providers do it, but there are costs to that for smaller services. There are some real capacity challenges around implementing those kinds of technology. It is getting better over time and we would like them to do it, but you would expect Ofcom to engage in a conversation, as a smaller service—smaller not in terms of its users but in its engineers and capacity—may need a little longer to implement such a technology.
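
For illustration of the general hash-and-match approach that such scanning relies on—this is emphatically not the proprietary PhotoDNA algorithm, and the toy hash function, threshold and hash list below are invented purely for the example—a service computes a compact fingerprint of each uploaded image and checks whether it falls within a small distance of any fingerprint on a list of known illegal images supplied to it:

```python
# Toy stand-in for perceptual hash matching; not PhotoDNA, purely illustrative.
from typing import List

def average_hash(pixels: List[List[int]]) -> int:
    """Hash an 8x8 grid of greyscale values (0-255) into a 64-bit fingerprint."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for value in flat:
        bits = (bits << 1) | (1 if value >= mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two fingerprints."""
    return bin(a ^ b).count("1")

def matches_known_list(image_hash: int, known_hashes: List[int], tolerance: int = 5) -> bool:
    """Flag the image if its hash is within `tolerance` bits of any known hash."""
    return any(hamming_distance(image_hash, known) <= tolerance for known in known_hashes)

# Hypothetical usage: in practice `known_hashes` would come from an external hash list.
known_hashes = [0x0F0F0F0F0F0F0F0F]
upload = [[min(255, (x + y) * 16) for x in range(8)] for y in range(8)]
print(matches_known_list(average_hash(upload), known_hashes))
```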

A larger service could do proactive investigations. If it has a large team, once it has identified that something is problematic, it can investigate proactively. Again, a smaller service may not have the bodies on the ground to do that, but you would hope it would develop that capacity. It is important to recognise something about capacity if we are to encourage those that are half way between to come to the light side rather than slip off to the dark side.

I am interested in the Minister’s interpretation of these words and the instruction to Ofcom. We will be dependent on Ofcom, which will sit on the other side of a real or virtual table with the people who run these companies, as Ofcom can insist that they come in and talk to it. It will have to make these judgments, but we do not want it to be conned or to be a walkover for an organisation that has the capacity or could do things that are helpful, but is simply refusing to do them or somehow trying to pull the wool over Ofcom’s eyes.

Equally, we do not want Ofcom to demand the impossible of a service that genuinely struggles to meet a demand and that has things broadly under control. That is the balance and the difficult judgment. I think we are probably aiming for the same thing, and I hope the Minister is able to clarify these instructions and the way the Government expect Ofcom to interpret them. We are looking for that point at which Ofcom is seriously demanding but does not get overbearing and unnecessarily drive out of the market people who are making best efforts to do their risk assessments and then work hard to resolve those risks.

--- Later in debate ---
Lord Hope of Craighead Portrait Lord Hope of Craighead (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I speak to Amendments 286 and 294, which are the last two amendments in this group, and I will explain what they are about. They are in the name of the noble Baroness, Lady Fraser of Craigmaddie, who unfortunately cannot be here this evening, to which I and the noble Lord, Lord Stevenson of Balmacara, have added our names, as has the Minister, for which we are very grateful. They serve a simple purpose: they seek to insert a definition of the phrase “freedom of expression” into the list of definitions in Clause 211 and add it to the index of defined expressions in Clause 212.

They follow an amendment which I proposed in Committee. My amendment at that stage was to insert the definition into Clause 18, where the phrase

“freedom of expression within the law”

appears. It was prompted by a point made by the Constitution Committee in its report on the Bill, which said that the House might wish to consider defining that expression in the interests of legal certainty.

The same point arose when the House was considering the then Higher Education (Freedom of Speech) Bill. Following a similar amendment by me, a government amendment on Report, to achieve the same result, was agreed to that Bill. My amendment in Committee on this Bill adopted the same wording as the government amendment to that Bill. In his response to what I said in Committee, the Minister pointed out, quite correctly, that the Higher Education (Freedom of Speech) Act and this Bill serve quite different purposes, but he did say that the Bill team—and he himself—would consider our amendment closely between then and Report.

What has happened since is the amendment we are now proposing, which has undergone some changes since Committee. They are the product of some very helpful discussions with the Bill team. The most important is that the definition placed in Clause 211 extends to the use of the expression “freedom of expression” wherever it appears in the Bill, which is obviously a sensible change. It also now includes the word “receive” as well as the word “impart”, so that it extends to both kinds of communication that are within the scope of the Bill. The words “including in electronic form”, which are in my amendment, have been removed as unnecessary, as the Bill is concerned with communications in electronic form only.

There are also two provisions in the Bill which refer to freedom of expression to which, as the definition now makes clear, this definition is not to apply. They are in Clauses 36(6)(f) and 69(2)(d). This is because the context in which the expression is used there is quite different. They require Ofcom to consult people with expertise as to this right when preparing codes of conduct. They are not dealing with the duties of providers, which is what the definition aims to do.

As the discussion in Committee showed, and as the noble Baroness, Lady Fox, demonstrated again this evening, we tend to use the phrases “freedom of speech” and “freedom of expression” interchangeably, perhaps without very much thought as to what they really mean and how they relate to other aspects of the idea. That is why legal certainty matters when they appear in legislation. The interests of legal certainty will be met if this definition finds a place in the Bill, and it makes it clear that the reference is to the expression referred to in Article 10(1) of the convention as it has effect for the purposes of the Human Rights Act. That is as generous and comprehensive a definition as one would wish to have for the purposes of the Bill.

I am grateful to the Minister for his support and to the Bill team for their help. When the time comes, either the noble Baroness, Lady Fraser, or I will move the amendment; it comes at the very end of the Bill, so it will be at the last moment of the last day, when we are finishing Report. I look forward to that stage, as I am sure the Minister does himself.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I want to respond to some of the comments made by the noble Baroness, Lady Fox, and the noble Lord, Lord Moylan. I have been looking forward to this debate equally, as it touches on some crucial issues. One of the mistakes of the Bill that I place on the Government is that it was sold as somehow a balancing Bill. It is not; it is a speech-limiting Bill, as all Bills of this kind are. Its primary purpose is to prevent people in the United Kingdom encountering certain types of content.

If you support the Bill, it is because you believe that those restrictions are necessary and proportionate in the context of Article 10. Others will disagree. We cannot pretend that it is boosting free speech. The United States got it right in its First Amendment: if you want to maximise speech, you prohibit your parliament from regulating speech at all—“Congress shall make no law … abridging the freedom of speech”. As soon as you start regulating, you tend towards limitations; the question in the UK and European contexts is whether those limitations are justified and justifiable.

--- Later in debate ---
Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

I understand the point the noble Lord is making but, if he were thrown out, sacked or treated in some other way that was incompatible with his rights to freedom of expression under Article 10 of the European convention, he would have cause for complaint and, possibly, cause for legal redress.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

That point is well made. In support of that, if the public space treated me in a discriminatory way, I would expect to have redress, but I do not think I have a right in every public space to say everything I like in the classic Article 10 sense. My right vis-à-vis the state is much broader than my right vis-à-vis any public space that I am operating in, where norms apply as well as my basic legal rights. Again, to take the pub example, if I went in and made a racist speech, I may well be thrown out of the pub even though it is sub-criminal and the police are never called; they do not need to be, as the space itself organises it.

I am making the point that terms of service are about managing these privately managed public services, and it would be a mistake to equate them entirely with our right to speak or the point at which the state can step in and censor us. I understand the point about state interference but it cuts both ways: both the state interfering in excessively censoring what we can say but also the state potentially interfering in the management of what is, after all, a private space. To refer back to the US first amendment tradition, a lot of that was about freedom of religion and precisely about enabling heterodoxy. The US did not want an orthodoxy in which one set of rules applied everywhere to everybody. Rather, it wanted people to have the right to dissent, including in ways that were exclusive. You could create your own religious sect and you could not be told not to have those beliefs.

Rolling that power over to the online world, online services, as long as they are non-discriminatory, can have quite different characters. Some will be very restrictive of speech like a restrictive religious sect; some will be very open and catholic, with a small “c”, in the sense of permitting a broad range of speech. I worry about some of the amendments in case there is a suggestion that Ofcom would start to tell a heterodox community of online services that there is an orthodox way to run their terms of service; I would rather allow this to be a more diverse environment.

Having expressed some concerns, I am, though, very sympathetic to Amendment 162 on Section 5 of the Public Order Act. I have tried in our debates to bring some real experience to this. There are two major concerns about the inclusion of the Public Order Act in the Bill. One is a lack of understanding of what that means. If you look at the face of the language that has been quoted at us, and go back to that small service that does not have a bunch of lawyers on tap, it reads as though it is stopping any kind of abusive content. Maybe you will google it, as I did earlier, and get a little thing back from the West Yorkshire Police. I googled: “Is it illegal to swear in the street?”. West Yorkshire Police said, “Yes, it is”. So if you are sitting somewhere googling to find out what this Public Order Act thing means, you might end up thinking, “Crikey, for UK users, I have to stop them swearing”. There is a real risk of misinterpretation.

The second risk is that of people deliberately gaming the system; again, I have a real-life example from working in one of the platforms. I had people from United Kingdom law enforcement asking us to remove content that was about demonstrations by far-right groups. They were groups I fundamentally disagree with, but their demonstrations did not appear to be illegal. The grounds cited were that, if you allow this content to go ahead and the demonstration happens, there will be a Public Order Act offence. Once you get that on official notepaper, you have to be quite robust to say, “No, I disagree”, which we did on occasion.

I think there will be other services that receive Public Order Act letters from people who seem official and they will be tempted to take down content that is entirely legal. The critical thing here is that that content will often be political. In other parts of the Bill, we are saying that we should protect political speech, yet we have a loophole here that risks that.

I am sure the Minister will not concede these amendments, but I hope he will concede that it is important that platforms are given guidance so that they do not think that somebody getting upset about a political demonstration is sufficient grounds to remove the content as a Public Order Act offence. If you are a local police officer it is much better to get rid of that EDL demonstration, so you write to the platform and it makes your life easier, but I do not think that would be great from a speech point of view.

Finally, I turn to the point made by the noble Lord, Lord Moylan, on Amendment 188 about the ECHR Article 8 exemption. As I read it, if your terms of service are not consistent with ECHR Article 8—and I do not think they will be for most platforms—you then get an exemption from all the other duties around appeals and enforcing them correctly. It is probably a probing amendment but it is a curious way of framing it; it essentially says that, if you are more restrictive, you get more freedom in terms of the Ofcom relationship. I am just curious about the detail of that amendment.

It is important that we have this debate and understand this relationship between the state, platforms and terms of service. I for one am persuaded that the general framework of the Bill makes sense; there are necessary and proportionate restrictions. I am strongly of the view that platforms should be allowed to be heterodox in their terms of service. Ofcom’s job is very much to make sure that they are done correctly but not to interfere with the content of those terms of service beyond that which is illegal. I am persuaded that we need to be extraordinarily careful about including Public Order Act offences; that particular amendment needs a good hearing.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I have said several times when we have been debating this Bill—and I will probably say it again when we get to the group about powers—that, for me, the point of the Online Safety Bill is to address the absence of accountability for the extraordinary power that the platforms and search engines have over what we see online and, indeed, how we live and engage with each other online. Through this Bill, much greater responsibility for child safety will be placed on the platforms. That is a good thing; I have been very supportive of the measures to ensure that there are strong protections for children online.

The platforms will also have responsibility, though, for some measures to help adults protect themselves. We must not forget that, the more responsibility that platforms have to protect, the more power we could inadvertently give them to influence what is an acceptable opinion to hold, or to shape society to such an extent that they can even start to influence what we believe to be right or wrong—we are talking about that significant amount of power.

I was of the camp that was pleased when the Government removed the legal but harmful aspects of the Bill, because for me they represented a serious risk to freedom of expression. As I just described, I felt that they risked too much inadvertent power, as it were, going to the platforms. But, with the Government having done that, we have seen through the passage of the Bill some push-back, which is perfectly legitimate and understandable—I am not criticising anyone—from those who were concerned about that move. In response to that, the Government amended the Bill to provide assurances and clarifications on things like the user-empowerment tools. As I said, I do not have any problem; although I might not necessarily support some of the specific measures that were brought forward, I am okay with that as a matter of principle.

However, as was explained by my noble friend Lord Moylan and the noble Baroness, Lady Fox, there has not been a similar willingness from the Government to reassure those who remain concerned about the platforms’ power over freedom of expression. We have to bear in mind that some people’s concerns in this quarter remained even when the legal but harmful change was made—that is, the removal of legal but harmful was a positive step, but it did not go far enough for some people with concerns about freedom of expression.

I am sympathetic to the feeling behind this group, which was expressed by my noble friend and the noble Baroness, Lady Fox. I am sympathetic to many of the amendments. As the noble Lord, Lord Allan of Hallam, pointed out, Amendment 162 in particular, in relation to the Public Order Act, seems worthy of further consideration by the Government. But the amendments in the group that caught my attention place a specific duty on Ofcom in regard to freedom of expression when drawing up or amending codes of practice or other guidance—these amendments are in my noble friend Lord Moylan’s name. When I looked at them, I did not think that they undermined anything else that the Government brought forward through the amendments to the Bill, as he said, but I thought that they would go a long way towards reinforcing the importance of freedom of expression as part of this regulatory framework—one that we expect Ofcom to attach serious importance to.

I take on board what the noble Lord, Lord Allan, said about the framework of this legislation being primarily about safeguarding and protection. The purpose of the Bill is not to enhance freedom of expression, but, throughout its passage, that has none the less always been a concern. It is right that the Government seek to balance these two competing fundamental principles. I ask whether more can be done—my noble friend pointed to the recommendations of the Equality and Human Rights Commission and how they reinforce some of what he proposed. I would like to think that my noble friend the Minister could give some greater thought to this.

As was said, it is to the Government’s credit how much they have moved on the Bill during its passage, particularly between Committee and Report. That was quite contrary to the sense that I think a lot of us felt during the early stages of our debates. It would be a shame if, once the Bill leaves the House, it is felt that the balance is not as fine—let me put it like that—as some people feel it needs to be. I just wanted to express some support and ask my noble friend the Minister to give this proper and serious consideration.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

As I set out, I think my noble friend and the noble Baroness, Lady Fox, are not right to point to the European Convention on Human Rights here. That concerns individuals’ and entities’ rights

“to receive and impart ideas without undue interference”

by public authorities, not private entities. We do not see how a service provider deciding not to allow certain types of content on its platform would engage the Article 10 rights of the user, but I would be very happy to discuss this further with my noble friend and the noble Baroness in case we are talking at cross-purposes.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

On that point specifically: having worked inside one of these companies, I can say that they fear legal action under all sorts of laws, but not under the European Convention on Human Rights. As the Minister explained, it is for public bodies; if people are going to take a case on Article 10 grounds, they will be taking it against a public body. There are lots of other grounds on which to go after a private company, but not ECHR compliance.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

My Lords, I genuinely appreciate this debate. The noble Lord, Lord Clement-Jones, made what I thought was a very important point, which is that, in going through the weeds of the Bill—and some people have been involved in it for many years, looking at the detail—it can be easy to forget the free speech point. It is important that it has been raised, but it also constantly needs to be raised. That is the point: it is, as the noble Lord, Lord Allan of Hallam, admitted, a speech-restricting Bill where we are working out the balance.

I apologise to the noble and learned Lord, Lord Hope of Craighead, for not acknowledging that he has constantly emphasised the distinction between free speech and free expression. He and I will not agree on this; it is that we do not have time for this argument now rather than me not understanding. But he has been diligent in his persistence in trying to at least raise the issues, and that is important.

I was a bit surprised by the Minister’s response because, for the first time ever, since I have been here, there has been some enthusiasm across the House for one of my amendments—it really is unprecedented—Amendment 162 on the public order offences. I thought that the Minister might have noted that, because he has noted it every other time there has been a consensus across the House. I think he ought to look again at Amendment 162.

To indicate the muddle one gets into in terms of public order offences and illegality, the police force in Cheshire, where I am from, has put out a film online today saying that misgendering is a crime. That is the police who have said that. It is not a crime, and the point about these things—the difficulty we are concerned with—is that people are being asked to remove and censor material, based on illegality or public order offences, that they should not be removing. That is my concern: censorship.

To conclude, I absolutely agree with the noble Lord, Lord Allan of Hallam, that of course free speech does not mean saying whatever you want wherever you want. That is not free speech, and I am a free speech absolutist. Even subreddits—if people know what they are—think they are policing each other’s speech. There are norms that are set in place. That is fine with me—that multitude.

My concern is that a state body such as Ofcom is going to set norms of what is acceptable free speech that are lower than free speech laws, by demanding, on pain of breach of the law, with fines and so on, that these private companies impose their own terms of service. That can then set a norm—leading them to be risk-averse—for levels of acceptable speech, which is very dangerous. For example, when you go into work, you cannot just say anything, but there are people such as Maya Forstater, who said something at work, was disciplined, lost her job and has just won more than £100,000, because she was expressing her views and opinions. The Equality Act ran to her aid and she has now won and been shown to be right. You cannot do that if your words have disappeared and been censored.

I could talk about this for a long time, as noble Lords know. I hope that at least, as the Bill progresses, even when it becomes an Act, the Government could just stamp on its head, “Don’t forget free speech”—but before then, as we end this process, they could come back with some concessions to some of the amendments that have been raised here today. That would be more than just words. I beg leave to withdraw the amendment.

Online Safety Bill

Lord Allan of Hallam Excerpts
Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I thank the Minister and indeed the Secretary of State for bringing forward these amendments in the fulsome manner that they have. I appreciate it, but I know that Bereaved Families for Online Safety also appreciates it. The Government committed to bringing forward these amendments on the last day in Committee, so they have been pre-emptively welcomed and discussed at some length. One need only read through Hansard of 22 June to understand the strength of feeling about the pain that has been caused to families and the urgent need to prevent others experiencing the horror faced by families already dealing with the loss of their child.

I will speak briefly on three matters only. First, I must once again thank bereaved families and colleagues in this House and in the other place for their tireless work in pressing this issue. This is one of those issues that does not allow for celebration. As I walked from the Chamber on 22 June, I asked one of the parents how they felt. They said: “It is too late for me”. It was not said in bitterness but in acknowledgement of their profound hurt and the failure of companies voluntarily to do what is obvious, moral and humane. I ask the Government to see the sense in the other amendments that noble Lords brought forward on Report to make children safer, and to offer the same pragmatic, thoughtful solution to those as they have done on this group of amendments. It would make a huge difference.

Secondly, I need to highlight just one gap; I have written to the Secretary of State and the Minister on this. I find it disappointing that the Government did not find a way to require senior management to attend an inquest to give evidence. Given that the Government have agreed that senior managers should be subject to criminal liability under some circumstances, I do not understand their objections to summoning them to co-operate with legal proceedings. If a company submits information in response to Ofcom and at the coroner’s request the company’s senior management is invited to attend the inquest, it makes sense that someone should be required to appear to answer and follow up those questions. Again, on behalf of the bereaved families and specifically their legal representatives, who are very clear on the importance of this part of the regime, I ask the Government to reconsider this point and ask the Minister to undertake to speak to the department and the MoJ, if necessary, to make sure that, if senior managers are asked to attend court, they are mandated to do so.

Thirdly, I will touch on the additional commitments the Minister made beyond the Bill, the first of which is the upcoming Data Protection and Digital Information Bill. I am glad to report that some of the officials working on the Bill have already reached out, so I am grateful to the Minister that this is in train, but I expect it to include guidance for companies that will, at a minimum, cover data preservation orders and guidance about the privacy of other users in cases where a child has died. I think that privacy for other users is central to this being a good outcome for everybody, and I hope we are able to include that.

I am pleased to hear about the undertaking with the US regarding potential barriers, and I believe—and I would love to hear from the Minister—that the objective is to make a bilateral agreement that would allow data to be shared between the two countries in the case of a child’s death. It is a very specific requirement, not a wide-ranging one. I believe that, if we can do it on a bilateral basis, it would be easier than a broad attempt to change the data storage Act.

I turn finally to training for coroners. I was delighted that the Chief Coroner made a commitment to consider issuing non-legislative guidance and training on social media for coroners and the offer of consultation with experts, including Ofcom, the ICO and bereaved families and their representatives, but this commitment was made subject to funding. I ask the Minister to agree to discuss routes to funding from the levy via Ofcom’s digital literacy duty. I have proposed an amendment to the government amendment that would make that happen, but I would welcome the opportunity to discuss it with the Minister. Coroners must feel confident in their understanding of the digital world, and I am concerned that giving this new route to regulated companies via Ofcom without giving them training on how to use it may create a spectre of failure or further frustration and distress for bereaved families. I know there is not a person in the House who would want that to be the outcome of these welcome government amendments.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I also welcome this group of amendments. I remember a debate led by the noble Baroness, Lady Kidron, some time ago in the Moses Room, where we discussed this, and I said at the time I thought it would get fixed in the Online Safety Bill. I said that in a spirit of hope, not knowing any of the detail, and it is really satisfying to see the detail here today. As she said, it is testimony to the families, many of whom got in touch with me at that time, who have persisted in working to find a solution for other families—as the noble Baroness said, it is too late for them, but it will make a real difference to other families—and it is so impressive that, at a time of extreme grief and justifiable anger, people have been able to channel that into seeking these improvements.

The key in the amendments, which will make that difference, is that there will be a legal order to which the platforms know they have to respond. The mechanism that has been selected—the information notice—is excellent because it will become well known to every one of the 25,000 or so platforms that operate in the United Kingdom. When they get an information notice from Ofcom, that is not something that they will have discretion over; they will need to comply with it. That will make a huge difference.

Online Safety Bill

Lord Allan of Hallam Excerpts
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - - - Excerpts

My Lords, the noble Baroness, Lady Kidron, has unfortunately been briefly detained. If you are surprised to see me standing up, it is because I am picking up for her. I start by welcoming these amendments. I am grateful for the reaction to the thought-provoking debate that we had in Committee. I would like to ask a couple of questions just to probe the impact around the edges.

Amendment 27 looks as if it implies that purely content-generating machine-learning or AI bots could be excluded from the scope of the Bill, rather than included, which is the opposite of what we were hoping to achieve. That may be us failing to understand the detail of this large body of different amendments, but I would welcome my noble friend the Minister’s response to make sure that in Amendment 27 we are not excluding harm that could be generated by some form of AI or machine-learning instrument.

Maybe I can give my noble friend the Minister an example of what we are worried about. This is a recent scenario that noble Lords may have seen in the news, of a 15 year-old who asked, “How do I have sex with a 30 year-old?”. The answer was given in forensic detail, with no reference to the fact that it would in fact be statutory rape. Would the regulated service, or the owner of the regulated service that generated that answer, be included or excluded as a result of Amendment 27? That may be my misunderstanding.

This group is on AI-generated pornography. My friend, the noble Baroness, Lady Kidron, and I are both very concerned that it is not just about pornography, and that we should make sure that AI is included in the Bill. Specifically, many of us with teenage children will now be learning how to navigate the Snap AI bot. Would harm generated by that bot be captured in these amendments, or is it only content that is entirely pornographic? I hope that my noble friend the Minister can clarify both those points, then we will be able to support all these amendments.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I rise briefly to welcome the fact that there is a series of amendments here where “bot” is replaced by

“bot or other automated tool”.

I point out that there is often a lot of confusion about what a bot is or is not. It is something that was largely coined in the context of a particular service—Twitter—where we understand that there are Twitter bots: accounts that have been created to pump out lots of tweets. In other contexts, on other services, there is similar behaviour but the mechanism is different. It seems to me that the word “bot” may turn out to be one of those things that was common and popular at the end of the 2010s and in the early 2020s, but in five years we will not be using it at all. It will have served its time, it will have expired and we will be using other language to describe what it is that we want to capture: a human being has created some kind of automated tool that will be very context dependent, depending on the nature of the service, and they are pumping out material. It is very clear that we want to make sure that such behaviour is in scope and that the person cannot hide behind the fact that it was an automated tool, because we are interested in the mens rea of the person sitting behind the tool.

I recognise that the Government have been very wise in making sure that whenever we refer to a bot we are adding that “automated tool” language, which will make the Bill inherently much more future-proof.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I just want to elucidate whether the Minister has any kind of brief on my Amendment 152A. I suspect that he does not; it is not even grouped—it is so recent that it is actually not on today’s groupings list. However, just so people know what will be coming down the track, I thought it would be a good idea at this stage to say that it is very much about exactly the question that the noble Baroness, Lady Harding, was asking. It is about the interaction between a provider environment and a user, with the provider environment being an automated bot—or “tool”, as my noble friend may prefer.

It seems to me that we have an issue here. I absolutely understand what the Minister has done, and I very much support Amendment 153, which makes it clear that user-generated content can include bots. But this is not so much about a human user using a bot or instigating a bot; it is much more about a human user encountering content that is generated in an automated way by a provider, and then the user interacting with that in a metaverse-type environment. Clearly, the Government are apprised of that with regard to Part 5, but there could be a problem as regards Part 3. This is an environment that the provider creates, but it is interacted with by a user as if that environment were another user.

I shall not elaborate or make the speech that I was going to make, because that would be unfair to the Minister, who needs to get his own speaking note on this matter. But I give him due warning that I am going to degroup and raise this later.

--- Later in debate ---
Moved by
28: Schedule 1, page 185, line 23, at end insert—
“Public information services
5A A user-to-user service is exempt if its primary purpose is the creation of public information resources and it has the following characteristics—
(a) user-to-user functions are limited to those necessary for the creation and maintenance of a public information resource,
(b) OFCOM has determined that there is minimal risk of users sharing harmful content on the service, and
(c) it is non-commercial.”
Member’s explanatory statement
This amendment would allow OFCOM to exempt services like Wikipedia from regulation where it deems them to be low risk.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

My Lords, as we enter the final stages of consideration of this Bill, it is a good time to focus a little more on what is likely to happen once it becomes law, and my Amendment 28 is very much in that context. We now have a very good idea of what the full set of obligations that in-scope services will have to comply with will look like, even if the detailed guidance is still to come.

With this amendment I want to return to the really important question that I do not believe we answered satisfactorily when we debated it in Committee. That is that there is a material risk that, without further amendment or clarification, Wikipedia and other similar services may feel that they can no longer operate in the United Kingdom.

Wikipedia has already featured prominently in our debates, but there are other major services that might find themselves in a similar position. As I was discussing the definitions in the Bill with my children yesterday—this may seem an unusual dinner conversation with teenagers, but I find mine to be a very useful sounding board—they flagged that OpenStreetMap, to which we all contribute, also seems to be in the scope of how we have defined user-to-user services. I shall start by asking some specific questions so that the Minister has time to find the answers in his briefing or have them magically delivered to him before summing up: I shall ask the questions and then go on to make the argument.

First, is it the Government’s view that Wikipedia and OpenStreetMap fall within the definition of user-to-user services as defined in Clause 2 and the content definition in Clause 211? We need to put all these pieces together to understand the scope. I have chosen these services because each is used by millions of people in the UK and their functionality is very well known, so I trust that the Government had them in mind when they were drafting the legislation, as well as the more obvious services such as Instagram, Facebook et cetera.

Secondly, can the Minister confirm whether any of the existing exemptions in the Bill would apply to Wikipedia and OpenStreetMap such that they would not have to comply with the obligations of a category 1 or 2B user-to-user service?

Thirdly, does the Minister believe that the Bill as drafted allows Ofcom to use its discretion in any other way to exempt Wikipedia and OpenStreetMap, for example through the categorisation regulations in Schedule 11? As a spoiler alert, I expect the answers to be “Yes”, “No” and “Maybe”, but it is really important that we have the definitive government response on the record. My amendment would seek to turn that to “Yes”, “Yes” and therefore the third would be unnecessary because we would have created an exemption.

The reason we need to do this is not in any way to detract from the regulation or undermine its intent but to avoid facing the loss of important services at some future date because of situations we could have avoided. This is not hyperbole or a threat on the part of the services; it is a natural consequence if we impose legal requirements on a responsible organisation that wants to comply with the law but knows it cannot meet them. I know it is not an intended outcome of the Bill that we should drive these services out, but it is certainly one intended outcome that we want other services that cannot meet their duties of care to exit the UK market rather than continue to operate here in defiance of the law and the regulator.

We should remind ourselves that at some point, likely to be towards the end of 2024, letters will start to arrive on the virtual doormats of all the services we have defined as being in scope—these 25,000 services—and their senior management will have a choice. I fully expect that the Metas, the Googles and all such providers will say, “Fine, we will comply. Ofcom has told us what we need to do, and we will do it”. There will be another bunch of services that will say, “Ofcom, who are they? I don’t care”, and the letter will go in the bin. We have a whole series of measures in the Bill by which we will start to make life difficult for them: we will disrupt their businesses and seek to prosecute them and we will shut them out of the market.

However, there is a third category, which is the one I am worried about in this amendment, who will say, “We want to comply, we are responsible, but as senior managers of this organisation”, or as directors of a non-profit foundation, “we cannot accept the risk of non-compliance and we do not have the resources to comply. There is no way that we can build an appeals mechanism, user reporting functions and all these things we never thought we would need to have”. If you are Wikipedia or OpenStreetMap, you do not need to have that infrastructure, yet as I read the Bill, if they are in scope and there is no exemption, then they are going to be required to build all that additional infrastructure.

The Bill already recognises that there are certain classes of services where it would be inappropriate to apply this new regulatory regime, and it describes these in Schedule 1, which I am seeking to amend. My amendment just seeks to add a further class of exempted service and it does this quite carefully so that we would exclude only services that I believe most of us in this House would agree should not be in scope. There are three tests that would be applied.

The first is a limited functionality test—we already have something similar in Schedule 1—so that the user-to-user functions are only those that relate to the production of what I would call a public information resource. In other words, users engage with one another to debate a Wikipedia entry or a particular entry on a map on OpenStreetMap. So, there is limited user-to-user functionality all about this public interest resource. They are not user-to-user services in the classic sense of social media; they are a particular kind of collective endeavour. These are much closer to newspaper publishers, which we have explicitly excluded from the Bill. It is much more like a newspaper; it just happens to be created by users collectively, out of good will, rather than by paid professional journalists. They are very close to that definition, but if you read Schedule 1, I do not think the definition of “provider content” in paragraph 4(2) includes at the moment these collective-user endeavours, so they do not currently have the exemption.

I have also proposed that Ofcom would carry out a harm test to avoid the situation where someone argues that their services are a public information resource, while in practice using it to distribute harmful material. That would be a rare case, but noble Lords can conceive of it happening. Ofcom would have the ability to say that it recognises that Wikipedia does not carry harmful content in any meaningful way, but it would also have the right not to grant the exemption to service B that says it is a new Wikipedia but carries harmful content.

Thirdly, I have suggested that this is limited to non-commercial services. There is an argument for saying any public information resource should benefit, and that may be more in line with the amendment proposed by the noble Lord, Lord Moylan, where it is defined in terms of being encyclopaedic or the nature of the service. I recognise that I have put in “non-commercial” as belt and braces because there is a rationale for saying that, while we do not really want an encyclopaedic resource to be in the 2B service if it has got user-to-user functions, if it is commercial, we could reasonably expect it to find some way to comply. It is different when it is entirely non-commercial and volunteer-led, not least because the Wikimedia Foundation, for example, would struggle to justify spending the money that it has collected from donors on compliance costs with the UK regime, whereas a commercial company could increase its resources from commercial customers to do that.

I hope this is a helpful start to a debate in which we will also consider Amendment 29, which has similar goals. I will close by asking the Minister some additional questions. I have asked him some very specific ones to which I hope he can provide answers, but first I ask: does he acknowledge the genuine risk that services like Wikipedia and OpenStreetMap could find themselves in a position where they have obligations under the Bill that they simply cannot comply with? It is not that they are unwilling, but there is no way for them to do all this structurally.

Secondly, I hope the Minister would agree that it is not in the public interest for Ofcom to spend significant time and effort on the oversight of services like these; rather, it should spend its time and effort on services, such as social media services, that we believe to be creating harms and are the central focus of the Bill.

Thirdly, will the Minister accept that there is something very uncomfortable about a government regulator interfering with the running of a neutral public resource like Wikipedia, when there is so much benefit from it and little or no demonstrable harm? It is much closer to the model that exists for a newspaper. We have debated endlessly in this House—and I am sure we will come back to it—that there is, rightly, considerable reluctance to have regulators going too far and creating this relationship with neutral public information goods. Wikipedia falls into that category, as does OpenStreetMap and others, and there would be fundamental in-principle challenges around that.

I hope the Government will agree that we should be taking steps to make sure we are not inadvertently creating a situation where, in one or two years’ time, Ofcom will come back to us saying that it wrote to Wikipedia, because the law told it to do so, and told Wikipedia all the things that it had to do; Wikipedia took it to its senior management and then came back saying that it is shutting shop in the UK. Because it is sensible, Ofcom would come back and say that it did not want that and ask to change the law to give it the power to grant an exemption. If such things deserve an exemption, let us make it clear they should have it now, rather than lead ourselves down this path where we end up effectively creating churn and uncertainty around what is an extraordinarily valuable public resource. I beg to move.

Lord Moylan Portrait Lord Moylan (Con)
- View Speech - Hansard - - - Excerpts

My Lords, Amendments 29 and 30 stand in my name. I fully appreciated, as I prepared my thoughts ahead of this short speech, that a large part of what I was going to say might be rendered redundant by the noble Lord, Lord Allan of Hallam. I have not had a discussion with him about this group at all, but it is clear that his amendment is rather different from mine. Although it addresses the same problem, we are coming at it slightly differently. I actually support his amendment, and if the Government were to adopt it I think the situation would be greatly improved. I do prefer my own, and I think he put his finger on why to some extent: mine is a little broader. His relates specifically to public information, whereas mine relates more to what can be described as the public good. So mine can be broader than information services, and I have not limited it to non-commercial operations, although I fully appreciate that quite a lot of the services we are discussing are, in practice, non-commercial. As I say, if his amendment were to pass, I would be relatively satisfied, but I have a moderate preference for my own.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I am grateful to noble Lords for their contributions during this debate. I am sympathetic to arguments that we must avoid imposing disproportionate burdens on regulated services, and particularly that the Bill should not inhibit services from providing valuable information which is of benefit to the public. However, I want to be clear that that is why the Bill has been designed in the way that it has. It has a broad scope in order to capture a range of services, but it has exemptions and categorisations built into it. The alternative would be a narrow scope, which would be more likely inadvertently to exempt risky sites or to displace harm on to services which we would find are out of scope of the Bill. I will disappoint noble Lords by saying that I cannot accept their amendments in this group but will seek to address the concerns that they have raised through them.

The noble Lord, Lord Allan, asked me helpfully at the outset three questions, to which the answers are yes, no and maybe. Yes, Wikipedia and OpenStreetMap will be in scope of the Bill because they allow users to interact online; no, we do not believe that they would fall under any of the current exemptions in the Bill; and the maybe is that Ofcom does not have the discretion to exempt services but the Secretary of State can create additional exemptions for further categories of services if she sees fit.

I must also say maybe to my noble friend Lord Moylan on his point about Wikipedia—and with good reason. Wikipedia, as I have just explained, is in scope of the Bill and is not subject to any of its exemptions. I cannot say how it will be categorised, because that is based on an assessment made by the independent regulator, but I reassure my noble friend that it is not the regulator but the Secretary of State who will set the categorisation thresholds through secondary legislation; that is to say, a member of the democratically elected Government, accountable to Parliament, through legislation laid before that Parliament. It will then be for Ofcom to designate services based on whether or not they meet those thresholds.

It would be wrong—indeed, nigh on impossible—for me to second-guess that designation process from the Dispatch Box. In many cases it is inherently a complex and nuanced matter since, as my noble friend Lady Harding said, many services change over time. We want to keep the Bill’s provisions flexible as services change what they do and new services are invented.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I would just like to finish my thought on Wikipedia. Noble Lords are right to mention it and to highlight the great work that it does. My honourable friend the Minister for Technology and the Digital Economy, Paul Scully, met Wikipedia yesterday to discuss its concerns about the Bill. He explained that the requirements for platforms in this legislation will be proportionate to the risk of harm, and that as such we do not expect the requirements for Wikipedia to be unduly burdensome.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

I am computing the various pieces of information that have just been given, and I hope the Minister can clarify whether I have understood them correctly. These services will be in scope as user-to-user services and do not have an exemption, as he said. The Secretary of State will write a piece of secondary legislation that will say, “This will make you a category 1 service”—or a category 2 or 2B service—but, within that, there could be text that has the effect that Wikipedia is in none of those categories. So it and services like it could be entirely exempt from the framework by virtue of that secondary legislation. Is that a correct interpretation of what he said?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

The Secretary of State could create further exemptions but would have to bring those before Parliament for it to scrutinise. That is why there is a “maybe” in answer to his third question in relation to any service. It is important for the legislation to be future-proofed that the Secretary of State has the power to bring further categorisations before Parliament for it to discuss and scrutinise.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

My Lords, I will keep pressing this point because it is quite important, particularly in the context of the point made by the noble Baroness, Lady Kidron, about categorisation, which we will debate later. There is a big difference when it comes to Schedule 11, which defines the categorisation scheme: whether in the normal run of business we might create an exemption in the categorisation secondary legislation, or whether it would be the Secretary of State coming back with one of those exceptional powers that the Minister knows we do not like. He could almost be making a case for why the Secretary of State has to have these exceptional powers. We would be much less comfortable with that than if the Schedule 11 categorisation piece effectively allowed another class to be created, rather than it being an exceptional Secretary of State power.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I will check what I said but I hope that I have set out why we have taken the approach that we have with the broad scope and the exemptions and categorisations that are contained in it. With that, I urge the noble Lord to withdraw his amendment.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

My Lords, that was a very useful debate. I appreciate the Minister’s response and his “yes, no, maybe” succinctness, but I think he has left us all more worried than when the debate started. My noble friend Lord Clement-Jones tied it together nicely. What we want is for the regulator to be focused on the greatest areas of citizen risk. If there are risks that are missing, or things that we will be asking the regulator to do that are a complete waste of time because they are low risk, then we have a problem. We highlighted both those areas. The noble Lord, Lord Russell, rightly highlighted that we are not content with just “content” as the primary focus of the legislation; it is about a lot more than content. In my amendment and those of the noble Lord, Lord Moylan, we are extremely worried—and remain so—that the Bill creates a framework that will trap Wikipedia and services like it, without that being our primary intention. We certainly will come back to this in later groups; I will not seek to press the amendment now, because there is a lot we all need to digest. However, at the end of this process, we want to get to a point where the regulator is focused on things that are high risk to the citizen and not wasting time on services that are very low risk. With that, I beg leave to withdraw my amendment.

Amendment 28 withdrawn.
--- Later in debate ---
Lord Bishop of Oxford Portrait The Lord Bishop of Oxford
- View Speech - Hansard - - - Excerpts

My Lords, I too welcome these amendments and thank the Minister and the Government for tabling them. The Bill will be significantly strengthened by Amendment 172 and the related amendments, which set out the harms so clearly in the Bill itself. I identify with the comments of others that we also need to look at functionality. I hope we will do that in the coming days.

I also support Amendment 174, to which I added my name. Others have covered proposed new subsection (9B) very well; I add my voice to those encouraging the Minister to give it more careful consideration. I will also speak briefly to proposed new subsection (9A), on misinformation and disinformation content. With respect to those who have spoken against it and argued that those are political terms, I argue that they are fundamentally ethical terms. For me, the principle of ethics in the online world is not the invention of new ethics but the finding of ways to acknowledge and support online the ethics we acknowledge in the offline world.

Truth is a fundamental ethic. Truth builds trust. It made it into the 10 commandments:

“You shall not bear false witness against your neighbour”.


It is that ethic that would be translated across in proposed new subsection (9A). One of the lenses through which I have viewed the Bill throughout is the lens of my eight grandchildren, the oldest of whom is eight years old and who is already using the internet. Proposed new subsection (9A) is important to him because, at eight years old, he has very limited ways of checking out what he reads online—fewer even than a teenager. He stands to be fundamentally misled in a variety of ways if there is no regulation of misinformation and disinformation.

Also, the internet, as we need to keep reminding ourselves in all these debates, is a source of great potential good and benefit, but only if children grow up able to trust what they read there. If they can trust the web’s content, they will be able to expand their horizons, see things from the perspective of others and delve into huge realms of knowledge that are otherwise inaccessible. But if children grow up necessarily imbued with cynicism about everything they read online, those benefits will not accrue to them.

Misinformation and disinformation content is therefore harmful to the potential of children across the United Kingdom and elsewhere. We need to guard against it in the Bill.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, Amendment 172 is exceptionally helpful in putting the priority harms for children on the face of the Bill. It is something that we have asked for; I know the pre-legislative scrutiny committee asked for it too, and it is good to see it there. I want to comment to make sure that we all have a shared understanding of what this means, and that people out there have one too.

My understanding is that “primary priority” is, in effect, a red light—platforms must not expose children to that content if they are under 18—while “priority” is rather an amber light: on further review, for some children it will be a red light and for other children it will be a green light, and they can see that stuff. I am commenting partly having had the experience of explaining all this to my domestic focus group of teenagers, who said, “Really? Are you going to get rid of all this stuff for us?” I said, “No, actually, it is quite different”. It is important in our debate to do that, because otherwise there is a risk that the Bill comes into disrepute. I look at something like depicting harm to fictional characters. If one has seen the “Twilight” movies, the werewolves do not come off too well, and “Lord of the Rings” is like an orc kill fest.

As regards the point made by the noble Baroness, Lady Harding, about going to the cinema, we allow older teenagers to go to the cinema and see that kind of thing. Post the Online Safety Bill, they will still be able to access it. When we look at something like harm to fictional characters, the Bill is there to deal with the harm that exists and is acknowledged: people pushing quite vile stuff, whereby characters have been taken out of fiction and a gory image has been created, twisted and pushed to a younger child. That is what we want online providers to do—to prevent an 11 year-old seeing that—not to stop a 16 year-old enjoying the slaughter of werewolves. We need to be clear that that is what we are doing with the priority harms; we are not going further than people think we are.

There are also some interesting challenges around humour and evolving trends. This area will be hard for platforms to deal with. I raised the issue of the Tide pod challenge in Committee. If noble Lords are not familiar, it is the idea that one eats the tablets, the detergent things, that one puts into washing machines. It happened some time ago. It was a real harm, and that is reflected here in the “do not ingest” provisions. That makes sense but, again talking to my focus group, the Tide pod challenge has evolved and for older teenagers it is now a joke about someone being stupid. It has become a meme. One could genuinely say that it is not the harmful thing it was. Quite often one sees something on the internet that starts out harmful—because kids are eating Tide pods and getting sick—and then over time it becomes a humorous meme. At that point, it has ceased to be harmful. I read it as that filter always being applied. We are not saying, “Always remove every reference to Tide pods” but “At a time when there is evidence that it is causing harm, remove it”. If at a later stage it ceases to be harmful, it may well move into a category where platforms can permit it. It is a genuine concern.

To our freedom of expression colleagues, I say that we do not want mainstream platforms to be so repressive of ordinary banter by teenagers that they leave those regulated mainstream platforms because they cannot speak any more, even when the speech is not harmful, and go somewhere else that is unregulated—one of those platforms that took Ofcom’s letter, screwed it up and threw it in the bin. We do not want that to be an effect of the Bill. Implementation has to be very sensitive to common trends and, importantly, as I know the noble Baroness, Lady Kidron, agrees, has to treat 15, 16 and 17 year-olds very differently from 10, 11 or 12 year-olds. That will be hard.

The other area that jumped out was about encouraging harm through challenges and stunts. That immediately brought “Jackass” to mind, or the Welsh version, “Dirty Sanchez”, which I am sure is a show that everyone in the House watched avidly. It is available on TV. Talking about equality, one can go online and watch it. It is people doing ridiculous, dangerous things, is enjoyed by teenagers and is legal and acceptable. My working assumption has to be that we are expecting platforms to distinguish a new dangerous stunt such as the choking game—such things really exist—from a ridiculous “Jackass” or “Dirty Sanchez” stunt, which has existed for years and is accessible elsewhere.

The point that I am making in the round is that it is great to have these priority harms in the Bill, but it is going to be very difficult to implement them in a meaningful way whereby we are catching the genuinely harmful stuff but not overrestricting. But that is the task that we have set Ofcom and the platforms. The more that we can make it clear to people out there what we are expecting to happen, the better. We are not expecting a blanket ban on all ridiculous teenage humour or activity. We are expecting a nuanced response. That is really helpful as we go through the debate.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

I just have a question for the noble Lord. He has given an excellent exposé of the other things that I was worried about but, even when he talks about listing the harms, I wonder how helpful it is. Like him, I read them out to a focus group. Is it helpful to write these things, for example emojis, down? Will that not encourage the platforms to over-panic? That is my concern.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

On the noble Baroness’s point, that is why I intervened in the debate: so that we are all clear. We are saying that, for priority content, it is an amber light, not a red light. We are not saying, “Just remove all this stuff”; it would be a wrong response to the Bill to say, “It’s a fictional character being slaughtered so remove it”, because now we have removed “Twilight”, “Watership Down” and whatever else. We are saying, “Think very carefully”. If it is one of those circumstances where this is causing harm—they exist; we cannot pretend that they do not—it should be removed. However, the default should not be to remove everything on this list; that is the point I am really trying to make.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - - - Excerpts

My Lords, our debate on this group is on the topic of priority harms to children. It is not one that I have engaged in, so I tread carefully. One reason I have not engaged in this debate is that I have left it to people who know far more about it than I do; I have concentrated on other parts of the Bill.

In the context of this debate, one thing has come up on which I feel moved to make a short contribution: misinformation and disinformation content. There was an exchange between my noble friend Lady Harding and the noble Baroness, Lady Fox, on this issue. Because I have not engaged on the topic of priority harms, I genuinely do not have a position on what should and should not be featured. I would not want anybody to take what I say as support for or opposition to any of these amendments. However, it is important for us to acknowledge that, as much as misinformation and disinformation are critical issues—particularly for children and young people because, as the right reverend Prelate said, the truth matters—we cannot, in my view, ignore the fact that misinformation and disinformation have become quite political concepts. They get used in a way where people often define things that they do not agree with as misinformation—that is, opinions are becoming categorised as misinformation.

We are now putting this in legislation and it is having an impact on content, so it is important, too, that we do not just dismiss that kind of concern as not relevant because it is real. That is all I wanted to say.

Online Safety Bill

Lord Allan of Hallam Excerpts
That, in essence, is the long and the short of it. I look forward to the Government coming forward in short order with some positive proposals about what they want to do, and how they propose to do it, to protect this group of people who have had their lives and their businesses damaged and who will continue to be at risk until Parliament does something about it. I beg to move.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I will speak to Amendment 5B in my name and that of my noble friend Lord Clement-Jones. I am reminded that this is a new stage of the Bill, so I should declare my interests. I have no current financial interests in the tech sector, but until 2019 I worked for one of the large technology companies that will be regulated, doing the kind of censorship job that the noble Lord, Lord Moylan, is concerned about. We clearly did not do it very well or we would not be here today replacing people like me with Ofcom.

Amendment 5B concerns an issue that we raised in Committee: the offence of encouragement of self-harm. That new offence was broadly welcomed, including on these Benches. We believe that there is scope, in some circumstances, to seek criminal prosecution of individuals who, online or otherwise, maliciously seek to encourage other people to harm themselves. The concern we raised in Committee, which we come back to today, is that we want the offence to be used in a way that we would all agree is sensible. We do not want people who are trying to help individuals at risk of self-harm to become concerned about and afraid of it, and to feel that they need to limit activities that would otherwise be positive and helpful.

In Committee we suggested that one way to do this would be to have a filter whereby the Director of Public Prosecutions looked at potential prosecutions under the new offence. With this amendment we take a different approach, which would in some senses be more effective: to list explicitly in the Bill the three categories of activity that would not render an individual liable to prosecution.

The first is people who provide an educational resource. We should be clear that some educational resources that are intended to help people recognise self-harm and turn away from it can contain quite explicit material. Those people are concerned that they might, in publishing that material with good intent, accidentally fall foul of the offence.

The second category is those who provide support—individuals providing peer support networks, such as an online forum where people discuss their experience of self-harm and seek to turn away from it. They should not be inadvertently caught up in the offence.

The third category is people posting information about their own experience of self-harm. Again, that could be people sharing quite graphic material about what they have been doing to themselves. I hope that there would be general agreement that we would not interpret, for example, a distressed teenager sharing material about their own self-harm, with the intent of seeking advice and support from others, as in some way encouraging or assisting others to commit self-harm themselves.

There is a genuine effort here to try to find a way through so that we can provide assurances to others. If the Minister cannot accept the amendment as it is, I hope he will reaffirm that the categories of people that I described are not the target of the offence and that he will be able to offer some kind of assurance as to how they can feel confident that they would not fall foul of prosecution.

Additionally, some of these groups feel with some conviction that their voices have not been as prominent in the debate as those of other organisations. The work they do is quite sensitive, and they are often quite small organisations. Between Report and the Bill becoming law, I hope that those responsible for the detailed work on prosecution guidance—again, specificity is all—will meet the people on the front line who are running those fora and engaging with the young people who seek help around self-harm, and look in detail at what they are doing. That would be extraordinarily helpful.

Those are my two asks. Ideally, the Government would accept the amendment that we have tabled, but if not I hope that they can give the assurance that the three groups I listed are not the target and that they will commit to having relevant officials meet with individuals working on the front line, so that we can make sure that we do not end up prosecuting individuals without intending to.

Baroness Burt of Solihull Portrait Baroness Burt of Solihull (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I support all the amendments in this group. However, what I have to say on my own amendments will take up enough time without straying on to the territory of others. I ask noble colleagues to please accept my support as read. I thank the Minister for meeting me and giving context and explanation regarding all the amendments standing in my name. I also welcome the government amendments on intimate image abuse in another group and on digitally altered images, which impinge directly on the cyberflashing amendments.

It is clear that the Government’s heart is in the right place, even if their acceptance of a consent-based law is not. I also thank the Law Commission for meeting me and explaining the thinking behind and practicalities of how the new law in relation to cyberflashing will work, and how the existing court system can help, such as juries deciding whether or not they believe the defendant. Last but definitely not least, I acknowledge the help that I have received from Professor Clare McGlynn, and Morgane Taylor from Bumble—both immensely knowledgeable and practical people who have inspired, informed and helped throughout.

I start with Amendments 5C and 7A in my name and that of the noble Baroness, Lady Finlay. I understand that the Government are following the advice of the Law Commission in refusing to accept a consent-based offence, but I point out gently that this is something that the Government choose, and sometimes choose not, to do. Although the Law Commission consulted widely, that consultation did not show support for its proposals from victims and victims’ organisations. I am still of the view that a consent-based requirement would have prevented many unsolicited images from being received by women and girls. I still worry that young girls may be socialised and sexualised by their peers who say that they are sending these images for a laugh. These girls do not have the maturity to say that they do not find it funny, but pretend it is okay while cringing with humiliation inside. Consent-based legislation would afford them the best protection and educate young girls and men that not only are women and girls frequently not interested in seeing a picture of a man’s willy, but girls think differently from boys about this. Who knew?

I also believe that a consent-based law would provide the most suitable foundation for education and prevention initiatives. However, I have listened to the Minister and the Law Commission. I have been told that, if it got to court, the complainant would not be humiliated all over again by having to give evidence in court and admit the distress and humiliation they felt. According to the Minister, as with the new intimate image amendment tabled by the Government themselves, it is up to the Crown Prosecution Service to follow it up and, after making their statement of complaint, my understanding is that the complainant does not have to take part further—more of that later. However, given the current success rate of only 4% even in charging alleged perpetrators in intimate image abuse cases, I worry that not only will victims continue to be reluctant to come forward but the chances of prosecution will be so slim that the offence will not act as a deterrent. We know from experience of the offence of sharing sexual images without consent that motivation thresholds have limited police investigations and prosecutions because of the evidential challenges. That is why the Law Commission has recommended the introduction of a consent-based image offence.

--- Later in debate ---
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My comments will be rather shorter. I want to make a detailed comment about Amendment 5B, which I strongly support and which is in the name of the noble Lord, Lord Allan. It refers to,

“a genuine medical, scientific or educational purpose, … the purposes of peer support”

I would urge him to put “genuine peer support”. That is very important because there is a lot of dog whistling that goes on in this area. So if the noble Lord—

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

My working assumption would be that that would be contestable. If somebody claimed the peer support defence and it was not genuine, that would lead to them becoming liable. So I entirely agree with the noble Baroness. It is a very helpful suggestion.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I also want to support the noble Baroness, Lady Kennedy. The level of abuse directed at women online, and the gendered nature of it, has been minimised; the perpetrators have clearly felt immune to the consequences of law enforcement. What worries me a little in this discussion is the idea, or conflation, that anything said to a woman is an act of violence. I believe that the noble Baroness was being very specific about the sorts of language that could be caught under her suggestions. I understand from what she said that she has been having conversations with the Minister. I very much hope that something is done in this area, and that it is explored more fully, as the noble Baroness, Lady Morgan, said, in the guidance. However, I just want to make the point that online abuse is also gamified: people make arrangements to abuse people in groups in particular ways that are not direct. If they threaten violence, that is quite different to a pile-in saying that you are not a marvellous human being.

--- Later in debate ---
We expect these tight parameters and the usual prosecutorial discretion to provide sufficient safeguards against inappropriate prosecutions. The defence of necessity may also serve to ensure that actions undertaken in extraordinary circumstances to mitigate more serious harm should not be criminal. The offence of encouraging or assisting suicide has not led to the prosecution of vulnerable people who talk about suicidal feelings online or those who offer them support, and there is no reason to suppose that this offence will criminalise those whom this amendment seeks to protect. However, the noble Lords raise an important issue and I assure them that we will keep the operation of the offence under review. The Government have committed to expanding it to cover all ways of encouraging or assisting self-harm so there will be an opportunity to revisit it in due course.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

I appreciate the Minister’s response. Could he also respond to my suggestion that it would be helpful for some of the people working on the front line to meet officials to go through their concerns in more detail?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

I am very happy to make that commitment. It would be useful to have their continued engagement, as we have had throughout the drafting of the Bill.

The noble Baroness, Lady Burt of Solihull, has tabled a number of amendments related to the new offence of cyberflashing. I will start with her Amendment 6. We believe that this amendment reduces the threshold of the new offence to too great an extent. It could, for example, criminalise a person sending a picture of naked performance art to a group of people, where one person might be alarmed by the image but the sender sends it anyway because he or she believes that it would be well received. That may be incorrect, unwise and insensitive, but we do not think it should carry the risk of being convicted of a serious sexual offence.

Crucially, the noble Baroness’s amendment requires that the harm against the victim be proven in court. Not only does this add an extra step for the prosecution to prove in order for the perpetrator to be convicted, but it also creates an undue burden on the victim, who would be cross-examined about his or her—usually her—experience of harm. For example, she might have to explain why she felt humiliated; this in itself could be retraumatising and humiliating for the victim. By contrast, Clause 170 as drafted means that the prosecution has only to prove and focus on the perpetrator’s intent.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

My Lords, I am grateful for the opportunity to continue some of the themes we touched on in the last group and the debate we have had throughout the passage of the Bill on the importance of tackling intimate image abuse. I shall introduce the government amendments in this group that will make a real difference to victims of this abhorrent behaviour.

Before starting, I take the opportunity again to thank the Law Commission for the work it has done in its review of the criminal law relating to the non-consensual taking, making and sharing of intimate images. I also thank my right honourable friend Dame Maria Miller, who has long campaigned for and championed the victims of online abuse. Her sterling efforts have contributed greatly to the Government’s approach and to the formulation of policy in this sensitive area, as well as to the reform of criminal law.

As we announced last November, we intend to bring forward a more expansive package of measures based on the Law Commission’s recommendations as soon as parliamentary time allows, but the Government agree with the need to take swift action. That is why we are bringing forward these amendments now, to deliver on the recommendations which fall within the scope of the Bill, thereby ensuring justice for victims sooner.

These amendments repeal the offence of disclosing private sexual photographs and films with intent to cause distress and replace it with four new sexual offences in the Sexual Offences Act 2003. The first is a base offence of sharing an intimate photograph or film without consent or reasonable belief in consent. This recognises that the sharing of such images, whatever the intent of the perpetrator, should be considered a criminal violation of the victim’s bodily autonomy.

The amendments create two more serious offences of sharing an intimate photograph or film without consent with intent to cause alarm, distress or humiliation, or for the purpose of obtaining sexual gratification. Offenders committing the latter offence may also be subject to notification requirements, commonly referred to as being on the sex offenders register. The amendments also create an offence of threatening to share an intimate image. These new sharing offences follow the Law Commission’s recommended approach, with the definition of intimate photographs or films covering images which show or appear to show a person nude or partially nude, or which depict sexual or toileting activity. This will protect more victims than the current Section 33 offence, which covers only images of a private and sexual nature.

Finally, these clauses will, for the first time, make it a criminal offence to share a manufactured or so-called deepfake image of another person without his or her consent. This form of intimate image abuse is becoming more prevalent, and we want to send a clear message that it will not be tolerated.

By virtue of placing these offences in the Sexual Offences Act 2003, we are also extending to them the current special measures, so that victims can benefit from those in court, and the anonymity provisions, which are so important when something so intimate has been shared without consent. This is only the first stage in our reform of the law in this area. We are committed to introducing additional changes, giving effect to further recommendations of the Law Commission’s report which are beyond the scope of the Bill, when parliamentary time allows.

I hope that noble Lords from across your Lordships’ House will agree that these amendments represent an important step forward in tackling intimate image abuse and protecting victims. I commend them to the House, and I beg to move.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

My Lords, I welcome these new offences. From my professional experience, I know that what came to be known as “sextortion” created some of the most distressing cases you could experience, where an individual would obtain intimate images, often by deception, and then use them to make threats. This is where a social network is particularly challenging; it enables people to access a network of all the family and friends of an individual whose photo they now hold and to threaten to distribute it to their nearest and dearest. This affects men and women; many of the victims were men who were honey-potted into sharing intimate images and in the worst cases it led to suicide. It was not uncommon that people would feel that there was no way out; the threat was so severe that they would take their own lives. It is extremely welcome that we are doing something about it, and making it more obvious to anyone who is thinking about committing this kind of offence that they run the risk of criminal prosecution.

I have a few specific questions. The first is on the definitions in proposed new Section 66D, inserted by government Amendment 8, where the Government are trying to define what “intimate” or “nudity” represents. This takes me back again to my professional experience of going through slide decks and trying to decide what was on the right or wrong side of a nudity policy line. I will not go into the detail of everything it said, not least because I keep noticing younger people in the audience here, but I will leave you with the thought that you ended up looking at images that involved typically fishnets, in the case of women, and socks, in the case of men—I will leave the rest to your Lordships’ imaginations to determine at what point someone has gone from being clothed to nude. I can see in this amendment that the courts are going to have to deal with the same issues.

The serious point is that, where platform policies and these definitions align on what we do not want to be distributed, that is extremely helpful, because it then means that if someone does try to put an intimate image out across one of the major platforms, the platform does not have to ask whether there was consent. It can just say that the image is in breach of its policy and take it down. That actually has quite a beneficial effect in slowing transmission.

The other point that comes out of that is that some of these questions of intimacy are quite culturally subjective. In some cultures, even a swimsuit photo could be used to cause humiliation and distress. I know this is extremely difficult; we do not want to be overly censorious but, at the same time, we do not want to leave people exposed to threats, and if you come from a culture where a swimsuit photo would be a threat, the definitions may not work for you. So I hope that, as we go through this, there will be a continued dialogue between experts in the platforms who have to deal with these questions and people working on the criminal offence side. To the extent that we can achieve it, there should be alignment and the message should go out that if you are thinking of distributing an image like this, you run the risk of being censored by the platforms but also of running into a criminal prosecution. That is on the mechanics of making it work.

Online Safety Bill

Lord Allan of Hallam Excerpts
Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I will speak very briefly to Amendments 55 and 182. We are now at the stage of taking the lead entirely from the Minister and the noble Lords opposite—the noble Lords, Lord Stevenson and Lord Clement-Jones—and accepting these amendments, because we now need to see how this will work in practice. That is why we all think that we will be back here talking about these issues in the not too distant future.

My noble friend the Minister rightly said that, as we debated in Committee, the Government made a choice in taking out “legal but harmful”. Many of us disagree with that, but that is the choice that has been made. So I welcome the changes that have been made by the Government in these amendments to at least allow there to be more empowerment of users, particularly in relation to the most harmful content and, as we debated, in relation to adult users who are more vulnerable.

It is worth reminding the House that we heard very powerful testimony during the previous stage from noble Lords with personal experience of family members who struggle with eating disorders, and how difficult these people would find it to self-regulate the content they were looking at.

In Committee, I proposed an amendment about “toggle on”. Anyone listening to this debate outside who does not know what we are talking about will think we have gone mad, talking about toggle on and toggle off, but I proposed an amendment for toggle on by default. Again, I take the Government’s point, and I know my noble friend has put a lot of work into this, with Ministers and others, in trying to come up with a sensible compromise.

I draw attention to Amendment 55. I wonder whether my noble friend the Minister is able to say anything about whether users will be able to have specific empowerment in relation to specific types of content, where they are perhaps more vulnerable if they see it. For example, the needs of a user might be quite different between content relating to self-harm and content relating to eating disorders or other types of content that we would deem harmful.

On Amendment 182, my noble friend leapt immediately to abusive content coming from unverified users but, as we have heard, and as I know, having led the House’s inquiry into fraud and digital fraud last year, there will be, and already is, a prevalence of scams. The Bill is cracking down on fraudulent advertisements but, as an anti-fraud measure, being able to see whether an account has been verified would be extremely useful. The view now is that, if this Bill is successful—and we hope it is—in cracking down on fraudulent advertising, there will be even more reliance on what is called organic reach, often through the use of fake accounts, which is where verification becomes more important. We have heard from opinion polling that the public want to see which accounts are or are not verified. We have also heard that Amendment 182 is about giving users the choice of making clear whether their accounts are verified; it is not about compelling people to say whether they are verified or not.

As we have heard, this is a direction of travel. I understand that the Government will not want to accept these amendments at this stage, but it is useful to have this debate to see where we are going and what Ofcom will be looking at in relation to these matters. I look forward to hearing what my noble friend the Minister has to say about these amendments.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - -

My Lords, I speak to Amendment 53, on the assessment duties, and Amendment 60, on requiring services to provide a choice screen. It is the first time we have seen these developments. We are in something of a see-saw process over legal but harmful. I agree with my noble friend Lord Clement-Jones when he says he regrets that it is no longer in the Bill, although that may not be a consistent view everywhere. We have been see-sawing backwards and forwards, and now, like the Schrödinger’s cat of legal but harmful, it is both dead and alive at the same time. Amendments that we are dealing with today make it a little more alive that it was previously.

In this latest incarnation, we will insist that category 1 services carry out an assessment of how they will comply with their user-empowerment responsibility. Certainly, this part seems reasonable to me, given that it is limited to category 1 providers, which we assume will have significant resources. Crucially, that will depend on the categorisations—so we are back to our previous debate. If we imagine category 1 being the Meta services and Twitter, et cetera, that is one thing, but if we are going to move others into category 1 who would really struggle to do a user empowerment tool assessment—I have to use the right words; it is not a risk assessment—then it is a different debate. Assuming that we are sticking to those major services, asking them to do an assessment seems reasonable. From working on the inside, I know that even if it were not formalised in the Bill, they would end up having to do it as part of their compliance responsibilities. As part of the Clause 8 illegal content risk assessment, they would inevitably end up doing that.

That is because the categories of content that we are talking about in Clauses 12(10) to (12) are all types of content that might sometimes be illegal and sometimes not illegal. Therefore, if you were doing an illegal content risk assessment, you would have to look at it, and you would end up looking at types of content and putting them into three buckets. The first bucket is that it is likely illegal in the UK, and we know what we have to do there under the terms of the Bill. The second is that it is likely to be against your terms of service, in which case you would deal with it there. The third is that it is neither against your terms of service nor against UK law, and you would make a choice about that.

I want to focus on what happens once you have done the risk assessment and you have to have the choice screen. I particularly want to focus on services where all the content in Clause 12 is already against their terms of service, so there is no gap. The whole point of this discussion about legal but harmful is imagining that there is going to be a mixed economy of services and, in that mixed economy, there will be different standards. Some will wish to allow the content listed in Clause 12—self-harm-type content, eating disorder content and various forms of sub-criminal hate speech. Some will choose to do that—that is going to be their choice—and they will have to provide the user empowerment tools and options. I believe that many category 1 providers will not want to; they will just want to prohibit all that stuff under their terms of service and, in that case, offering a choice is meaningless. That will not make the noble Lord, Lord Moylan, or the noble Baroness, Lady Fox, very happy, but that is the reality.

Most services will just say that they do not want that stuff on their platform. In those cases, I hope that what we are going to say is that, when a user joins such a service, the service can point out in its terms of service that it has banned all that stuff anyway, so it is not going to give the user a user empowerment tool; if the user sees that stuff, they should just report it and it will be taken down under the terms of service. Throughout this debate I have said, “No more cookie banners, please”. I hope that we are not going to require services, in order to comply with this law, to offer a screen that people then click through. That is completely meaningless and ineffective. For those services that have chosen under their terms of service to restrict all the content in Clause 12, I hope that we will be saying that their version of the user empowerment tool is not to make people click anything but to provide education and information and tell them where they can report the content and have it taken down.

Then there are those who will choose to protect that content and allow it on their service. I agree with the noble Lord, Lord Moylan, that this is, in some sense, Twitter-focused or Twitter-driven legislation, because Twitter tends to be more in the freedom of speech camp and to allow hate speech and some of that stuff. It will be more permissive than Facebook or Instagram in its terms, and it may choose to maintain that content and it will have to offer that screen. That is fine, but we should not be making services do so when they have already prohibited such content.

The noble Lord, Lord Moylan, mentioned services that use community moderators to moderate part of the service and how this would apply there. Reddit is the obvious example, but there are others. If you are going to have user empowerment—and Reddit is more at the freedom of expression end of things—then if there are some subreddits, or spaces within Reddit that allow hate speech or the kind of speech that is in Clause 12, it would be rational to say that user empowerment in the context of Reddit is to be told that you can join these subreddits and you are fine or you can join those subreddits and you are allowing yourself to be exposed to this kind of content. What would not make sense would be for Reddit to do it individual content item by content item. When we are thinking about this, I hope that the implementation would say that, for a service with community-moderated spaces, and subspaces within the larger community, user empowerment means choosing which subspaces you enter, and you would be given information about them. Reddit would say to the moderators of the subreddits, “You need to tell us whether you have any Clause 12-type content”—I shall keep using that language—“and, if you are allowing it, you need to make sure that you are restricted”. But we should not expect Reddit to restrict every individual content item.

Finally, as a general note of caution, noble Lords may have detected that I am not entirely convinced that these will be hugely beneficial tools, perhaps other than for a small subset of Twitter users, for whom they are useful. There is an issue around particular kinds of content on Twitter, and particular Twitter users, including people in prominent positions in public life, for whom these tools make sense. For a lot of other people, they will not be particularly meaningful. I hope that we are going to keep focused on outcomes and not waste effort on things that are not effective.

As I say, many companies, when they are faced with this, will look at it and say, “I have limited engineering time. I could build all these user empowerment tools or I could just ban the Clause 12 stuff in my terms of service”. That would not be a great outcome for freedom of expression; it might be a good outcome for the people who wanted to prohibit legal but harmful content in the first place. Companies will make that choice as a hard business decision. It is much more expensive to try to maintain these different regimes and flag all this content and so on. It is simpler to have one set of standards.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I am grateful for the broad, if not universal, support for the amendments that we have brought forward following the points raised in Committee. I apologise for anticipating noble Lords’ arguments, but I am happy to expand on my remarks in light of what they have said.

My noble friend Lord Moylan raised the question of non-verified user duties and crowdsourced platforms. The Government recognise concerns about how the non-verified user duties will work with different functionalities and platforms, and we have engaged extensively on this issue. These duties are applicable only to category 1 platforms, those with the largest reach and influence over public discourse. It is therefore right that such platforms have additional duties to empower their adult users. We anticipate that these features will be used in circumstances where vulnerable adults wish to shield themselves from anonymous abuse. If users decide that these features are restricting their experience on a particular platform, they can simply choose not to use them. In addition, before these duties come into force, Ofcom will be required to consult affected providers regarding the codes of practice, at which point it will consider how these duties might interact with various functionalities.

My noble friend and the noble Lord, Lord Allan of Hallam, raised the potential for being bombarded with pop-ups because of the forced-choice approach that we have taken. These amendments have been carefully drafted to minimise unnecessary prompts or pop-ups. That is why we have specified that the requirement to proactively ask users how they want these tools to be applied is applicable only to registered users. This approach ensures that users will be prompted to make a decision only once, unless they choose to ignore it. After a decision has been made, the provider should save this preference and the user should not be prompted to make the choice again.

The noble Lord, Lord Clement-Jones, talked further about his amendments on the cost of user empowerment tools as a core safety duty in the Bill. Category 1 providers will not be able to put the user empowerment tools in Clause 12 behind a pay wall and still be compliant with their duties. That is because they will need to offer them to users at the first possible opportunity, which they will be unable to do if they are behind a pay wall. The wording of Clause 12(2) makes it clear that providers have a duty to include user empowerment features that an adult user may use or apply.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

The Minister may not have the information today, but I would be happy to get it in writing. Can he clarify exactly what will be expected of a service that already prohibits all the Clause 12 bad stuff in their terms of service?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay
- Hansard - - - Excerpts

I will happily write to the noble Lord on that.

Clause 12(4) further sets out that all such user empowerment content tools must be made available to all adult users and be easy to access.

The noble Lord, Lord Clement-Jones, on behalf of the noble Baroness, Lady Finlay, talked about people who will seek out suicide, self-harm or eating-disorder content. While the Bill will not prevent adults from seeking out legal content, it will introduce significant protections for adults from some of the most harmful content. The duties relating to category 1 services’ terms of service are expected hugely to improve companies’ own policing of their sites. Where this content is legal and in breach of the company’s terms of service, the Bill will force the company to take it down.

We are going even further by introducing a new user empowerment content-assessment duty. This will mean that, where content relates to eating disorders, for instance, but is not illegal, category 1 providers will need fully to assess the incidence of this content on their service. They will need clearly to publish this information in accessible terms of service, so users will be able to find out what they can expect on a particular service. Alternatively, if they choose to allow suicide, self-harm or eating disorder content which falls within the definition set out in Clause 12, they will need proactively to ask users how they would like the user empowerment content features to be applied.

My noble friend Lady Morgan was right to raise the impact on vulnerable people or people with disabilities. While we anticipate that the changes we have made will benefit all adult users, we expect them particularly to benefit those who may otherwise have found it difficult to find and use the user empowerment content features independently—for instance, some users with types of disabilities. That is because the onus will now be on category 1 providers proactively to ask their registered adult users whether they would like these tools to be applied at the first possible opportunity. The requirement also remains to ensure that the tools are easy to access and to set out clearly what tools are on offer and how users can take advantage of them.

Online Safety Bill

Lord Allan of Hallam Excerpts
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, first, I want to recognise the bravery of the families of Olly, Breck, Molly, Frankie and Sophie in campaigning for the amendments we are about to discuss. I also pay tribute to Mia, Archie, Isaac, Maia and Aime, whose families I met this morning on their way to the House. It is a great privilege to stand alongside them and witness their courage and dignity in the face of unimaginable grief. On behalf of myself, my co-signatories—the noble Lords, Lord Stevenson and Lord Clement-Jones, and the noble Baroness, Lady Morgan—and the huge number of Peers and MPs who have supported these amendments, I thank them for their work and the selflessness they have shown in their determination to ensure that other families do not suffer as they have.

This group includes Amendments 198, 199, 215 and 216, which, together, would create a pathway for coroners and, by extension, families to get access to information relevant to the death of a child from technology services. The amendments would put an end to the inhumane situation whereby coroners and families in crisis are forced to battle faceless corporations to determine whether a child’s engagement with a digital service contributed to their death. Bereaved families have a right to know what happened to their children, and coroners have a duty to ensure that lessons are learned and that those who have failed in their responsibilities are held accountable.

Since the Minister is going to be the bearer of good news this afternoon, I will take the time to make arguments for the amendments as they stand. I simply say that, while parents have been fighting for access to information, those same companies have continued to suggest friends, material and behaviours that drive children into places and spaces in which they are undermined, radicalised into despair and come to harm. In no other circumstance would it be acceptable to withhold relevant information from a court procedure. It is both immoral and a failure of justice if coroners cannot access and review all relevant evidence. For the families, it adds pain to heartbreak as they are unable to come to terms with what has happened because there is still so much that they do not know.

I am grateful to the Government for agreeing to bring forward on Report amendments that will go a very long way towards closing the loopholes that allow companies to refuse coroners’ demands and ignore parents’ entreaties. The Government’s approach is somewhat different from that in front of us, but it covers the same ground. These amendments are the result of the considerable efforts of Ministers and officials from DSIT and the Ministry of Justice, with the invaluable support of the right honourable Sajid Javid MP. I wish to note on the record the leadership of the Secretary of State, who is currently on leave, and the Minister here, the noble Lord, Lord Parkinson.

The Government’s amendments will create an express power for Ofcom to require information from services about a deceased child user’s online activity following the receipt of a Schedule 5 request from a coroner. This will vastly increase the reach and power of that coroner. Information that Ofcom can request from regulated companies under the Online Safety Bill is extremely wide and includes detailed data on what is recommended; the amount of time the child spent on the service when they accessed it; their user journey; what content they liked, shared, rewatched, paused and reported; and whether other users raised red flags about the child’s safety or well-being before their death.

Information notices prompted by a Schedule 5 request from a coroner will be backed by Ofcom’s full enforcement powers and will apply to all regulated companies. If a service fails to comply, it may be subject to enforcement action, including senior management liability and fines of up to £18 million or 10% of global turnover—vastly different from the maximum fine of £1,000 under the Coroners and Justice Act 2009. Moreover, these amendments will give coroners access to Ofcom’s expertise and understanding of how online services work and of their safety duties to children. There will also be provisions empowering Ofcom to share information freely to assist coroners in their inquiries. Companies must provide a dedicated means of communication to manage requests for information from bereaved parents and provide written responses to those requests. I look forward to the Minister setting out that these channels will be operated by teams of experts and backed up by Ofcom to ensure that the communication is adequate, timely and not obstructive. Importantly, if the communication is not adequate, bereaved families will be able to notify Ofcom.

There are a small number of outstanding questions. We remain concerned that only larger companies will be required to set out their policies on disclosure. Sadly, children are often coerced and nudged into smaller sites that have less robust safety mechanisms. Small is not safe. A further issue is to ensure that a coroner is able, via a Schedule 5 notice given to Ofcom, to compel senior management to appear at an inquest. This is a crucial ask of the legal community, who battled and failed to get companies to attend inquests, notably Wattpad at the Frankie Thomas inquest and Snap Inc at Molly Russell’s inquest. Can the Minister undertake to close these gaps before Report?

A number of matters sit outside the scope of the Online Safety Bill. I am particularly grateful to the Secretary of State for committing in writing to further work beyond the Bill to ensure that the UK’s approach is comprehensive and watertight. The Government will be exploring ways in which the Data Protection and Digital Information (No. 2) Bill can support and complement these provisions, including the potential for a code that requires data preservation if a parent or enforcement officer contacts a helpline or if there is constructive knowledge, such as when a death has been widely reported, even before a Schedule 5 notice has been delivered.

The Government are engaging with the Chief Coroner to provide training in order to ensure that coroners have the knowledge they need to carry out inquests where children’s engagement with online services is a possible factor in their death. I am concerned about the funding of this element of the Government’s plans and urge the Minister to indicate whether this could be part of Ofcom’s literacy duties and therefore benefit from the levy. Possibly most importantly, the Secretary of State has undertaken to approach the US Government to ensure that coroners can review private messages that fall outside the scope of this Bill in cases where a child’s death is being investigated. I am grateful to the noble Lord, Lord Allan, for his support in articulating the issue, and accept the invitation to work alongside the department to achieve this.

There are only two further things to say. First, delivery is in the drafting, and I hope that when he responds, the Minister will assure the House that we will see the proposed amendments well before Report so that we can ensure that this works as we have all agreed. Secondly, the Government are now looking very carefully at other amendments which deal with prevention of harm in one way or another. I share the gratitude of Bereaved Parents for Online Safety for the work that has gone into this set of amendments. However, we want to see safety by design; a comprehensive list of harms to children in the Bill, including harms caused or amplified by the design of services; principles for age assurance which ensure that the systems put in place by regulated services are measurable, secure and fit for purpose; and a proper complaints service, so that children have somewhere to turn when things go wrong. What we have been promised is a radical change of status for the coroner and for the bereaved families. What we want is fewer dead children. I beg to move.

Lord Allan of Hallam (LD)

My Lords, some of the issues that we have been dealing with in this Bill are more abstract or generic harms, but here we are responding to a specific need of families in the UK who are facing the most awful of circumstances.

I want to recognise the noble Baroness, Lady Kidron, for her direct support for many of those families, and for her persistent efforts to use policy and the tools we have available to us here to improve the situation for families who, sadly, will face similar tragedies in future. I appreciate the time that she has spent with me in the spirit of finding workable solutions. It is an alliance that might seem improbable, given our respective responsibilities, which have sometimes placed us in publicly adversarial roles. However, one of the strengths of this Committee process is that it has allowed us to focus on what is important and to find that we have more in common than separates us. Nothing could be more important than the issue we are dealing with now.

I am pleased that it looks like we will be able to use this Bill to make some significant improvements in this area to address the challenges faced by those families, some of whom are here today, challenges which add to their already heart-wrenching distress. The first challenge these families face is to find someone at an online service who is willing and able to answer their questions about their loved one’s use of that platform. This question about contacts at online platforms is not limited to these cases but comes up in other areas.

As noble Lords will know, I used to work for Facebook, where I was often contacted by all sorts of Governments asking me to help them find the right people at other companies, often smaller companies, concerning very serious issues such as terrorism. Even when they were dealing with the distribution of terrorist content, those Governments would find it very challenging to reach anyone. There is a generic problem around getting hold of people at platforms. A real strength of the Online Safety Bill is that it will necessarily require Ofcom to develop contacts at all online services that offer user-to-user and search services to people in the UK. The Government estimate that 25,000 entities are involved. We are talking about Ofcom building a comprehensive database of pretty much any service that matters to people in the UK.

Primarily, these contacts will be safety-focused, as their main responsibility will be to provide Ofcom with evidence that the service is meeting its duties of care under the Bill, so, again, Ofcom will have the right people at the right companies on its database in future. Importantly, Ofcom will have a team of several hundred people, paid for by a levy on these regulated services, to manage the contacts at the right level. We can expect that, certainly for the larger services, there may be a team of several people at Ofcom dedicated to working with them, whereas for the smaller services it may be a pooled arrangement whereby one Ofcom staff member deals with a group. However, in all cases there will be someone at the regulator with responsibility for liaising with those companies. We do not expect Ofcom to use those contacts to resolve questions raised by individuals in the UK as a matter of course, but it makes sense to make this channel available for the relatively small number of highly impactful cases such as we are dealing with here.

--- Later in debate ---
If a provider outside the UK ignores letters and fines, these measures may well be the only possibility. Many pornography providers probably have absolutely no intention of even trying to comply with the kinds of regulations that are envisaged in the Bill. They are probably not based in the UK, are never going to pay a fine and are probably incorporated in some obscure offshore jurisdiction. Ofcom will need to use these powers in such circumstances, and on a bulk scale. We should not put that enforcement activity at risk of the legal stalling games that these sites will undoubtedly play. For that reason, I ask the Minister to commit to these changes by government amendment before Report next month.
Lord Allan of Hallam (LD)

My Lords, I want to speak to Amendment 218JA in this group, in my name, to which the noble Baroness, Lady Morgan of Cotes, has added her name. This is really trying to understand what the Government’s intentions are in respect of access restriction orders.

Just to take a step back, with the Online Safety Bill we are creating, in effect, a licensing regime for in-scope services and saying that, if you want to operate in the United Kingdom and you are covered by the Bill—whether that is the pornography services that the noble Lord, Lord Bethell, referred to or a user-to-user or search service—here are the conditions to which you must adhere. That includes paying a fee to Ofcom for your supervision, and then following the many thousands of pages of guidance that I suspect we will end up producing and issuing to those companies. So what we are exploring here is what happens if a particular organisation decides not to take up the offer of a licence.

Again, to go back to the previous debate, success for the Bill would be for it to have a sufficient deterrent effect that the problems we are seeking to fix are addressed. I do not think we are looking to block services or for them to fail—we are looking for them to succeed. So stage one is that Ofcom asks them nicely. It says, “You want to operate in the UK; here is what you need to do—it’s a reasonable set of requests we are making”, and the services say, “Fine”. If not, they can choose to self-limit—and it is quite trivial for any online service to say, “I’m going to check incoming traffic, and if this person looks like they are coming from the UK, I’m not going to serve them”. That is self-limiting, which is the preferable option if a service chooses not to accept the licence conditions. But let us assume that it has accepted the licence conditions and that Ofcom is monitoring it on a routine basis. If Ofcom thinks it is not meeting its requirements, whether that is to produce a risk assessment or to fulfil its duty of care, Ofcom will instruct it to do something. If it fails to follow that instruction, we are in the territory of the amendments that we are considering here: either it has refused to accept the licence conditions and declined to self-limit, or it has accepted them but failed to do what we expect it to do—it has signed up, decided the regime is not serious and is not doing the things that we expect of it.
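
By way of illustration only, a minimal sketch of that kind of geography check might look like the following; the geoip2 package and the GeoLite2 country database named here are assumptions made for the example, and any IP-to-country lookup would serve the same role.

    # Illustrative sketch only: "self-limiting" by refusing to serve requests
    # that appear to originate in the UK. The geoip2 package and the
    # GeoLite2-Country database are assumptions used purely for illustration.
    import geoip2.database
    import geoip2.errors

    reader = geoip2.database.Reader("GeoLite2-Country.mmdb")

    def should_serve(client_ip: str) -> bool:
        """Return False for requests that appear to come from the UK (ISO code GB)."""
        try:
            country = reader.country(client_ip).country.iso_code
        except geoip2.errors.AddressNotFoundError:
            return True  # unknown origin: serving by default is a policy choice
        return country != "GB"

The point of the sketch is simply that, once a service knows roughly where a request comes from, the decision not to serve UK users is a one-line comparison.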

At that point, Ofcom has to consider what it can do. The first stage, quite rightly set out in the group of clauses we are looking at, is that Ofcom can bring in these business disruption measures. As the noble Lord, Lord Bethell, rightly pointed out, in many instances that will be effective. Any commercial service—not just a pornography service but an online service that depends on advertising—that is told it can no longer take credit card payments from UK businesses to advertise on the service will, one hopes, either come into line or say, “That’s the end of my business in the UK—I may as well cut myself off”. If it wants to operate, it will come into line, because that way it gets its payment services restored. But there will be others for which that is insufficient—perhaps that is not their business model—and they will carry on regardless. At that point, we may want to consider the access restrictions.

In a free society, none of us should take pleasure in the idea that we are going to restrict or block internet services. That is not our first instinct; rather, it is potentially a necessary evil. At some point, there may be services that are so harmful and so oblivious to the regime we put in place that we need to block them. Here we are trying to explore what would happen in those circumstances. The first kind of block is one that we are used to, and we do it today for copyright-infringing sites and a small number of other sites that break the law: we instruct service providers such as BT and TalkTalk to implement a network-level block. There are various technical ways of doing that, which we do not need to go into in this debate, whereby we can seek to make it so that an ordinary UK user, when they type in www.whatever, will not get to the website. But increasingly people are using technology that will work around that. Browsers, for example, may create encrypted tunnels between your web browser and the online service such that TalkTalk or BT or the access provider has no visibility of where you are going and no capability of blocking it. BT has rightly raised that. There will be different views about where we should go with this, but the question of what the Government’s intentions are is absolutely legitimate, and that is what we want to try to tease out with this amendment.
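
Again purely for illustration, a minimal sketch of the kind of name-level block an access provider might apply at its resolver could look like the following; the blocklist and domain names are hypothetical, and the sketch also shows why an encrypted tunnel defeats this approach, because a lookup carried inside such a tunnel never reaches this code at all.

    # Illustrative sketch only: a resolver-level block as an access provider
    # might apply it. Names on a court-ordered blocklist are simply not
    # resolved. The domains below are hypothetical placeholders.
    # If the browser sends its lookups through an encrypted tunnel to a
    # third-party resolver instead, this check never sees the query at all,
    # which is the circumvention problem described above.
    BLOCKLIST = {"blocked-site.example"}

    def resolve(name: str, upstream_lookup) -> str | None:
        """Return an address from the upstream resolver, or None for blocked names."""
        if name.lower().rstrip(".") in BLOCKLIST:
            return None  # the client sees this as "no such domain"
        return upstream_lookup(name)

    if __name__ == "__main__":
        fake_upstream = {"allowed-site.example": "192.0.2.10"}.get
        print(resolve("allowed-site.example", fake_upstream))  # 192.0.2.10
        print(resolve("blocked-site.example", fake_upstream))  # None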

Again, we should be really candid: somebody who is determined to bypass all the access controls will do so. There is no world in which we can guarantee that somebody with a UK internet connection can never get to a particular website. What we are seeking to do is to make violating services unavailable to most of the people most of the time. We would be unhappy if it were only some of the people some of the time, but it is not going to be all of the people all of the time. So the question is: what constitutes a sufficient access restriction either to bring these services to heel or to ensure that, over the longer term, the harm is not propagated because they are generally not available? It would be really helpful if the Minister were able to tease that out.

Certainly, in my view, there are services such as TOR—the Onion Router—where there is no entity that you can ask to block anything, so if someone were using that, there is nothing that you can reasonably do. At the other end of the spectrum, there are providers such as BT and TalkTalk, where it is relatively straightforward to ask them to block. Then there are people in between, such as browser owners that are putting in place these encrypted tunnels for very good reasons of privacy, but which can also add value—helping to manage bandwidth better, and so on. Is it the Government’s intention that they are going to be served with access restriction orders? That is a valid question. We might have different views about what the right solution is, but it is really important for the sector that it understands and is able to prepare if that is the Government’s intention. So we need to tease that out; that is the area in which we are looking for answers from the Government.

The second piece is to think about the long term. If our prediction—or our hope and expectation—is that most companies will come into line, that is fine; the internet will carry on as it does today but in a safer way. However, if we have misjudged the mood, and a significant number of services just stick their thumb up at Ofcom and say, “We are not going to play—block us if you dare”, that potentially has significant consequences for the internet as it will operate in the United Kingdom. It would be helpful to understand from the Minister whether the Government have any projections or predictions as to which way we are going to go. Are we talking about the vast majority of the internet continuing as it is today within the new regime, with the odd player that will be outside that, or is it the Government’s expectation that there may need to be blocking of significant numbers of services, essentially for the foreseeable future?

Other countries such as France and Germany have been dealing with this recently, as the noble Lord, Lord Bethell, is probably aware. They have sought to restrict access to pornography services, and there have been all sorts of knock-on effects and challenges at a technical level. It would be helpful to understand whether our expectation is that we will see the same in the United Kingdom or that something else is going to happen. If the Government do not have that information today, or if they have not made those projections, it would be helpful to know their thinking on how that work might be done. Who will be able to inform us as to what the future landscape is likely to look like as it evolves, as Ofcom gains these powers and starts to instruct companies that they must obtain licences, and then seeks to take enforcement action against those that choose not to play the game?

Lord Curry of Kirkharle (CB)

My Lords, I support Amendment 217 in the name of the noble Lord, Lord Bethell, and very much support the comments that he has made. I will speak to Amendments 218C, 218E, 218H and 218K in my name within this group. I also support the intent of the other amendments in this group tabled by the noble Lord, Lord Bethell.

I appreciate the process helpfully outlined by the noble Lord, Lord Allan. However, Ofcom’s implementation of the existing provisions on video-sharing platforms leaves the overwhelming impression of a very drawn-out process, with Ofcom failing to hold providers to account. Despite being told by Ofcom that a simple tick-box declaration by the user confirming that they are over 18 is not sufficient age verification, some providers are still using only that system. Concerningly, Ofcom has not taken decisive action.

When children are at severe risk, it is not appropriate to wait. Why, for example, should we allow porn sites to continue to host 10 million child sexual abuse videos while Ofcom simply reports that it is continuing to partner with these platforms to get a road map of action together? As has been mentioned by the noble Lord, Lord Bethell, Visa and Mastercard did not think it was appropriate to wait in such circumstances—they just acted.

Similarly, when systems are not in place to protect children from accessing pornography, we cannot just sit by and allow all the egregious associated harms to continue. Just as in Formula 1, when a red flag is raised and the cars must stop and go into the pits until the dangerous debris is cleared, sometimes it is too dangerous to allow platforms to operate until the problems are fixed. It seems to me that platforms would act very swiftly to put effective systems and processes in place if they could not operate in the interim.

The Bill already contains this emergency handbrake; the question is when it should be used. My answer is that it should be used when the evidence of severe harm presents itself, and not only when the regulator has a moment of self-doubt about whether its “road maps”, which it is normally so optimistic about, will eventually fix the problem. Ofcom should not be allowed to sit on the evidence hoping, on a wing and a prayer, that things will fix themselves in the end.

--- Later in debate ---
Introducing mandatory requirements would undermine Ofcom’s independence and discretion to manage enforcement on a case-by-case basis. This would also frustrate Ofcom’s ability to regulate in a proportionate way and could make its enforcement processes unnecessarily punitive or inflexible. It could also overwhelm the courts if Ofcom is strictly forced to apply for business disruption measures where any grounds apply, even where the breach may be minor. Instead, Ofcom will act proportionately in performing its regulatory functions, targeting action where it is needed and adjusting timeframes as necessary. I am mindful that on the final day in Committee, the noble Lord, Lord Grade of Yarmouth, continues to be in his place, following the Committee’s deliberations very closely on behalf of the regulator.
Lord Allan of Hallam (LD)

I am reminded by my noble friend Lord Foster of Bath, with particular reference to the gambling sector, that some of these issues may run across various regulators that are all seeking business disruption measures. He reminded me that if you type “Casinos not regulated by GAMSTOP” into a search engine—which would itself be regulated and subject to business disruption measures here—you will get a bunch of operators who are evading GAMSTOP’s regulation. Noble Lords can imagine something similar for financial services—something that I know the noble Baroness, Lady Morgan of Cotes, is also very interested in. It may not be for answer now, but I would be interested to understand what thinking the Government have on how all the different business disruption regimes—financial, gambling, Ofcom-regulated search services, et cetera—will mesh together. They could all come before the courts under slightly different legal regimes.

Lord Parkinson of Whitley Bay (Con)

When I saw the noble Lord, Lord Foster of Bath, and the noble Baroness, Lady Armstrong of Hill Top, in their places, I wondered whether they were intending to raise these points. I will certainly take on board what the noble Lord says and, if there is further information I can furnish your Lordships with, I certainly will.

The noble Baroness, Lady Kidron, asked whether the powers can be used on out-of-scope services. “No” is the direct answer to her direct question. The powers can be used only in relation to regulated services, but if sites not regulated by the Bill are publishing illegal content, existing law enforcement powers—such as those frequently deployed in cases of copyright infringement—can be used. I could set out a bit more in writing if that would be helpful.

My noble friend Lord Bethell’s amendments seek to set out in the Bill that Ofcom will be able to make a single application to the courts for an order enabling business disruption measures that apply against multiple platforms and operators. I must repeat, as he anticipated, the point made by my right honourable friend Chris Philp that the civil procedure rules allow for a multi-party claim to be made. These rules permit any number of claimants or defendants and any number of claims to be covered by one claim form. The overriding objective of the civil procedure rules is that cases are dealt with justly and proportionately. I want to reassure my noble friend that the Government are confident that the civil procedure rules will provide the necessary flexibility to ensure that services can be blocked or restricted.

The amendment in the name of the noble Lord, Lord Allan of Hallam, seeks to clarify what services might be subject to access restriction orders by removing the two examples provided in the Bill: internet access services and application stores. I would like to reassure him that these are simply indicative examples, highlighting two kinds of service on which access restriction requirements may be imposed. It is not an exhaustive list. Orders could be imposed on any services that meet the definition—that is, a person who provides a facility and is able to withdraw, adapt or manipulate it in such a way as to impede access to the regulated service in question. This provides Ofcom with the flexibility to identify where business disruption measures should be targeted, and it future-proofs the Bill by ensuring that the power remains functional and effective as technologies develop.

As the noble Lord highlighted, these are significant powers that can require that services be blocked in the UK. Clearly, limiting access to services in this way substantially affects the business interests of the service in question and the interests of the relevant third-party service, and it could affect users’ freedom of expression. It is therefore essential that appropriate safeguards are included and that due process is followed. That is why Ofcom will be required to seek a court order to be able to use these powers, ensuring that the courts have proper oversight.

To ensure that due process is upheld, an application by the regulator for a court order will have to specify the non-compliant provider, the grounds of the order and the steps that Ofcom considers should be imposed on the third parties in order to withdraw services and block users’ access. These requirements will ensure that the need to act quickly to tackle harm is appropriately balanced against upholding fundamental rights.

It might be useful to say a little about how blocking works—

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

Yes; he made a helpful point, and I will come back on it.

Lord Allan of Hallam (LD)

We share a common interest in understanding whether it would be used against VPNs, but we may not necessarily have the same view about whether it should be. Do not take that as an encouragement—take it as a request for information.