All 17 Lord Moylan contributions to the Online Safety Act 2023


- Wed 1st Feb 2023
- Tue 25th Apr 2023 - Online Safety Bill, Lords Chamber, Committee stage: Part 1
- Thu 27th Apr 2023 - Online Safety Bill, Lords Chamber, Committee stage: Part 1
- Thu 27th Apr 2023 - Online Safety Bill, Lords Chamber, Committee stage: Part 2
- Tue 2nd May 2023 - Online Safety Bill, Lords Chamber, Committee stage: Part 1
- Tue 9th May 2023 - Online Safety Bill, Lords Chamber, Committee stage: Part 1
- Tue 9th May 2023 - Online Safety Bill, Lords Chamber, Committee stage: Part 2
- Thu 11th May 2023
- Thu 22nd Jun 2023 - Online Safety Bill, Lords Chamber, Committee stage: Part 2
- Thu 6th Jul 2023 - Online Safety Bill, Lords Chamber, Report stage: Part 1 & Report stage: Minutes of Proceedings
- Thu 6th Jul 2023 - Online Safety Bill, Lords Chamber, Report stage: Part 2
- Thu 6th Jul 2023 - Online Safety Bill, Lords Chamber, Report stage: Part 3
- Mon 10th Jul 2023 - Online Safety Bill, Lords Chamber, Report stage: Part 1
- Wed 12th Jul 2023
- Mon 17th Jul 2023
- Wed 19th Jul 2023
- Wed 6th Sep 2023

Online Safety Bill

Lord Moylan Excerpts
Lord Moylan (Con)

My Lords, it is hard to think of something new to say at the end of such a long debate, but I am going to try. I am helped by the fact that I find myself, very unusually, somewhat out of harmony with the temper of the debate in your Lordships’ House over the course of this afternoon and evening. I rather felt at some points that I had wandered into a conference of medieval clerics trying to work out what measures to take to mitigate the harmful effects of the invention of moveable type.

In fact, it probably does require an almost religious level of faith to believe that the measures we are discussing are actually going to work, given what my noble friends Lord Camrose and Lord Sarfraz have said about the agility of the cyber world and the avidity of its users for content. Now, we all want to protect children, and if what had come forward had been a Bill which made it a criminal offence to display or allow to be displayed to children specified harmful content—with condign punishment—we would all, I am sure, have rallied around that and rejoiced. That is how we would have dealt with this 50 years ago. But instead we have this; this is not a short Bill doing that.

Let me make three brief points about the Bill in the time we have available. The first is a general one about public administration. We seem to be wedded to the notion that the way in which we should be running large parts of the life of the country is through regulators rather than law, and that the independence of those regulators must be sacrosanct. In a different part of your Lordships’ House, there has been discussion in the last few days of the Financial Services and Markets Bill in Committee. There, of course, we have been discussing the systemic failures of regulators—that is, the box-ticking, the legalism, the regulatory capture and the emergence of the regulator’s own interests and how they motivate it. None the less, we carry on giving more and more powers. Ofcom is going to be one of the largest regulators and one of the most important in our lives, and it is going to be wholly unaccountable. We are not going to be happy about that.

The second point I want to make is that the Bill represents a serious threat to freedom of speech. This is not contentious; the Front Bench admits it. The Minister says that it is going to strike the right balance. I have seen very little evidence in the Bill, or indeed in the course of the day’s debate, that that balance is going to be struck at all, let alone in what I might consider the right place—and what I might consider the right place might not be what others consider it to be. These are highly contentious issues; we will be hiving them off to an unaccountable regulator, in effect, at the end.

The third point that I want to make, because I think that I am possibly going to come in under my four minutes, is that I did vote Conservative at the last general election; I always have. But that does not mean that I subscribe to every jot and tittle of the manifesto; in particular, I do not think that I ever signed up to live in a country that was the safest place in the world to be on the internet. If I had, I would have moved to China already, where nothing is ever out of place on the internet. That is all I have to say, and I shall be supporting amendments that move in the general direction that I have indicated.

Online Safety Bill

Lord Moylan (Con)

My Lords, before I speak to my Amendment 9, which I will be able to do fairly briefly because a great deal of the material on which my case rests has already been given to the Committee by the noble Baroness, Lady Fox of Buckley, I will make the more general and reflective point that there are two different views in the Committee that somehow need to be reconciled over the next few weeks. There is a group of noble Lords who are understandably and passionately concerned about child safety. In fact, we all share that concern. There are others of us who believe that this Bill, its approach and the measures being inserted into it will have massive ramifications outside the field of child safety, for adults, of course, but also for businesses, as the noble Baroness explained. The noble Baroness and I, and others like us, believe that these are not sufficiently taken into account either by the Bill or by those pressing for measures to be harsher and more restrictive.

Some sort of balance needs to be found. At Second Reading my noble friend the Minister said that the balance had been struck in the right place. It is quite clear that nobody really agrees with that, except on the principle, which I think is always a cop-out, that if everyone disagrees with you, you must be right, which I have never logically understood in any sense at all. I hope my noble friend will not resort to claiming that he has got it right simply because everyone disagrees with him in different ways.

My amendment is motivated by the considerations set out by the noble Baroness, which I therefore do not need to repeat. It is the Government’s own assessment that between 20,000 and 25,000 businesses will be affected by the measures in this Bill. A great number of those—some four-fifths—are small businesses or micro-businesses. The Government appear to think in their assessment that only 120 of those are high risk. The reason they think they are high risk is not that they are engaged in unpleasant activities but simply that they are engaged in livestreaming and contacting new people. That might be for nefarious purposes but equally, it might not, so the 120 we need to worry about could actually be a very small number. We handle this already through our own laws; all these businesses would still be subject to existing data protection laws and complying with the law generally on what they are allowed to publish and broadcast. It would not be a free-for-all or a wild west, even among that very small number of businesses.

My Amendment 9 takes a slightly different approach to dealing with this. I do not in any way disagree with or denigrate the approach taken by the noble Baroness, Lady Fox, but my approach would be to add two categories to the list of exemptions in the schedules. The first of these is services provided by small and medium-sized enterprises. We do not have to define those because there is already a law that helps define them for us: Section 33 of the Small Business, Enterprise and Employment Act 2015. My proposal is that we take that definition, and that those businesses that comply with it be outside the scope of the Bill.

The second area that I would propose exempting was also referred to by the noble Baroness, Lady Fox of Buckley: community-based services. The largest of these, and the one that frequently annoys us because it gets things wrong, is Wikipedia. I am a great user of Wikipedia but I acknowledge that it does make errors. Of course, most of the errors it makes, such as saying, “Lord Moylan has a wart on the end of his nose”, would not be covered by the Bill anyway. Nothing in the Bill will force people to correct factual statements that have been got wrong—my year of birth or country of birth, or whatever. That is not covered. Those are the things they usually get wrong and that normally annoy us when we see them.

However, I do think that these services are extremely valuable. Wikipedia is an immense achievement and a tremendous source of knowledge and information for people. The fact that it has been put together in this organic, community-led way over a number of years, in so many languages, is a tremendous advantage and a great human advance. Yet, under the proposed changes, Wikipedia would not be able to operate its existing model of people posting their comments.

Currently, you go on Wikipedia and you can edit it. Now, I know this would not apply to any noble Lords but, in the other place, it has been suggested that MPs have discovered how to do this. They illicitly and secretly go on to and edit their own pages, usually in a flattering way, so it is possible to do this. There is no prior restraint, and no checking in advance. There are moderators at Wikipedia—I do not know whether they are employed—who review what has been done over a period, but they do not do what this Bill requires, which is checking in advance.

It is not simply about Wikipedia; there are other community sites. Is it sensible that Facebook should be responsible if a little old lady alters the information on a community Facebook page about what is happening in the local parish? Why should Facebook be held responsible for that? Why would we want it to be responsible for it—and how could it do it without effectively censoring ordinary activities that people want to carry out, using the advantages of the internet that have been so very great?

What I am asking is not dramatic. We have many laws in which we very sensibly create exemptions for small and medium-sized enterprises. I am simply asking that this law be considered under that heading as well, and similarly for Wikipedia and community-based sites. It is slightly unusual that we have had to consider that; it is not normal, but it is very relevant to this Bill and I very much hope the Government will agree to it.

The answer that I would not find satisfactory—I say this in advance for the benefit of my noble friend the Minister, in relation to this and a number of other amendments I shall be moving in Committee—is that it will all be dealt with by Ofcom. That would not be good enough. We are the legislators and we want to know how these issues will be dealt with, so that the legitimate objectives of the Bill can be achieved without causing massive disruption, cost and disbenefit to adults.

Lord Allan of Hallam (LD)

My Lords, I rise to speak in support of Amendment 9, tabled by the noble Lord, Lord Moylan, and in particular the proposed new paragraph 10A to Schedule 1. I hope I will find myself more in tune with the mood of the Committee on this amendment than on previous ones. I would be interested to know whether any noble Lords believe that Ofcom should be spending its limited resources supervising a site like Wikipedia under the new regime, as it seems to me patently obvious that that is not what we intend; it is not the purpose of the legislation.

The noble Lord, Lord Moylan, is right to remind us that one of the joys of the internet is that you buy an internet connection, plug it in and there is a vast array of free-to-use services which are a community benefit, produced by the community for the community, with no harm within them. What we do not want to do is interfere with or somehow disrupt that ecosystem. The noble Baroness, Lady Fox, is right to remind us that there is a genuine risk of people withdrawing from the UK market. We should not sidestep that. People who try to be law-abiding will look at these requirements and ask themselves, “Can I meet them?” If the Wikimedia Foundation that runs Wikipedia does not think it can offer its service in a lawful way, it will have to withdraw from the UK market. That would be to the detriment of children in the United Kingdom, and certainly not to their benefit.

There are principle-based and practical reasons why we do not want Ofcom to be operating in this space. The principle-based one is that it makes me uncomfortable that a Government would effectively tell their regulator how to manage neutral information sites such as Wikipedia. There are Governments around the world who seek to do that; we do not want to be one of those.

The amendment attempts to define this public interest, neutral, informational service. It happens to be user-to-user but it is not like Facebook, Instagram or anything similar. I would feel much more comfortable making it clear in law that we are not asking Ofcom to interfere with those kinds of services. The practical reason is the limited time Ofcom will have available. We do not want it to be spending time on things that are not important.

Definitions are another example of how, with the internet, it can often be extremely hard to draw bright lines. Functionalities bleed into each other. That is not necessarily a problem, until you try to write something into law; then, you find that your definition unintentionally captures a service that you did not intend to capture, or unintentionally misses out a service that you did intend to be in scope. I am sure the Minister will reject the amendment because that is what Ministers do; but I hope that, if he is not willing to accept it, he will at least look at whether there is scope within the Bill to make it clear that Wikipedia is intended to be outside it.

Paragraph 4 of Schedule 1 refers to “limited functionality services”. That is a rich vein to mine. It is clear that the intention is to exclude mainstream media, for example. It refers to “provider content”. In this context, Encyclopaedia Britannica is not in scope but Wikipedia is, the difference being that Wikipedia is constructed by users, while Encyclopaedia Britannica is regarded as being constructed by a provider. The Daily Mail is outside scope; indeed, all mainstream media are outside scope. Anyone who declares themselves to be media—we will debate this later on—is likely to be outside scope.

Such provider exemption should be offered to other, similar services, even if they happen to be constructed from the good will of users as opposed to a single professional author. I hope the Minister will be able to indicate that the political intent is not that we should ask Ofcom to spend time and energy regulating Wikipedia-like services. If so, can he point to where in the legislation we might get that helpful interpretation, in order to ensure that Ofcom is focused on what we want it to be focused on and not on much lower priority issues?

--- Later in debate ---
In addition, the Bill includes explicit exemptions for many small and medium-sized enterprises, through the low-risk functionality exemptions in Schedule 1. This includes an exemption for any service that offers users the ability only to post comments or reviews on digital content published by it, which will exempt many online retailers, news sites and web logs. The Bill also provides the Secretary of State with a power to exempt further types of user-to-user or search services from the Bill if the risk of harm presented by a particular service is low, ensuring that other low-risk services are not subject to unnecessary regulation. There was quite a lot of talk about Wikipedia—
Lord Moylan (Con)

My Lords, while my noble friend is talking about the possibility of excessive and disproportionate burden on businesses, can I just ask him about the possibility of excessive and disproportionate burden on the regulator? He seems to be saying that Ofcom is going to have to maintain, and keep up to date regularly, 25,000 risk assessments—this is on the Government’s own assessment, produced 15 months ago, of the state of the market then—even if those assessments carried out by Ofcom result in very little consequence for the regulated entity.

We know from regulation in this country that regulators already cannot cope with the burdens placed on them. They become inefficient, sclerotic and unresponsive; they have difficulty in recruiting staff of the same level and skills as the entities that they regulate. We have a Financial Services and Markets Bill going through at the moment, and the FCA is a very good example of that. Do we really think that this is a sensible burden to place on a regulator that is actually able to discharge it?

Lord Parkinson of Whitley Bay (Con)

The Bill creates a substantial new role for Ofcom, but it has already substantially recruited and prepared for the effective carrying out of that new duty. I do not know whether my noble friend was in some of the briefings with officials from Ofcom, but it is very happy to set out directly the ways in which it is already discharging, or preparing to discharge, those duties. The Government have provided it with further resource to enable it to do so. It may be helpful for my noble friend to have some of those discussions directly with the regulator, but we are confident that it is ready to discharge its duties, as set out in the Bill.

I was about to say that we have already had a bit of discussion on Wikipedia. I am conscious that we are going to touch on it again in the debate on the next group of amendments so, at the risk of being marked down for repetition, which is a black mark on that platform, I shall not pre-empt what I will say shortly. But I emphasise that the Bill does not impose prescriptive, one-size-fits-all duties on services. The codes of practice from Ofcom will set out a range of measures that are appropriate for different types of services in scope. Companies can follow their own routes to compliance, so long as they are confident that they are effectively managing risks associated with legal content and, where relevant, harm to children. That will ensure that services that already use community moderation effectively can continue to do so—such as Wikipedia, which successfully uses that to moderate content. As I say, we will touch on that more in the debate on the next group.

Amendment 9, in the name of my noble friend Lord Moylan, is designed to exempt small and medium-sized enterprises working to benefit the public from the scope of the Bill. Again, I am sympathetic to the objective of ensuring that the Bill does not impose undue burdens on small businesses, and particularly that it should not inhibit services from providing valuable content of public benefit, but I do not think it would be feasible to exempt service providers deemed to be

“working to benefit the public”.

I appreciate that this is a probing amendment, but the wording that my noble friend has alighted on highlights the difficulties of finding something suitably precise and not contestable. It would be challenging to identify which services should qualify for such an exemption.

Taking small services out of scope would significantly undermine the framework established by the Bill, as we know that many smaller services host illegal content and pose a threat to children. Again, let me reassure noble Lords that the Bill has been designed to avoid disproportionate or unnecessary regulatory burdens on small and low-risk services. It will not impose a disproportionate burden on services or impede users’ access to valuable content on smaller services.

Amendment 9A in the name of the noble Lord, Lord Knight of Weymouth, is designed to exempt “sector specific search services” from the scope of the Bill, as the noble Baroness, Lady Merron, explained. Again, I am sympathetic to the intention here of ensuring that the Bill does not impose a disproportionate burden on services, but this is another amendment that is not needed as it would exempt search services that may pose a significant risk of harm to children, or because of illegal content on them. The amendment aims to exempt specialised search services—that is, those that allow users to

“search for … products or services … in a particular sector”.

It would exempt specialised search services that could cause harm to children or host illegal content—for example, pornographic search services or commercial search services that could facilitate online fraud. I know the noble Lord would not want to see that.

The regulatory duties apply only where there is a significant risk of harm and the scope has been designed to exclude low-risk search services. The duties therefore do not apply to search engines that search a single database or website, for example those of many retailers or other commercial websites. Even where a search service is in scope, the duties on services are proportionate to the risk of harm that they pose to users, as well as to a company’s size and capacity. Low-risk services, for example, will have minimal duties. Ofcom will ensure that these services can quickly and easily comply by publishing risk profiles for low-risk services, enabling them easily to understand their risk levels and, if necessary, take steps to mitigate them.

The noble Lord, Lord McCrea, asked some questions about the 200 most popular pornographic websites. If I may, I will respond to the questions he posed, along with others that I am sure will come in the debate on the fifth group, when we debate the amendments in the names of the noble Lord, Lord Morrow, and the noble Baroness, Lady Ritchie of Downpatrick, because that will take us on to the same territory.

I hope that provides some assurance to my noble friend Lord Moylan, the noble Baroness, Lady Fox, and others, and that they will be willing not to press their amendments in this group.

--- Later in debate ---
Moved by
10: Clause 4, page 4, line 8, at end insert—
“(2A) This Act does not apply in relation to moderation actions taken, or not taken, by users of a Part 3 service.”

Member’s explanatory statement
The drafting of some Bill provisions, such as Clauses 17(4)(c) or 65(1), leaves room for debate as to whether community moderation gives rise to liability and obligations for the provider. This amendment, along with the other amendment to Clause 4 in the name of Lord Moylan, clarifies that moderation carried out by the public, for example on Wikipedia, is not fettered by this Bill.
Lord Moylan (Con)

My Lords, I have to start with a slightly unprofessional confession. I accepted the Bill team’s suggestion on how my amendments might be grouped after I had grouped them rather differently. The result is that I am not entirely clear why some of these groupings are quite as they are. As my noble friend the Minister said, my original idea of having Amendments 9, 10 and 11 together would perhaps have been better, as it would have allowed him to give a single response on Wikipedia. Amendments 10 and 11 in this group relate to Wikipedia and services like it.

I am, I hope, going to cause the Committee some relief as I do not intend to repeat remarks made in the previous group. The extent to which my noble friend wishes to amplify his comments in response to the previous group is entirely a matter for him, since he said he was reserving matter that he would like to bring forward but did not when commenting on the previous group. If I do not speak further on Amendments 10 and 11, it is not because I am not interested in what my noble friend the Minister might have to say on the topic of Wikipedia.

To keep this fairly brief, I turn to Amendment 26 on age verification. I think we have all agreed in the Chamber that we are united in wanting to see children kept safe. On page 10 of the Bill, in Clause 11(3), it states that there will be a duty to

“prevent children of any age from encountering”

this content—“prevent” them “encountering” is extremely strong. We do not prevent children encountering the possibility of buying cigarettes or encountering the possibility of being injured crossing the road, but we are to prevent children from these encounters. It is strongly urged in the clause—it is given as an example—that age verification will be required for that purpose.

Of course, age verification works only if it applies to everybody: one does not ask just the children to prove their age; one has to ask everybody online. Unlike when I go to the bar in a pub, my grey hair cannot be seen online. So this provision will almost certainly have to extend to the entire population. In Clause 11(3)(b), we have an obligation to protect. Clearly, the Government intend a difference between “prevent” and “protect”, or they would not have used two different verbs, so can my noble friend the Minister explain what is meant by the distinction between “prevent” and “protect”?

My amendment would remove Clause 11(3) completely. But it is, in essence, a probing amendment and what I want to hear from the Government, apart from how they interpret the difference between “prevent” and “protect”, is how they expect this duty to be carried out without having astonishingly annoying and deterring features built into every user-to-user platform and website, so that every time one goes on Wikipedia—in addition to dealing with the GDPR, accepting cookies and all the other nonsense we have to go through quite pointlessly—we then have to provide age verification of some sort.

What mechanism that might be, I do not know. I am sure that there are many mechanisms available for age verification. I do not wish to get into a technical discussion about what particular techniques might be used—I accept that there will be a range and that they will respond and adapt in the light of demand and technological advance—but I would like to know what my noble friend the Minister expects and how wide he thinks the obligation will be. Will it be on the entire population, as I suspect? Focusing on that amendment—and leaving the others to my noble friend the Minister to respond to as he sees fit—and raising those questions, I think that the Committee would like to know how the Government imagine that this provision will work. I beg to move.

Lord Allan of Hallam (LD)

My Lords, I will speak to the amendments in the name of the noble Lord, Lord Moylan, on moderation, which I think are more important than he has given himself credit for—they go more broadly than just Wikipedia.

There is a lot of emphasis on platform moderation, but the reality is that most moderation of online content is done by users, either individually or in groups, acting as groups in the space where they operate. The typical example, which many Members of this House have experienced, is when you post something and somebody asks, “Did you mean to post that?”, and you say, “Oh gosh, no”, and then delete it. A Member in the other place has recently experienced a rather high-profile example of that through the medium of the newspaper. On a much smaller scale, it is absolutely typical that people take down content every day, either because they regret it or, quite often, because their friends, families or communities tell them that it was unwise. That is the most effective form of moderation, because it is the way that people learn to change their behaviour online, as opposed to the experience of a platform removing content, which is often experienced as the big bad hand of the platform. The person does not learn to change their behaviour, so, in some cases, it can reinforce bad behaviour.

Community moderation, not just on Wikipedia but across the internet, is an enormous public good, and the last thing that we want to do in this legislation is to discourage people from doing it. In online spaces, that is often a volunteer activity: people give up their time to try to keep a space safe and within the guidelines they have set for that space. The noble Lord, Lord Moylan, has touched on a really important area: in the Bill, we must be absolutely clear to those volunteers that we will not create all kinds of new legal operations and liabilities on them. These are responsible people, so, if they are advised that they will incur all kinds of legal risk when trying to comply with the Online Safety Bill, they will stop doing the moderation—and then we will all suffer.

On age-gating, we will move to a series of amendments where we will discuss age assurance, but I will say at the outset, as a teaser to those longer debates, that I have sympathy with the points made by the noble Lord, Lord Moylan. He mentioned pubs—we often talk about real-world analogies. In most of the public spaces we enter in the real world, nobody does any ID checking or age checking; we take it on trust, unless and until you carry out an action, such as buying alcohol, which requires an age check.

It is legitimate to raise this question, because where we fall in this debate will depend on how we see public spaces. I see a general-purpose social network as equivalent to walking into a pub or a town square, so I do not expect to have my age and ID checked at the point at which I enter that public space. I might accept that my ID is checked at a certain point where I carry out various actions. Others will disagree and will say that the space should be checked as soon as you go into it—that is the boundary of the debate we will have across a few groups. As a liberal, I am certainly on the side that says that it is incumbent on the person wanting to impose the extra checks to justify them. We should not just assume that extra checks are cost-free and beneficial; they have a cost for us all, and it should be imposed only where there is a reasonable justification.

--- Later in debate ---
Contrary to what some have said, there is currently no requirement in the Bill for users to verify their age before accessing search engines and user-to-user services. We expect that only services which pose the highest risk to children will use age-verification technologies, but this is indeed a debate to which we will return in earnest and in detail on later groups of amendments. Amendment 26 would remove a key child safety duty, significantly weakening the Bill’s protections for children. The Bill takes a proportionate approach to regulation, which recognises the diverse range of services that are in scope of it. My noble friend’s amendments run counter to that and would undermine the protections in the Bill. I hope he will feel able not to press them and allow us to return to the debates on age verification in full on another group.
Lord Moylan (Con)

My Lords, I am grateful to all noble Lords who have contributed to this slightly disjointed debate. I fully accept that there will be further opportunities to discuss age verification and related matters, so I shall say no more about that. I am grateful, in particular, to the noble Lord, Lord Allan of Hallam, for supplying the deficiency in my opening remarks about the importance of Amendments 10 and 11, and for explaining just how important that is too. I also thank the noble Lord, Lord Stevenson. It was good of him to say, in the open approach he took on the question of age, that there are issues still to be addressed. I do not think anybody feels that we have yet got this right, and I think we are going to have to be very open in that discussion when we get to it. That is also true of what the noble Lord, Lord Allan of Hallam, said: we have not yet got clarity as to where the age boundary is—I like his expression—for the public space. Where is the point at which, if checks are needed, those checks are to be applied? These are all matters to discuss, and I hope noble Lords will forgive me if I do not address each individual contribution separately.

I would like to say something, I hope not unfairly or out of scope, about what was said by the noble Baronesses, Lady Finlay of Llandaff and Lady Kidron, when they used, for the first time this afternoon, the phrase “zero tolerance”, and, at the same time, talked about a risk-based approach. From my own local government background, I have a great deal of experience of risk-based approaches taken in relation to things—very different, of course, from the internet—such as food safety, where local authorities grade restaurants and food shops and take enforcement and supervisory action according to their assessment of the risk that those premises present. That is partly to do with their assessment of the management and partly to do with their experience of things that have gone wrong in the past. If you have been found with mouse droppings and you have had to clean up the shop, then you will be examined a great deal more frequently until the enforcement officers are happy; whereas if you are always very well run, you will get an inspection visit maybe only once a year. That is what a risk-based assessment consists of. The important thing to say is that it does not produce zero tolerance or zero outcomes.

Baroness Kidron (CB)

I just want to make the point that I was talking about zero tolerance at the end of a ladder of tolerance, just to be clear. Letting a seven-year-old child into an 18-plus dating app or pornographic website is where the zero tolerance is—everything else is a ladder up to that.

Lord Moylan (Con)

I beg the noble Baroness’s pardon; I took that for granted. There are certain things—access to pornography, material encouraging self-harm and things of that sort—where one has to have zero tolerance, but not everything. I am sorry I took that for granted, so I fully accept that I should have made that more explicit in my remarks. Not everything is to be zero-toleranced, so to speak, but certain things are. However, that does not mean that they will not happen. One has to accept that there will be leakage around all this, just as some of the best-run restaurants that have been managed superbly for years will turn out, on occasion, to be the source of food poisoning. One has to accept that this is never going to be as tight as some of the advocates wanted, but with that, I hope I will be given leave to withdraw—

Baroness Finlay of Llandaff (CB)

May I intervene, because I have also been named in the noble Lord’s response? My concern is about the most extreme, most violent, most harmful and destructive things. There are some terrible things posted online. You would not run an open meeting on how to mutilate a child, or how to stab somebody most effectively to do the most harm. It is at this extreme end that I cannot see anyone in society in the offline world promoting classes for any of these terrible activities. Therefore, there is a sense that exposure to these things is of no benefit but promotes intense harm. People who are particularly vulnerable at a formative age in their development should not be exposed to them, because they would not be exposed to them elsewhere. I am speaking personally, not for anybody else, but I stress that this is the level at which the tolerance should be set to zero because we set it to zero in the rest of our lives.

Lord Moylan (Con)

Everything the noble Baroness has said is absolutely right, and I completely agree with her. The point I simply want to make is that no form of risk-based assessment will achieve a zero-tolerance outcome, but—

Baroness Harding of Winscombe (Con)

I am so sorry, but may I offer just one final thought from the health sector? While the noble Lord is right that where there are human beings there will be error, there is a concept in health of the “never event”—that when that error occurs, we should not tolerate it, and we should expect the people involved in creating that error to do a deep inspection and review to understand how it occurred, because it is considered intolerable. I think the same exists in the digital world in a risk assessment framework, and it would be a mistake to ignore it.

Lord Moylan (Con)

My Lords, I am now going to attempt for the third time to beg the House’s leave to withdraw my amendment. I hope for the sake of us all, our dinner and the dinner break business, for which I see people assembling, that I will be granted that leave.

Amendment 10 withdrawn.

Online Safety Bill

Moved by
13: Clause 6, page 5, line 33, after “services” insert “that are not Category 2A services”
Member’s explanatory statement
This amendment is consequential on other amendments in the name of Lord Moylan to remove Clause 23(3) and the subsequent new Clause after 23, the effect of which is that the duties imposed on search services vary depending on whether or not they are Category 2A services: this needs to be reflected in the provision about combined services (regulated user-to-user services that include public search services) in Clause 6.
Lord Moylan (Con)

My Lords, in moving my Amendment 13 I will speak to all the amendments in the group, all of which are in my name with the exception of Amendment 157 in the name of my noble friend Lord Pickles. These are interlinked amendments; they work together. There is effectively only one amendment going on. A noble Lord challenged me a day or two ago as to whether I could summarise in a sentence what the amendment does, and the answer is that I think I can: Clause 23 imposes various duties on search engines, and this amendment would remove one of those duties from search engines that fall into category 2B.

There are two categories of search engines, 2A and 2B, and category 2B is the smaller search engines. We do not know the difference between them in greater detail than that because the schedule that relates to them reserves to the Secretary of State the power to set the thresholds that will define which category a search engine falls into, but I think it is clear that category 2B is the smaller ones.

These amendments pursue a theme that I brought up in Committee earlier in the week when I argued that the Bill would put excessively onerous and unnecessary obligations on smaller businesses. The particular duty that these amendments would take away from smaller search engines is referred to in Clause 23(2):

“A duty, in relation to a service, to take or use proportionate measures relating to the design or operation of the service to effectively mitigate and manage the risks of harm to individuals, as identified in the most recent illegal content risk assessment of the service”.


The purpose of that is to recognise that very large numbers of smaller businesses do not pose a risk, according to the Government’s own assessment of the market, and to allow them to get on with their business without taking these onerous and difficult measures. They are probing amendments to try to find out what the Government are willing to do in relation to smaller businesses that will make this a workable Bill.

I can already imagine that there are noble Lords in the Chamber who will say that small does not equal safe, and that small businesses need to be covered by the same rigorous regulations as larger businesses. But I am not saying that small equals safe. I am saying—as I attempted to say when the Committee met earlier—that absolute safety is not attainable. It is not attainable in the real world, nor can we expect it to be attainable in the online world. I imagine that objection will be made. I see it has some force, but I do not think it has sufficient compelling force to put the sort of burden on small businesses that this Bill would do, and I would like to hear more about it.

I will say one other thing. Those who object to this approach need to be sure in their own minds that they are not contributing to creating a piece of legislation that, when it comes into operation, is so difficult to implement that it becomes discredited. There needs to be a recognition that this has to work in practice. If it does not—if it creates resentment and opposition—we will find the Government not bringing sections of it into force, needing to repeal them or going easy on them once the blowback starts, so to speak. With that, I beg to move.

Baroness Deech (CB)

My Lords, I will speak to Amendment 157 in the name of the noble Lord, Lord Pickles, and others, since the noble Lord is unavoidably absent. It is along the same lines as Amendment 13; it is relatively minor and straightforward, and asks the Government to recognise that search services such as Google are enormously important as an entry to the internet. They are different from social media companies such as Twitter. We ask that the Government be consistent in applying their stated terms when these are breached in respect of harm to users, whether that be through algorithms, through auto-prompts or otherwise.

As noble Lords will be aware, the Bill treats user-to-user services, such as Meta, and search services, such as Google, differently. The so-called third shield or toggle proposed for shielding users from legal but harmful content, should they wish to be shielded, does not apply when it comes to search services, important though they are. Indeed, at present, large, traditional search services, including Google and Microsoft Bing, and voice search assistants, including Alexa and Siri, will be exempted from several of the requirements for large user-to-user services—category 1 companies. Why the discrepancy? Though search services rightly highlight that the content returned by a search is not created or published by them, the algorithmic indexing, promotion and search prompts provided in search bars—the systems they design and employ—are their responsibility, and these have been proven to do harm.

Some of the examples of such harm have already been cited in the other place, but not before this Committee. I do not want to give them too much of an airing because they were in the past, and the search people have taken them down after complaints, but some of the dreadful things that emerge from searching on Google et cetera are a warning of what could occur. It has been pointed out that search engines would in the past have thrown up, for example, swastikas, SS bolts and other Nazi memorabilia when people searched for desk ornaments. If George Soros’s name came up, he would be included in a list of people responsible for world evils. The Bing service, which I dislike anyway, has been directing people—at least, it did in the past—to anti-Semitic and homophobic searches through its auto-complete, while Google’s image carousel highlighted pictures of portable barbecues to those searching for the term “Jewish baby stroller”.

--- Later in debate ---
Lord Moylan (Con)

My Lords, I am grateful to all noble Lords who have spoken in this debate. I hope that the noble Baroness, Lady Deech, and the noble Lord, Lord Weir of Ballyholme, will forgive me if I do not comment on the amendment they spoke to in the name of my noble friend Lord Pickles, except to say that of course they made their case very well.

I will briefly comment on the remarks of the noble Baroness, Lady Kidron. I am glad to see a degree of common ground among us in terms of definitions and so forth—a small piece of common ground that we could perhaps expand in the course of the many days we are going to be locked up together in your Lordships’ House.

I am grateful too to the noble Lord, Lord Allan of Hallam. I am less clear on “2B or not 2B”, if that is the correct way of referring to this conundrum, than I was before. The noble Baroness, Lady Kidron, said that size does not matter and that it is all about risk, but my noble friend the Minister cunningly conflated the two and said at various points “the largest” and “the riskiest”. I do not see why the largest are necessarily the riskiest. On the whole, if I go to Marks & Spencer as opposed to going to a corner shop, I might expect rather less risk. I do not see why the two run together.

I address the question of size in my amendment because that is what the Bill focuses on. I gather that the noble Baroness, Lady Kidron, may want to explore at some stage in Committee why that is the case and whether a risk threshold might be better than a size threshold. If she does that, I will be very interested in following and maybe even contributing to that debate. However, at the moment, I do not think that any of us is terribly satisfied with conflating the two—that is the least satisfactory way of explaining and justifying the structure of the Bill.

On the remarks of my noble friend Lady Harding of Winscombe, I do not want in the slightest to sound as if there is any significant disagreement between us—but there is. She suggested that I was opening the way to businesses building business models “not taking children into account at all”. My amendment is much more modest than that. There are two ways of dealing with harm in any aspect of life. One is to wait for it to arrive and then to address it as it arises; the other is constantly to look out for it in advance and to try to prevent it arising. The amendment would leave fully in place the obligation to remove harm, which is priority illegal content or other illegal content, that the provider knows about, having been alerted to it by another person or become aware of it in any other way. That duty would remain. The duty that is removed, especially from small businesses—and really this is quite important—is the obligation constantly to be looking out for harm, because it involves a very large, and I suggest possibly ruinous, commitment to constant monitoring of what appears on a search engine. That is potentially prohibitive, and it arises in other contexts in the Bill as well.

There should be occasions when we can say that knowing that harmful stuff will be removed as soon as it appears, or very quickly afterwards, is adequate for our purposes, without requiring firms to go through a constant monitoring or risk-assessment process. The risk assessment would have to be adjudicated by Ofcom, I gather. Even if no risk was found, of course, that would not be the end of the matter, because I am sure that Ofcom would, very sensibly, require an annual renewal of that application, or after a certain period, to make sure that things had not changed. So even to escape the burden is quite a large burden for small businesses, and then to implement the burden is so onerous that it could be ruinous, whereas taking stuff down when it appears is much easier to do.

Baroness Harding of Winscombe (Con)

Perhaps I might briefly come in. My noble friend Lord Moylan may have helped explain why we disagree: our definition of harm is very different. I am most concerned that we address the cumulative harms that online services, both user-to-user services and search, are capable of inflicting. That requires us to focus on the design of the service, which we need to do at the beginning, rather than the simpler harm that my noble friend is addressing, which is specific harmful content—not in the sense in which “content” is used in the Bill but “content” as in common parlance; that is, a piece of visual or audio content. My noble friend makes the valid point that that is the simplest way to focus on removing specific pieces of video or text; I am more concerned that we should not exclude small businesses from designing and developing their services such that they do not consider the broader set of harms that are possible and that add up to the cumulative harm that we see our children suffering from today.

So I think our reason for disagreement is that we are focusing on a different harm, rather than that we violently disagree. I agree with my noble friend that I do not want complex bureaucratic processes imposed on small businesses; they need to design their services when they are small, which makes it simpler and easier for them to monitor harm as they grow, rather than waiting until they have grown. That is because the backwards re-engineering of a technology stack is nigh-on impossible.

Lord Moylan (Con)

My noble friend makes a very interesting point, and there is much to ponder in it—too much to ponder for me to respond to it immediately. Since I am confident that the issue is going to arise again during our sitting in Committee, I shall allow myself the time to reflect on it and come back later.

While I understand my noble friend’s concern about children, the clause that I propose to remove is not specific to children; it relates to individuals, so it covers adults as well. I think I understand what my noble friend is trying to achieve—I shall reflect on it—but this Bill and the clauses we are discussing are a very blunt way of going at it and probably need more refinement even than the amendments we have seen tabled so far. But that is for her to consider.

I think this debate has been very valuable. I did not mention it, but I am grateful also for the contribution from the noble Baroness, Lady Merron. I beg leave to withdraw the amendment.

Amendment 13 withdrawn.
--- Later in debate ---
Baroness Bennett of Manor Castle (GP)

I am glad I gave the noble Baroness the opportunity for that intervention. I have a reasonable level of technical knowledge—I hand-coded my first website in 1999, so I go back some way—but given the structures we are dealing with, I question the capacity and whether it is possible to create the tools and say they will be used only in a certain way. If you break the door open, anyone can walk through the door—that is the situation we are in.

As the noble Lord, Lord Allan, said, this is a crucial part of the Bill that was not properly examined and worked through in the other place. I will conclude by saying that it is vital we have a full and proper debate in this area. I hope the Minister can reassure us that he and the department will continue to be in dialogue with noble Lords as the Bill goes forward.

Lord Moylan (Con)

My Lords, I rise to speak to Amendment 205 in my name, but like other noble Lords I will speak about the group as a whole. After the contributions so far, not least from the noble Lord, Lord Allan of Hallam, and the noble Baroness, Lady Bennett of Manor Castle, there is not a great deal left for me to add. However, I will say that we have to understand that privacy is contextual. At one extreme, I know the remarks I make in your Lordships’ House are going to be carefully preserved and cherished; for several centuries, if not millennia, people will be able to see what I said today. If I am in my sitting room, having a private conversation, I expect that not to be heard by somebody, although at the same time I am dimly aware that there might be somebody on the other side of the wall who can hear what I am saying. Similarly, I am aware that if I use the telephone, it is possible that somebody is listening to the call. Somebody may have been duly authorised to do so by reference to a tribunal, having taken all the lawful steps necessary in order to listen to that call, because there are reasons that have persuaded a competent authority that the police service, or whatever, listening to my telephone call has a reason to do so, to avoid public harm or meet some other justified objective agreed on through legislation.

Here, we are going into a sphere of encryption where one assumes privacy and feels one is entitled to some privacy. However, it is possible that the regulator could at any moment step in and demand records from the past—records up to that point—without the intervention of a tribunal, as far as I can see, or without any reference to a warrant or having to explain to anybody their basis for doing so. They would be able to step in and do it. This is the point made by the noble Baroness, Lady Bennett of Manor Castle: unlike the telephone conversation, where it does not have to be everyone, everywhere, all the time—they are listening to just me and the person to whom I am talking—the provider has to have the capacity to go back, get all those records and be able to show Ofcom what it is that Ofcom is looking for. To do that requires them to change their encryption model fundamentally. It is not really possible to get away from everyone, everywhere, all the time, because the model has to be changed in order to do it.

That is why this is such an astonishing thing for the Government to insert in this Bill. I can understand why the security services and so forth want this power, and this is a vehicle to achieve something they have been trying to achieve for a long time. But there is very strong public resistance to it, and it is entirely understood, and to do it in this space is completely at odds with the way in which we felt it appropriate to authorise listening in on private conversations in the past—specific conversations, with the authority of a tribunal. To do it this way is a very radical change and one that needs to be considered almost apart from the Bill, not slipped in as a mere clause and administrative adjunct to it.

Baroness Fox of Buckley (Non-Afl)

My Lords, there have been some excellent speeches so far. The noble Lord, Lord Allan of Hallam, brilliantly laid out why these amendments matter, and the noble Lord, Lord Moylan, explained why this has gained popular interest outside of the House. Not everything that goes on in this House is of interest and people do not study all of the speeches made by the noble Lord, Lord Moylan, even though they are always in the public sphere, but this particular group of amendments has elicited a huge amount of discussion.

We should remember that encrypted chat has become an indispensable part of the way that we live in this country and around the world. According to the Open Rights Group it has replaced the old-fashioned wired telephone—a rather quaint phrase. The fact that the citizens of the United Kingdom think that chat services matter so much that they are used by 60% of the total population should make us think about what we are doing regarding these services.

End-to-end encryption—the most secure form of encryption available—means that your messages are stored on your phone; people feel that they are in control because they are not on some server somewhere. Even WhatsApp cannot read your WhatsApp messages; that is the point of encryption. That is why people use it: the messages are secured with a lock which only you and the recipient have the special key to unlock to read them.

Obviously, there are certain problems. Certain Government Ministers wanted to voluntarily share all of their WhatsApp messages with a journalist who would then share them with the rest of us. If your Lordships were in that group you might have thought that was a rude thing to do. People have their WhatsApp messages leaked all the time, and when it happens we all think, “Oh my God, I’m glad I wasn’t in that WhatsApp group”, because you assume a level of privacy, even though as a grown-up you need to remember that somebody might leak them. But the main point is that they are a secure form of conversation that is widely used.

Everyone has a right to private conversations. I was thinking about how, when society closed down during the lockdown period, we needed technology in order to communicate with each other. We understood that we needed to WhatsApp message or Zoom call our friends and family, and the idea that this would involve the state listening in would have appalled us—we considered that our private life.

We want to be able to chat in confidence and be confident that only ourselves and the recipients can see what we are sharing and hear what we are saying. That is true of everyday life, but there are very good specific cases to be made for its importance, ranging through everything from Iranian women resisting the regime and communicating with each other, to all the civil liberties organisations around the world that use WhatsApp. The security of knowing that you can speak without Putin listening in or that President Xi will not be sent your WhatsApp messages is important.

The Government keep assuring us that we do not need to worry, but the Bill gives Ofcom the power to require services to install tools that would require the surveillance of encrypted communications regarding child exploitation and terrorism content, for example. Advocates and people on my side argue that this is not possible without undermining encryption because, just as you cannot be half pregnant, you cannot be half encrypted once you install tools for scanning for certain content. There is a danger that we say, “We’re only doing it for those things”, but actually it would be an attack on encryption itself.

Unlike the noble Baroness, Lady Bennett of Manor Castle, I know nothing about the technical aspects of this, as noble Lords can hear from the way I am speaking about it. But I can see from a common-sense point of view what encryption is: you cannot say, “We’re only going to use it a little bit”. That is my point.

I want to tackle the issue of child abuse, because I know that it lurks around here. It is what really motivates the people who say, “It’s ok as long as we can deal with that”. Scanning is put forward as a solution to the problem of encrypted chat services being used to send messages of that nature, and to the question of what we can do about it. Of course I stress that images of child abuse and exploitation are abhorrent—that is a very important background to this conversation—but I want to draw attention to the question of what we are prepared to do about child abuse, because I think it was referred to in an earlier group. I am nervous that we are promising a silver bullet through this Bill that it will all be solved through some of these measures.

Online Safety Bill

Moved by
17: Clause 9, page 7, line 30, leave out “prevent individuals from” and insert “protect individuals from harms arising due to them”
Member’s explanatory statement
This amendment, along with the other amendment to Clause 9 in the name of Lord Moylan, adds a requirement to protect individuals from harm, rather than monitoring, prior restraint and/or denial of access. Further obligations to mitigate and manage harm, including to remove unlawful content that is signalled to the service provider, are unchanged by this amendment.
Lord Moylan (Con)

My Lords, this is a very large and wide-ranging group of amendments. Within it, I have a number of amendments that, on their own, span three separate subjects. I propose to address these one after the other in my opening remarks, but other subjects will be brought in as the debate continues and other noble Lords speak to their own amendments.

If I split the amendments that I am speaking to into three groups, the first is Amendments 17 and 18. These relate to Clause 9, on page 7, where safety duties about illegal content are set out. The first of those amendments addresses the obligation to prevent individuals encountering priority illegal content by means of the service.

Earlier this week in Committee, I asked the Minister whether the Government understood “prevent” and “protect”, both of which they use in the legislation, to have different weight. I did not expect my noble friend to give an answer at that point, but I know that he will have reflected on it. We need clarity about this at some point, because courts will be looking at, listening to and reading what the Government say at the Dispatch Box about the weight to be given to these words. To my mind, to prevent something happening requires active measures in advance that ensure as far as reasonably and humanly possible that it does not actually happen, but one could be talking about something more reactive to protect someone from something happening.

This distinction is of great importance to internet companies—I am not talking about the big platforms—which will be placed, as I say repeatedly, under very heavy burdens by the Bill. It is possible that they simply will not be able to discharge them and will have to go out of business.

Let us take Wikipedia, which was mentioned earlier in Committee. It operates in 300 languages but employs 700 moderators globally to check what is happening. If it is required by Clause 9 to

“prevent individuals from encountering priority illegal content by means of the service”,

it will have to scrutinise what is put up on this community-driven website as or before it appears. Quite clearly, something such as Welsh Wikipedia—there is Wikipedia in Welsh—simply would not get off the ground if it had to meet that standard, because the number of people who would have to be employed to do that would be far more than the service could sustain. However, if we had something closer to the wording I suggest in my amendment, where services have to take steps to “protect” people—so they could react to something and take it down when they become aware of it—it all becomes a great deal more tolerable.

Similarly, Amendment 18 addresses subsection (3) of the same clause, where there is a

“duty to operate a service using proportionate systems and processes … to … minimise the length of time”

for which content is present. How do you know whether you are minimising the length of time? How is that to be judged? What is the standard by which that is to be measured? Would it not be a great deal better and more achievable if the wording I propose, which is that you simply are under an obligation to take it down, were inserted? That is my first group of amendments. I put that to my noble friend and say that all these amendments are probing to some extent at this stage. I would like to hear how he thinks that this can actually be operated.

My second group is quite small, because it contains only Amendment 135. Here I am grateful to the charity JUSTICE for its help in drawing attention to this issue. This amendment deals with Schedule 7, on page 202, where the priority offences are set out. Paragraph 4 of the schedule says that a priority offence includes:

“An offence under any of the following provisions of the Public Order Act 1986”.


One of those is Section 5 of that Act, “Harassment, alarm or distress”. Here I make a very different point and return to territory I have been familiar with in the past. We debated this only yesterday in Grand Committee, although I personally was unable to be there: the whole territory of hate crimes, harmful and upsetting words, and how they are to be judged and dealt with. In this case, my amendment would remove Section 5 of the Public Order Act from the list of priority offences.

If society has enough problems tolerating the police going round and telling us when we have done or said harmful and hurtful things and upbraiding us for it, is it really possible to consider—without the widest form of censorship—that it is appropriate for internet platforms to judge us, shut us down and shut down our communications on the basis of their judgment of what we should be allowed to say? We already know that there is widespread suspicion that some internet platforms are too quick to close down, for example, gender critical speech. We seem to be giving them something close to a legislative mandate to be very trigger-happy when it comes to closing down speech by saying that it engages, or could engage, Section 5 of the Public Order Act. I will come to the question of how they judge it in my third group, in a moment—but the noble Lord might be able to help me.

Lord Allan of Hallam (LD)

Just to reinforce the point the noble Lord, Lord Moylan, made on that, I certainly had experience of where the police became the complainants. They would request, for example, that you take down an English Defence League event, claiming that it would be likely to cause a public order problem. I have no sympathy whatever with the English Defence League, but I am very concerned about the police saying “You must remove a political demonstration” to a platform and citing the legal grounds for doing that. The noble Lord is on to a very valid point to be concerned about that.

Lord Moylan (Con)

I am grateful to the noble Lord. I really wonder whether the Government realise what they are walking into here. On the one hand, yesterday the Grand Committee was debating the statutory instrument putting in place new statutory guidance for the police on how to enforce, much more sensitively than in the past, non-crime hate incidents. However, on the other hand, the next day in this Chamber we are putting an obligation on a set of mostly foreign private companies to act as a police force to go around bullying us and closing us down if we say something that engages Section 5 of the Public Order Act. I think this is something the Government are going to regret, and I would very much like to hear what my noble friend has to say about that.

Finally, I come to my third group of amendments: Amendments 274, 278, 279 and 283. They are all related and on one topic. These relate to the text of the Bill on page 145, in Clause 170. Here we are discussing what judgments providers have to make when they come to decide what material to take down. Inevitably, they will have to make judgments. That is one of the unfortunate things about this Bill. A great deal of what we do in our lives is going to have to be based on judgments made by private companies, many of which are based abroad but which we are trying to legislate for.

It makes a certain sense that the law should say what they should take account of in making those judgments. But the guidance—or rather, the mandate—given to those companies by Clause 170 is, again, very hair-trigger. Clause 170(5), which I am proposing we amend, states:

“In making such judgements, the approach to be followed is whether a provider has reasonable grounds to infer that content is … of the kind in question”.


I am suggesting that “reasonable grounds to infer” should be replaced with “sufficient evidence to infer”, so that they have to be able to produce some evidence that they are justified in taking content down. The test should be higher than simply having “reasonable grounds”, which may rest on a suspicion and little evidence at all. So one of those amendments relates to strengthening that bar so that they must have real evidence before they can take censorship action.

I add only two words to subsection (6), which talks about reasonable grounds for the inference—it defines what the reasonable grounds are—that

“exist in relation to content and an offence if, following the approach in subsection (2)”

and so on. I am saying “if and only if”—in other words, I make it clear that this is the only basis on which material can be censored using the provisions in this section, so as to limit it from going more widely. The third amendment in my group is essentially consequential to that.

--- Later in debate ---
Lord Allan of Hallam (LD)

I am struggling a little to understand why the Minister thinks that sufficient evidence is subjective, and therefore, I assume, reasonable grounds to infer is objective. Certainly, in my lexicon, evidence is more objective than inference, which is more subjective. I was reacting to that word. I am not sure that he has fully made the case as to why his wording is better.

Lord Moylan (Con)

Or indeed any evidence.

Lord Parkinson of Whitley Bay (Con)

I take the noble Lord’s point and my noble friend’s further contribution. I will see whether I can give a clearer and more succinct description in writing to flesh that out, but that is the reason we have alighted on the words that we have.

The noble Lord, Lord Allan, also asked about jurisdiction. If an offence has been committed in the UK and viewed by a UK user, it can be treated as illegal content. That is set out in Clause 53(11), which says:

“For the purposes of determining whether content amounts to an offence, no account is to be taken of whether or not anything done in relation to the content takes place in any part of the United Kingdom”.


I hope that that bit, at least, is clearly set out to the noble Lord’s satisfaction. It looks like it may not be.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

If it has been committed in the UK and is viewed by a UK user, it can be treated as illegal. I will follow up on the noble Lord’s further points ahead of the next stage.

Amendment 272 explicitly provides that relevant information that is reasonably available to a provider includes information submitted by users in complaints. Providers will already need to do this when making judgments about content, as it will be both relevant and reasonably available.

My noble friend Lord Moylan returned to the question that arose on day 2 in Committee, querying the distinction between “protect” and “prevent”, and suggesting that a duty to protect would or could lead to the excessive removal of content. To be clear, the duty requires platforms to put in place proportionate systems and processes designed to prevent users encountering content. I draw my noble friend’s attention to the focus on systems and processes in that. This requires platforms to design their services to achieve the outcome of preventing users encountering such content. That could include upstream design measures, as well as content identification measures, once content appears on a service. By contrast, a duty to protect is a less stringent duty and would undermine the proactive nature of the illegal content duties for priority offences.

Lord Moylan (Con)

Before he moves on, is my noble friend going to give any advice to, for example, Welsh Wikipedia, as to how it will be able to continue, or are the concerns about smaller sites simply being brushed aside, as my noble friend explicates what the Bill already says?

Lord Parkinson of Whitley Bay (Con)

I will deal with all the points in the speech. If I have not done so by the end, and if my noble friend wants to intervene again, I would be more than happy to hear further questions, either to answer now or write to him about.

Amendments 128 to 133 and 143 to 153, in the names of the right reverend Prelate the Bishop of Derby and the noble Lord, Lord Stevenson of Balmacara, seek to ensure that priority offences relating to modern slavery and human trafficking, where they victimise children, are included in Schedule 6. These amendments also seek to require technology companies to report content which relates to modern slavery and the trafficking of children—including the criminal exploitation of children—irrespective of whether it is sexual exploitation or not. As noble Lords know, the strongest provisions in the Bill relate to children’s safety, and particularly to child sexual exploitation and abuse content. These offences are captured in Schedule 6. The Bill includes a power for Ofcom to issue notices to companies requiring them to use accredited technology or to develop new technology to identify, remove and prevent users encountering such illegal content, whether communicated publicly or privately.

These amendments would give Ofcom the ability to issue such notices for modern slavery content which affects children, even when there is no child sexual exploitation or abuse involved. That would not be appropriate for a number of reasons. The power to tackle illegal content on private communications has been restricted to the identification of content relating to child sexual exploitation and abuse because of the particular risk to children posed by content which is communicated privately. Private spaces online are commonly used by networks of criminals to share illegal images—as we have heard—videos, and tips on the commitment of these abhorrent offences. This is highly unlikely to be reported by other offenders, so it will go undetected if companies do not put in place measures to identify it. Earlier in Committee, the noble Lord, Lord Allan, suggested that those who receive it should report it, but of course, in a criminal context, a criminal recipient would not do that.

Extending this power to cover the identification of modern slavery in content which is communicated privately would be challenging to justify and could represent a disproportionate intrusion into someone’s privacy. Furthermore, modern slavery is usually identified through patterns of behaviour or by individual reporting, rather than through content alone. This reduces the impact that any proactive technology required under this power would have in tackling such content. Schedule 6 already sets out a comprehensive list of offences relating to child sexual exploitation and abuse which companies must tackle. If these offences are linked to modern slavery—for example, if a child victim of these offences has been trafficked—companies must take action. This includes reporting content which amounts to an offence under Schedule 6 to the National Crime Agency or another reporting body outside of the UK.

My noble friend Lord Moylan’s Amendment 135 seeks to remove the offence in Section 5 of the Public Order Act 1986 from the list of priority offences. His amendment would mean that platforms were not required to take proactive measures to reduce the risk of content which is threatening or abusive, and intended to cause a user harassment, alarm or distress, from appearing on their service. Instead, they would be obliged to respond only once they are made aware of the content, which would significantly reduce the impact of the Bill’s framework for tackling such threatening and abusive content. Given the severity of the harm which can be caused by that sort of content, it is right that companies tackle it. Ofcom will have to include the Public Order Act in its guidance about illegal content, as provided for in Clause 171.

Government Amendments 136A to 136C seek to strengthen the illegal content duties by adding further priority offences to Schedule 7. Amendments 136A and 136B will add human trafficking and illegal entry offences to the list of priority offences in the Bill. Crucially, this will mean that platforms will need to take proactive action against content which encourages or assists others to make dangerous, illegal crossings of the English Channel, as well as those who use social media to arrange or facilitate the travel of another person with a view to their exploitation.

The noble Lord, Lord Allan, asked whether these amendments would affect the victims of trafficking themselves. This is not about going after the victims. Amendment 136B addresses only content which seeks to help or encourage the commission of an existing immigration offence; it will have no impact on humanitarian communications. Indeed, to flesh out a bit more detail, Section 2 of the Modern Slavery Act makes it an offence to arrange or facilitate the travel of another person, including through recruitment, with a view to their exploitation. Facilitating a victim’s travel includes recruiting them. This offence largely appears online in the form of advertisements to recruit people into being exploited. Some of the steps that platforms could put in place include setting up trusted flagger programmes, signposting users to support and advice, and blocking known bad actors. Again, I point to some of the work which is already being done by social media companies to help tackle both illegal channel crossings and human trafficking.

--- Later in debate ---
My noble friend Lord Bethell anticipated later debates on age verification and pornography. If he permits, I will come back on his points then. I have noted his question for that discussion as well as the question from the noble Lord, Lord Stevenson, on financial scams and fraud, which we will have the chance to discuss in full. I am not sure if my noble friend Lord Moylan wants to ask a further question at this juncture or to accept a reassurance that I will consult the Official Report and write on any further points he raised which I have not dealt with.
Lord Moylan (Con)

My Lords, it is genuinely difficult to summarise such a wide-ranging debate, which was of a very high standard. Only one genuinely bright idea has emerged from the whole thing: as we go through Committee, each group of amendments should be introduced by the noble Lord, Lord Allan of Hallam, because it is only after I have heard his contribution on each occasion that I have begun to understand the full complexity of what I have been saying. I suspect I am not alone in that and that we could all benefit from hearing the noble Lord before getting to our feet. That is not meant to sound the slightest bit arch; it is absolutely genuine.

The debate expressed a very wide range of concerns. Concerns about gang grooming and recruiting were expressed on behalf of the right reverend Prelate the Bishop of Derby, and my noble friend Lady Buscombe expressed concerns about trolling of country businesses. However, I think it is fair to say that most speakers focused on the following issues. The first was the definition of legality, which was so well explicated by the noble Lord, Lord Allan of Hallam. The second was the judgment bar that providers have to pass to establish whether something should be taken down. The third was the legislative mandating of private foreign companies to censor free speech rights that are so hard-won here in this country. These are the things that mainly concern us.

I was delighted that I found myself agreeing so much with what the noble Baroness, Lady Kidron, said, even though she was speaking in another voice or on behalf of another person. If her own sentiments coincide with the sentiments of the noble Viscount—

Baroness Kidron (CB)

I am sorry to intrude, but I must say now on the record that I was speaking on my own behalf. The complication of measuring and those particular things are terribly important to establish, so I am once again happy to agree with the noble Lord.

Lord Moylan (Con)

I am delighted to hear the noble Baroness say that, and it shows that that pool of common ground we share is widening every time we get to our feet. However, the pool is not particularly widening, I am afraid to say—at least in respect of myself; other noble Lords may have been greatly reassured—as regards my noble friend the Minister who, I am afraid, has not in any sense addressed the issues about free speech that I and many other noble Lords raised. On some issues we in the Committee are finding a consensus that is drifting away from the Minister. We probably need to put our heads together more closely on some of these issues with the passage of time in Committee.

My noble friend also did not say anything that satisfied me in respect of the practical operation of these obligations for smaller sites. He speaks smoothly and persuasively of risk-based proactive approaches without saying that, for a large number of sites, this legislation will mean a complete re-engineering of their business model. For example, where Wikipedia operates in a minority language, such as in Welsh Wikipedia, which is the largest Welsh language website in the world, if its model is to involve monitoring what is put out by the community and correcting it as it goes along, rather than having a model in advance that is designed to prevent things being put there in the first place, then it is very likely to close down. If that is one of the consequences of this Bill the Government will soon hear about it.

Finally, although I remain concerned about public order offences, I have to say to the Minister that if he is so concerned about the dissemination of alarm among the population under the provisions of the Public Order Act, what does he think that His Majesty’s Government were doing on Sunday at 3 pm? I beg leave to withdraw the amendment.

Amendment 17 withdrawn.
Lord Moylan (Con)

My Lords, I was not going to speak on this group, but I was provoked into offering some reflections on the speech by the noble Lord, Lord Russell of Liverpool, especially his opening remarks about cars and planes, which he said were designed to be safe. He did not mention trains, about which I know something as well, and which are also designed to be safe. These are a few initial reflective points. They are designed in very different ways. An aeroplane is designed never to fail; a train is designed so that if it fails, it will come to a stop. They are two totally different approaches to safety. Simply saying that something must be designed to be safe does not answer questions; it opens questions about what we actually mean by that. The noble Lord went on to say that we do not allow children to drive cars and fly planes. That is absolutely true, but the thrust of his amendment is that we should design the internet so that it can be driven by children and used by children— so that it is designed for them, not for adults. That is my problem with the general thrust of many of these amendments.

A further reflection that came to mind as the noble Lord spoke was on a book of great interest that I recommend to noble Lords. It is a book by the name of Risk written in 1995 by Professor John Adams, then professor of geography at University College London. He is still an emeritus professor of geography there. It was a most interesting work on risk. First, it reflected how little we actually know of many of the things of which we are trying to assess risk.

More importantly, he went on to say that people have an appetite for risk. That appetite for risk—that risk budget, so to speak—changes over the course of one’s life: one has much less appetite for risk when one gets to a certain age than perhaps one had when one was young. I have never bungee jumped in my life, and I think I can assure noble Lords that the time has come when I can say I never shall, but there might have been a time when I was younger when I might have flung myself off a cliff, attached to a rubber band and so forth—noble Lords may have done so. One has an appetite for risk.

The interesting thing that he went on to develop from that was the notion of risk compensation: that if you have an appetite for risk and your opportunities to take risks are taken away, all you do is compensate by taking risks elsewhere. So a country such as New Zealand, which has some of the strictest cycling safety laws, also has a very high incidence of bungee jumping among the young; as they cannot take risks on their bicycles, they will find ways to go and do it elsewhere.

Although these reflections are not directly germane to the amendments, they are important as we try to understand what we are seeking to achieve here, which is a sort of hermetically sealed absence of risk for children. I do not think it will work. I said at Second Reading that I thought the flavour of the debate was somewhat similar to a late medieval conclave of clerics trying to work out how to mitigate the harmful effects of the invention of movable type. That did not work either, and I think we are in a very similar position today as we discuss this.

There is also the question of harm and what it means. While the examples being given by noble Lords are very specific and no doubt genuinely harmful, and are the sorts of things that we should like to stop, the drafting of the amendments, using very vague words such as “harm”, is dangerous overreach in the Bill. To give just one example, for the sake of speed, when I was young, administering the cane periodically was thought good for a child in certain circumstances. The mantra was, “Spare the rod and spoil the child”, though I never heard it said. Nowadays, we would not think it morally or psychologically good to do physical harm to a child. We would regard it as an unmitigated harm and, although not necessarily banned or illegal, it is something that—

Lord Russell of Liverpool (CB)

My Lords, I respond to the noble Lord in two ways. First, I ask him to reflect on how the parents of the children who have died through what the parents would undoubtedly view as serious and unbearable harm would feel about his philosophical ruminations. Secondly, as somebody who has the privilege of being a Deputy Speaker in your Lordships’ House, it is incumbent and germane for us all to focus on the amendment in question and stay on it, to save time and get through the business.

Lord Moylan (Con)

Well, I must regard myself as doubly rebuked, and unfairly, because my reflections are very relevant to the amendments, and I have developed them in that direction. In respect of the parents, they have suffered very cruelly and wrongly, but although it may sound harsh, as I have said in this House before on other matters, hard cases make bad law. We are in the business of trying to make good law that applies to the whole population, so I do not think that these are wholly—

Baroness Harding of Winscombe (Con)

If my noble friend could, would he roll back the health and safety regulations for selling toys, in the same way that he seems so happy to have no health and safety regulations for children’s access to digital toys?

Lord Moylan (Con)

My Lords, if the internet were a toy, aimed at children and used only by children, those remarks would of course be very relevant, but we are dealing with something of huge value and importance to adults as well. It is the lack of consideration of the role of adults, the access for adults and the effects on freedom of expression and freedom of speech, implicit in these amendments, that cause me so much concern.

I seem to have upset everybody. I will now take issue with and upset the noble Baroness, Lady Benjamin, with whom I have not engaged on this topic so far. At Second Reading and earlier in Committee, she used the phrase, “childhood lasts a lifetime”. There are many people for whom this is a very chilling phrase. We have an amendment in this group—a probing amendment, granted—tabled by the noble Lord, Lord Knight of Weymouth, which seeks to block access to VPNs as well. We are in danger of putting ourselves in the same position as China, with a hermetically sealed national internet, attempting to put borders around it so that nobody can breach it. I am assured that even in China this does not work and that clever and savvy people simply get around the barriers that the state has erected for them.

Before I sit down, I will redeem myself a little, if I can, by giving some encouragement to the noble Baroness, Lady Kidron, on Amendments 28 and 32—although I think the amendments are in the name of the noble Lord, Lord Russell of Liverpool. These amendments, if we are to assess the danger posed by the internet to children, seek to substitute an assessment of the riskiness of the provider for the Government’s emphasis on the size of the provider. As I said earlier in Committee, I do not regard size as being a source of danger. When it comes to many other services—I mentioned that I buy my sandwich from Marks & Spencer as opposed to a corner shop—it is very often the bigger provider I feel is going to be safer, because I feel I can rely on its processes more. So I would certainly like to hear how my noble friend the Minister responds on that point in relation to Amendments 28 and 32, and why the Government continue to put such emphasis on size.

More broadly, in these understandable attempts to protect children, we are in danger of using language that is far too loose and of having an effect on adult access to the internet which is not being considered in the debate—or at least has not been until I have, however unwelcomely, raised it.

Lord Vaizey of Didcot (Con)

My Lords, I assure your Lordships that I rise to speak very briefly. I begin by reassuring my noble friend Lord Moylan that he is loved in this Chamber and outside. I was going to say that he is the grit in the oyster that ensures that a consensus does not establish itself and that we think hard about these amendments, but I will revise that and say he is now the bungee jumper in our ravine. I think he often makes excellent and worthwhile points about the scope and reach of the Bill and the unintended consequences. Indeed, we debated those when we debated the amendments relating to Wikipedia, for example.

Obviously, I support these amendments in principle. The other reason I wanted to speak was to wish the noble Baroness, Lady Kidron—Beeban—a happy birthday, because I know that these speeches will be recorded on parchment bound in vellum and presented to her, but also to thank her for all the work that she has done for many years now on the protection of children’s rights on the internet. It occurred to me, as my noble friend Lady Harding was speaking, that there were a number of points I wanted to seek clarity on, either from the Minister or from the proponents of the amendments.

First, the noble Baroness, Lady Harding, mentioned the age-appropriate design code, which was a victory for the noble Baroness, Lady Kidron. It has, I think, already had an impact on the way that some sites that are frequented by children are designed. I know, for instance, that TikTok—the noble Baroness will correct me—prides itself on having made some changes as a result of the design code; for example, its algorithms are able, to a certain extent, to detect whether a child is under 13. I know anecdotally that children under 13 sometimes do have their accounts taken away; I think that is a direct result of the amendments made by the age-appropriate design code.

I would like to understand how these amendments, and the issue of children’s rights in this Bill, will interact with the age-appropriate design code, because none of us wants the confetti of regulations that either overlap or, worse, contradict themselves.

Secondly, I support the principle of functionality. I think it is a very important point that these amendments make: the Bill should not be focused solely on content but should take into account that functionality leads to dangerous content. That is an important principle on which platforms should be held to account.

Thirdly, going back to the point about the age-appropriate design code, the design of websites is extremely important and should be part of the regulatory system. Those are the points I wanted to make.

Online Safety Bill

Debate on Amendment 33B resumed.
Lord Moylan (Con)

My Lords, I will speak to Amendment 155 in my name, and I am grateful for the support of the noble Baroness, Lady Fox of Buckley, and my noble friend Lord Strathcarron. Some of my remarks in Committee last week did not go down terribly well with Members and, in retrospect, I realise that that was because I was the only Member of the Committee that day who did not take the opportunity to congratulate the noble Baroness, Lady Kidron, on her birthday. So at this very late stage—a week later—I make good that deficiency and hope that, in doing so, I will get a more jocular and welcoming hearing than I did last week. I will speak in a similar vein, though on a different topic and part of the Bill.

This amendment relates to Clause 65, which has 12 subsections. I regard the first subsection as relatively uncontroversial; it imposes a duty on all service providers. The effect of this amendment would be to remove all the remaining subsections, which fall particularly on category 1 providers. What Clause 65 does, in brief, is to make it a statutory obligation for category 1 providers to live up to their terms of service. Although it does not seek to specify what the terms of service must be, it does, in some ways, specify how they should be operated once they have been written—I regard that as very odd, and will come back to the reason why.

I say at the outset that I understand the motivation behind this section of the Bill. It addresses the understandable feeling that if a service provider of any sort says that they have terms of service which mean that, should there be complaints, they will be dealt with in a certain way and to a certain timetable and that you will get a response by a certain time, or if they say that they will remove certain material, that they should do what they say they will do in the terms of service. I understand what the clause is trying to do —to oblige service providers to live up to their terms of service—but this is a very dangerous approach.

First of all, while terms of service are a civil contract between the provider and the user, they are not an equal contract, as we all know. They are written for the commercial benefit and advantage of the companies that write them—not just in the internet world; this is generally true—and they are written on a take it or leave it basis. Of course, they cannot be egregiously disadvantageous to the customer or else the customer would not sign up to them; none the less, they are drafted with the commercial and legal advantage of the companies in question. Terms of service can be extreme. Noble Lords may be aware that, if you have a bank account, the terms of service that your bank has, in effect, imposed on you almost certainly include a right for the bank to close your account at any time it wishes and to give no reason for doing so. I regard that as an extreme terms of service provision, but it is common. They are not written as equal contracts between consumers and service providers.

Why, therefore, would we want to set terms of service in statute? That is what this clause does: to make them enforceable by a regulator under statute. Moreover, why would we want to do it when the providers we are discussing will have, in practice, almost certainly drafted their terms of service under the provisions of a foreign legal system, which we are then asking our regulator to ensure is enforced? My objection is not to try to find a way of requiring providers to live up to the terms of service they publish—indeed, the normal process for doing so would be through a civil claim; instead, I object to the method of doing so set out in this section of the Bill.

We do not use this method with other terms of service features. For example, we do not have a regulator who enforces terms of service on data protection; we have a law that says what companies must do to protect data, and then we expect them to draft terms of service, and to conduct themselves in other ways, that are compatible with that law. We do not make the terms of services themselves enforceable through statute and regulation, yet that is what this Bill does.

When we look at the terms of service of the big providers on the internet—the sorts of people we have in mind for the scope of the Bill—we find that they give themselves, in their terms of service, vast powers to remove a wide range of material. Much of that would fall—I say this without wanting to be controversial—into the category of “legal but harmful”, which in some ways this clause is reviving through the back door.

Of course, what could be “harmful” is extremely wide, because it will have no statutory bounds: it will be whatever Twitter or Google say they will remove in their terms of service. We have no control over what they say in their terms of service; we do not purport to seek such control in the Bill or in this clause. Twitter policy, for example, is to take down material that offends protected characteristics such as “gender” and “gender identity”. Now, those are not protected characteristics in the UK; the relevant protected characteristics in the Equality Act are “sex” and “gender reassignment”. So this is not enforcing our law; our regulator will be enforcing a foreign law, even though it is not the law we have chosen to adopt here.

--- Later in debate ---
Lord Moylan (Con)

My Lords, my noble friend has explained clearly how terms of service would normally work, which is that, as I said myself, a business might write its own terms of service to its own advantage but it cannot do so too egregiously or it will lose customers, and businesses may aim themselves at different customers. All this is part of normal commercial life, and that is understood. What my noble friend has not really addressed is the question of why uniquely and specifically in this case, especially given the egregious history of censorship by Silicon Valley, he has chosen to put that into statute rather than leave it as a commercial arrangement, and to make it enforceable by Ofcom. For example, when my right honourable friend David Davis was removed from YouTube for his remarks about Covid passes, it would have been Ofcom’s obligation not to vindicate his right to free speech but to cheer on YouTube and say how well it had done for its terms of service.

Lord Parkinson of Whitley Bay (Con)

Our right honourable friend’s content was reuploaded. This makes the point that the problem at the moment is the opacity of these terms and conditions; what platforms say they do and what they do does not always align. The Bill makes sure that users can hold them to account for the terms of service that they publish, so that people can know what to expect on platforms and have some form of redress when their experience does not match their expectations.

I was coming on to say a bit more about that after making some points about foreign jurisdictions and my noble friend’s Amendment 155. As I say, parts or versions of the service that are used in foreign jurisdictions but not in the UK are not covered by the duties in Clause 65. As such, the Bill does not require a provider to have systems and processes designed to enforce any terms of service not applicable in the UK.

In addition, the duties do not give powers to Ofcom to enforce a provider’s terms of service directly. Ofcom’s role will be focused on ensuring that platforms have systems and processes in place to enforce their own terms of service consistently rather than assessing individual pieces of content.

Requiring providers to set terms of service for specific types of content suggests that the Government view that type of content as harmful or risky. That would encourage providers to prohibit such content, which of course would have a negative impact on freedom of expression, which I am sure is not what my noble friend wants to see. Freedom of expression is essential to a democratic society. Throughout the passage of the Bill, the Government have always committed to ensuring that people can speak freely online. We are not in the business of indirectly telling companies what legal content they can and cannot allow online. Instead, the approach that we have taken will ensure that platforms are transparent and accountable to their users about what they will and will not allow on their services.

Clause 65 recognises that companies, as private entities, have the right to remove content that is legal from their services if they choose to do so. To prevent them doing so, by requiring them to balance this against other priorities, would have perverse consequences for their freedom of action and expression. It is right that people should know what to expect on platforms and that they are able to hold platforms to account when that does not happen. On that basis, I invite the noble Lords who have amendments in this group not to press them.

--- Later in debate ---
Moved by
38: Clause 12, page 12, line 24, leave out subsection (6)
Member’s explanatory statement
This amendment, along with the other amendment to Clause 12 in the name of Lord Moylan, removes requirements on sites to display, on demand, only the parts of a conversation (or in the case of collaboratively-edited content, only the parts of a paragraph, sentence or article) that were written by “verified” users, and to prevent other users from amending (e.g. improving), or otherwise interacting with, such contributions.
Lord Moylan (Con)

My Lords, I am going to endeavour to be relatively brief. I rise to move Amendment 38 and to speak to Amendments 39, 139 and 140 in this group, which are in my name. All are supported by my noble friend Lord Vaizey of Didcot, to whom I am grateful.

Amendments 38 and 39 relate to Clause 12. They remove subsections (6) and (7) from the Bill; that is, the duty to filter out non-verified users. Noble Lords will understand that this is different from the debate we have just had, which was about content. This is about users and verification of the users, rather than the harm or otherwise of the content. I am sure I did not need to say that, but perhaps it helps to clarify my own thinking to do so. Amendments 139 and 140 are essentially consequential but make it clear that my amendments do not prohibit category 1 services from offering this facility. They make it a choice, not a duty.

I want to make one point only in relation to these amendments. It has been well said elsewhere that this is a Twitter-shaped Bill, but it is trying to apply itself to a much broader part of the internet than Twitter, or things like it. In particular, community-led services like Wikipedia, to which I have made reference before, operate on a totally different basis. The Bill seeks to create a facility whereby members of the public like you and me can, first, say that we want the provider to offer a facility for verifying those who might use their service and, secondly, say that we want to see material from only those verified accounts. However, the contributors to Wikipedia are not verified, because Wikipedia has no system to verify them, and therefore it would be impossible for Wikipedia, as a category 1 service, to comply with this condition on its current model, which is a non-commercial, non-profit one, as noble Lords know from previous comments. It would not be able to operate this clause; it would have to say that either it is going to require every contributing editor to Wikipedia to be verified first in order to do so, which would be extremely onerous; or it would have to make it optional, which would be difficult, but lead to the bizarre conclusion that you could open an article on Wikipedia and find that some of its words or sentences were blocked, and you could not read them because those amendments to the article had been made by someone who had not been verified. Of course, putting a system in place to allow that absurd outcome would itself be an impossible burden on Wikipedia.

My complaint—as always, in a sense—about the Bill is that it misfires. Every time you touch it, it misfires in some way because it has not been properly thought through. It is perhaps trying to do too much across too broad a front, when it is clear that the concern of the Committee is much narrower than trying to bowdlerise Wikipedia articles. That is not the objective of anybody here, but it is what the Bill is tending to do.

I will conclude by saying—I invite my noble friend to comment on this if he wishes; I think he will have to comment on it at some stage—that in reply to an earlier Committee debate, I heard him say somewhat tentatively that he did not think that Wikipedia would qualify as a category 1 service. I am not an advocate for Wikipedia; I am just a user. But we need to know what the Government’s view is on the question of Wikipedia and services like it. Wikipedia is the only community-led service, I think, of such a scale that it would potentially qualify as category 1 because of its size and reach.

If the Minister’s view is that Wikipedia would not qualify as a category 1 service—in which case, my amendments are irrelevant because it would not be caught by this clause—then he needs to say so. More than that, he needs to say on what basis it would not qualify as a category 1 service. Would it be on the face of the Bill? If not, would it be in the directions given by the Secretary of State to the regulator? Would it be a question of the regulator deciding whether it was a category 1 service? Obviously, if you are trying to run an operation such as Wikipedia with a future, you need to know which of those things it is. Do you have legal security against being determined as a category 1 provider or is it merely at the whim—that is not the right word; the decision—of the regulator in circumstances that may legitimately change? The regulator may have a good or bad reason for changing that determination later. You cannot run a business not knowing these things.

I put it to noble Lords that this clause needs very careful thinking through. If it is to apply to community-led services such as Wikipedia, it is an absurdity. If it is not to apply to them because what I think I heard my noble friend say pertains and they are not, in his view, a category 1 service, why are they not a category 1 service? What security do they have in knowing either way? I beg to move.

Baroness Buscombe (Con)

My Lords, I will speak to Amendment 106 in my name and the names of my noble and learned friend Lord Garnier and the noble Lord, Lord Moore of Etchingham. This is one of five amendments focused on the need to address the issue of activist-motivated online bullying and harassment and thereby better safeguard the mental health and general well-being of potential victims.

Schedule 4, which defines Ofcom’s objectives in setting out codes of practice for regulated user-to-user services, should be extended to require the regulator to consider the protection of individuals from communications offences committed by anonymous users. The Government clearly recognise that there is a threat of abuse from anonymous accounts and have taken steps in the Bill to address that, but we are concerned that their approach is insufficient and may be counterproductive.

I will explain. The Government’s approach is to require large social media platforms to make provision for users to have their identity verified, and to have the option of turning off the ability to see content shared by accounts whose owners have not done this. However, all this would mean is that people could not see abuse being levelled at them. It would not stop the abuse happening. Crucially, it would not stop other people seeing it, or the damage to his or her reputation or business that the victim may suffer as a result. If I am a victim of online bullying and harassment, I do not want to see it, but I do not want it to be happening at all. The only means I have of stopping it is to report it to the platform and then hope that it takes the right action. Worse still, if I have turned off the ability to see content posted by unverified—that is, anonymous—accounts, I will not be able to complain to the platform as I will not have seen it. It is only when my business goes bust or I am shunned in the street that I realise that something is wrong.

The approach of the Bill seems to be that, for the innocent victim—who may, for example, breed livestock for consumption—it is up to that breeder to be proactive to correct harm already done by someone who does not approve of eating meat. This is making a nonsense of the law. This is not how we make laws in this country—until now, it seems. Practically speaking, the worst that is likely to happen is that the platform might ban their account. However, if their victims have had no opportunity to read the abuse or report it, even that fairly low-impact sanction could not be levelled against them. In short, the Bill’s current approach, I am sorry to say, would increase the sense of impunity, not lessen it.

One could argue that, if a potential abuser believes that their victim will not read their abuse, they will not bother issuing it. Unfortunately, this misunderstands the psyche of the online troll. Many of them are content to howl into the void, satisfied that other people who have not turned on the option to filter out content from unverified accounts will still be able to read it. The troll’s objective of harming the victim may be partially fulfilled as a result.

There is also the question of how much uptake there will be of the option to verify one’s identity, and numerous questions about the factors that this will depend on. Will it be attractive? Will there be a cost? How quick and efficient will the process be? Will platforms have the capacity to implement it at scale? Will it have to be done separately for every platform?

If uptake of verification is low, most people simply will not use the option to filter content of unverified accounts, even if it means that they remain more susceptible to abuse, since they would be cutting themselves off from most of their users. Clearly, that is not an option for anyone using social media for any promotional purpose. Even those who use it for purely social reasons will find that they have friends who do not want to be verified. Fundamentally, people use social media because other people use it. Carving oneself off from most of them defeats the purpose of the exercise.

It is not clear what specific measures the Bill could take to address the issue. Conceivably, it could simply ban online platforms from maintaining user accounts whose owners have not had their identities verified. However, this would be truly draconian and most likely lead to major platforms exiting the UK market, as the noble Baroness, Lady Fox, has rightly argued in respect of other possible measures. It would also be unenforceable, since users could simply turn on a VPN, pretend to be from some other country where the rules do not apply and register an account as though they were in that country.

There are numerous underlying issues that the Bill recognises as problems but does not attempt to prescribe solutions for. Its general approach is to delegate responsibility to Ofcom to frame its codes of practice for operators to follow in order to effectively tackle these problems. Specifically, it sets out a list of objectives that Ofcom, in drawing up its codes of practice, will be expected to meet. The protection of users from abuse, specifically by unverified or anonymous users, would seem to be an ideal candidate for inclusion in this list of objectives. If required to do so, Ofcom could study the issue closely and develop more effective solutions over time.

I was pleased to see, in last week’s Telegraph, an article that gave an all too common example: a chef running a pub in Cornwall has suffered what amounts to vicious abuse online from a vegan who obviously does not approve of the menu, and who is damaging the business’s reputation and putting the chef’s livelihood at risk. This is just one tiny example, if I can put it that way, of the many thousands that are happening all the time. Some 584 readers left comments, and just about everyone wrote in support of the need to do something to support that chef and tackle this vicious abuse.

I return to a point I made in a previous debate: livelihoods, which we are deeply concerned about, are at stake here. I am talking not about big business but about individuals and small and family businesses that are suffering—beyond abuse—loss of livelihood, financial harm and/or reputational damage to business, and the knock-on effects of that.

Online Safety Bill

Lord Parkinson of Whitley Bay (Con)

My Lords, the range of the amendments in this group indicates the importance of the Government’s approach to user verification and non-verified user duties. The way these duties have been designed seeks to strike a careful balance between empowering adults while safeguarding privacy and anonymity.

Amendments 38, 39, 139 and 140 have been tabled by my noble friend Lord Moylan. Amendments 38 and 39 seek to remove subsections (6) and (7) of the non-verified users’ duties. These place a duty on category 1 platforms to give adult users the option of preventing non-verified users interacting with their content, reducing the likelihood that a user sees content from non-verified users. I want to be clear that these duties do not require the removal of legal content from a service and do not impinge on free speech.

In addition, there are already existing duties in the Bill to safeguard legitimate online debate. For example, category 1 services will be required to assess the impact on free expression of their safety policies, including the impact of their user empowerment tools. Removing subsections (6) and (7) of Clause 12 would undermine the Bill’s protection for adult users of category 1 services, especially the most vulnerable. It would be entirely at the service provider’s discretion to offer users the ability to minimise their exposure to anonymous and abusive users, sometimes known as trolls. In addition, instead of mandating that users verify their identity, the Bill gives adults the choice. On that basis, I am confident that the Bill already achieves the effect of Amendment 139.

Amendment 140 seeks to reduce the amount of personal data transacted as part of the verification process. Under subsection (3) of Clause 57, however, providers will be required to explain in their terms of service how the verification process works, empowering users to make an informed choice about whether they wish to verify their identity. In addition, the Bill does not alter the UK’s existing data protection laws, which provide people with specific rights and protections in relation to the processing of their personal data. Ofcom’s guidance in this area will reflect existing laws, ensuring that users’ data is protected where personal data is processed. I hope my noble friend will therefore be reassured that these duties reaffirm the concept of choice and uphold the importance of protecting personal data.

While I am speaking to the questions raised by my noble friend, I turn to those he asked about Wikipedia. I have nothing further to add to the comments I made previously, not least that it is impossible to pre-empt the assessments that will be made of which services fall into which category. Of course, assessments will be made at the time, based on what the services do at the time of the assessment, so if he will forgive me, I will not be drawn on particular services.

To speak in more general terms, category 1 services are those with the largest reach and the greatest influence over public discourse. The Bill sets out a clear process for determining category 1 providers, based on thresholds set by the Secretary of State in secondary legislation following advice from Ofcom. That is to ensure that the process is objective and evidence based. To deliver this advice, Ofcom will undertake research into the relationship between how quickly, easily and widely user-generated content is disseminated by that service, the number of users and functionalities it has and other relevant characteristics and factors.

Lord Moylan (Con)

Will my noble friend at least confirm what he said previously: namely, that it is the Government’s view—or at least his view—that Wikipedia will not qualify as a category 1 service? Those were the words I heard him use at the Dispatch Box.

Lord Parkinson of Whitley Bay (Con)

That is my view, on the current state of play, but I cannot pre-empt an assessment made at a point in the future, particularly if services change. I stand by what I said previously, but I hope my noble friend will understand if I do not elaborate further on this, at the risk of undermining the reassurance I might have given him previously.

Amendments 40, 41, 141 and 303 have been tabled by the noble Lord, Lord Stevenson of Balmacara, and, as noble Lords have noted, I have added my name to Amendment 40. I am pleased to say that the Government are content to accept it. The noble Baroness, Lady Merron, should not minimise this, because it involves splitting an infinitive, which I am loath to do. If this is a statement of intent, I have let that one go, in the spirit of consensus. Amendment 40 amends Clause 12(7) to ensure that the tools which will allow adult users to filter out content from non-verified users are effective, and I am pleased to add my name to it.

Amendment 41 seeks to make it so that users can see whether another user is verified or not. I am afraid we are not minded to accept it. While I appreciate the intent, forcing users to show whether they are verified or not may have unintended consequences for those who are unable to verify themselves for perfectly legitimate reasons. This risks creating a two-tier system online. Users will still be able to set a preference to reduce their interaction with non-verified users without making this change.

Amendment 141 seeks to prescribe a set of principles and standards in Ofcom’s guidance on user verification. It is, however, important that Ofcom has discretion to determine, in consultation with relevant persons, which principles will have the best outcomes for users, while ensuring compliance with the duties. Further areas of the Bill also address several issues raised in this amendment. For example, all companies in scope will have a specific legal duty to have effective user reporting and redress mechanisms.

Existing laws also ensure that Ofcom’s guidance will reflect high standards. For example, it is a general duty of Ofcom under Section 3 of the Communications Act 2003 to further the interests of consumers, including by promoting competition. This amendment would, in parts, duplicate existing duties and undermine Ofcom’s independence to set standards on areas it deems relevant after consultation with expert groups.

Amendment 303 would add a definition of user identity verification. The definition it proposes would result in users having to display their real name online if they decide to verify themselves. In answer to the noble Baroness’s question, the current requirements do not specify that users must display their real name. The amendment would have potential safety implications for vulnerable users, for example victims and survivors of domestic abuse, whistleblowers and others of whom noble Lords have given examples in their contributions. The proposed definition would also create reliance on official forms of identification. That would be contrary to the existing approach in Clause 57 which specifically sets out that verification need not require such forms of documentation.

The noble Baroness, Lady Kidron, talked about paid-for verification schemes. The user identity verification provisions were brought in to ensure that adult users of the largest services can verify their identity if they so wish. These provisions are different from the blue tick schemes and others currently in place, which focus on a user’s status rather than verifying their identity. Clause 57 specifically sets out that providers of category 1 services will be required to offer all adult users the option to verify their identity. Ofcom will provide guidance for user identity verification to assist providers in complying with these duties. In doing so, it will consult groups that represent the interests of vulnerable adult users. In setting out recommendations about user verification, Ofcom must have particular regard to ensuring that providers of category 1 services offer users a form of identity verification that is likely to be available to vulnerable adult users. Ofcom will also be subject to the public sector equality duty, so it will need to take into account the ways in which people with certain characteristics may be affected when it performs this and all its duties under the Bill.

A narrow definition of identity verification could limit the range of measures that service providers might offer their users in the future. Under the current approach, Ofcom will produce and publish guidance on identity verification after consulting those with technical expertise and groups which represent the interests of vulnerable adult users.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

Yes. The blue tick is certainly not identity verification. I will write to confirm on Meta, but they are separate and, as the example of blue ticks and Twitter shows, a changing feast. That is why I am talking in general terms about the approach, so as not to rely too much on examples that are changing even in the course of this Committee.

Government Amendment 43A stands in my name. This clarifies that “non-verified user” refers to users whether they are based in the UK or elsewhere. This ensures that, if a UK user decides he or she no longer wishes to interact with non-verified users, this will apply regardless of where they are based.

Finally, Amendment 106 in the name of my noble friend Lady Buscombe would make an addition to the online safety objectives for regulated user-to-user services. It would amend them to make it clear that one of the Bill’s objectives is to protect people from communications offences committed by anonymous users.

The Bill already imposes duties on services to tackle illegal content. Those duties apply across all areas of a service, including the way it is designed and operated. Platforms will be required to take measures—for instance, changing the design of functionalities, algorithms, and other features such as anonymity—to tackle illegal content.

Ofcom is also required to ensure that user-to-user services are designed and operated to protect people from harm, including with regard to functionalities and other features relating to the operation of their service. This will likely include the use of anonymous accounts to commit offences in the scope of the Bill. My noble friend’s amendment is therefore not needed. I hope she will be satisfied not to press it, along with the other noble Lords who have amendments in this group.

Lord Moylan (Con)

My Lords, I would like to say that that was a rewarding and fulfilling debate in which everyone heard very much what they wanted to hear from my noble friend the Minister. I am afraid I cannot say that. I think it has been one of the most frustrating debates I have been involved in since I came into your Lordships’ House. However, it gave us an opportunity to admire the loftiness of manner that the noble Lord, Lord Clement-Jones, brought to dismissing my concerns about Wikipedia—that I was really just overreading the whole thing and that I should not be too bothered with words as they appear in the Bill because the noble Lord thinks that Wikipedia is rather a good thing and why is it not happy with that as a level of assurance?

I would like to think that the Minister had dealt with the matter in the way that I hoped he would, but I do think, if I may say so, that it is vaguely irresponsible to come to the Dispatch Box and say, “I don’t think Wikipedia will qualify as a category 1 service”, and then refuse to say whether it will or will not and take refuge in the process the Bill sets up, when at least one Member of the House of Lords, and possibly a second in the shape of the noble Lord, Lord Clement-Jones, would like to know the answer to the question. I see a Minister from the business department sitting on the Front Bench with my noble friend. This is a bit like throwing a hand grenade into a business headquarters, walking away and saying, “It was nothing to do with me”. You have to imagine what the position is like for the business.

We had a very important amendment from my noble friend Lady Buscombe. I think we all sympathise with the type of abuse that she is talking about—not only its personal effects but its deliberate business effects, the deliberate attempt to destroy businesses. I say only that my reading of her Amendment 106 is that it seeks to impose on Ofcom an objective to prevent harm, essentially, arising from offences under Clauses 160 and 162 of the Bill committed by unverified or anonymous users. Surely what she would want to say is that, irrespective of verification and anonymity, one would want action taken against this sort of deliberate attempt to undermine and destroy businesses. While I have every sympathy with her amendment, I am not entirely sure that it relates to the question of anonymity and verification.

Apart from that, there were in a sense two debates going on in parallel in our deliberations. One was to do with anonymity. On that question, I think the noble Lord, Lord Clement-Jones, put the matter very well: in the end, you have to come down on one side or the other. My personal view, with some reluctance, is that I have come down on the same side as the Government, the noble Lord and others. I think we should not ban anonymity because there are costs and risks to doing so, however satisfying it would be to be able to expose and sue some of the people who say terrible and untrue things about one another on social media.

The more important debate was not about anonymity as such but about verification. We had the following questions, which I am afraid I do not think were satisfactorily answered. What is verification? What does it mean? Can we define what verification is? Is it too expensive? Implicitly, should it be available for free? Is there an obligation for it to be free or do the paid-for services count, and what happens if they are so expensive that one cannot reasonably afford them? Is it real, in the sense that the verification processes devised by the various platforms genuinely provide verification? Various other questions like that came up but I do not think that any of them was answered.

I hate to say this as it sounds a little harsh about a Government whom I so ardently support, but the truth is that the triple shield, also referred to as a three-legged stool in our debate, was hastily cobbled together to make up for the absence of legal but harmful, but it is wonky; it is not working, it is full of holes and it is not fit for purpose. Whatever the Minister says today, there has to be a rethink before he comes back to discuss these matters at the next stage of the Bill. In the meantime, I beg leave to withdraw my amendment.

Amendment 38 withdrawn.
--- Later in debate ---
Lord Moylan (Con)

My Lords, I hung back in the hope that the noble and learned Lord, Lord Hope of Craighead, would speak before me, because I suspected that his remarks would help elucidate my amendments, as I believe they have. I have a large number of amendments in this group, but all of them, with one exception, work together as, effectively, a single amendment. They are Amendments 101, 102, 109, 112, 116, 121, 191 and 220. The exception is Amendment 294, to which the noble Baroness, Lady Fox of Buckley, alluded and to which I shall return in a moment.

Taking that larger group of amendments first, I can describe their effect relatively briefly. In the Bill, there are requirements on services to consider how their practices affect freedom of expression, but there is no equivalent explicit duty on the regulator, Ofcom, to have regard to freedom of expression.

These amendments, taken together, would require Ofcom to

“have special regard to freedom of expression”

within the law when designing codes of practice, writing guidance and undertaking enforcement action. They would insert a new clause requiring Ofcom to have special regard to rights to freedom of expression within the law in preparing a code of practice; they would also require Ofcom, when submitting a draft code to the Secretary of State, to submit a statement setting out how it had complied with the duty imposed by that new requirement; and they would require the Secretary of State to submit that statement to Parliament when laying a draft code before Parliament. They would impose similar obligations on Ofcom and the Secretary of State in respect of any later amendments to codes. Finally, they would have a similar effect relating to guidance issued by Ofcom.

It is so glaringly obvious that Ofcom should be under this duty that it can only be an omission that the balancing duty placed on the providers has no counterpart for the regulator. I would hope, though experience so far in Committee does not lead me to expect it, that my noble friend would accept this, and that it would pass relatively uncontroversially.

Senior clinicians including Sir Jeremy Farrar, Professor John Bell and the noble Lord, Lord Darzi, have written to the Secretary of State to raise their concerns. These are serious players voicing serious concerns. The approach in Amendment 52 is, in my view, the best and most proportionate way to protect those who are most vulnerable to false and misleading information.
Lord Moylan (Con)

My Lords, I shall speak to Amendments 59, 107 and 264 in this group, all of which are in my name. Like the noble Baroness, Lady Merron, I express gratitude to Full Fact for its advice and support in preparing them.

My noble friend Lord Bethell has just reminded us of the very large degree of discretion that is given to platforms by the legislation in how they respond to information that we might all agree, or might not agree, is harmful, misinformation or disinformation. We all agree that those categories exist. We might disagree about what falls into them, but we all agree that the categories exist, and the discretion given to the providers in how to handle it is large. My amendments do not deal specifically with health-related misinformation or disinformation but are broader.

The first two, Amendments 59 and 107—I am grateful to my noble friend Lord Strathcarron for his support of Amendment 59—try to probe what the Government think platforms should do when harmful material, misinformation and disinformation appear on their platforms. As things stand, the Government require that the platforms should decide what content is not allowed on their platforms; then they should display this in their terms of service; and they should apply a consistent approach in how they manage content that is in breach of their terms of service. The only requirement is for consistency. I have no objection to their being required to behave consistently, but that is the principal requirement.

What Amendments 59 and 107 do—they have similar effects in different parts of the Bill; one directly on the platforms; the other in relation to codes of practice—is require them also to act proportionately. Here, it might be worth articulating briefly the fact that there are two views about platforms and how they respond, both legitimate. One is that some noble Lords may fear that platforms will not respond at all: in other words, they will leave harmful material on their site and will not properly respond.

The other fear, which is what I want to emphasise, is that platforms will be overzealous in removing material, because they will have written their terms of service, as I said on a previous day in Committee, not only for their commercial advantage but also for their legal advantage. They will have wanted to give themselves a wide latitude to remove material, or to close accounts, because that will help cover their backs legally. Of course, once they have granted themselves those powers, the fear is that they will use them overzealously, even in cases where that would be an overreaction. These two amendments seek to oblige the platforms to respond proportionately, to consider alternative approaches to cancellation and removal of accounts and to be obliged to look at those as well.

There are alternative approaches that they could consider. Some companies already set out to promote good information, if you like, and indeed we saw that in the Covid-19 pandemic. My noble friend Lord Bethell said that they did so, and they did so voluntarily. This amendment would not explicitly but implicitly encourage that sort of behaviour as a first resort, rather than cancellation, blocking and removal of material as a first resort. They would still have the powers to cancel, block and remove; it is a question of priority and proportionality.

There are also labels that providers can put on material that they think is dubious, saying, “Be careful before you read this”, or before you retweet it; “This is dubious material”. Those practices should also be encouraged. These amendments are intended to do that, but they are intended, first and foremost, to probe what the Government’s attitude is to this, whether they believe they have any role in giving guidance on this point and how they are going to do so, whether through legislation or in some other way, because many of us would like to know.

Amendment 264, supported by my noble friend Lord Strathcarron and the noble Lord, Lord Clement-Jones, deals with quite a different matter, although it falls under the general category of misinformation and disinformation: the role the Government take directly in seeking to correct misinformation and disinformation on the internet. We know that No. 10 has a unit with this explicit purpose and that during the Covid pandemic it deployed military resources to assist it in doing so. Nothing in this amendment would prevent that continuing; nothing in it is intended to create scare stories in people’s minds about an overweening Government manipulating us. It is intended to bring transparency to that process.

--- Later in debate ---
Baroness Fox of Buckley (Non-Afl)

The Minister mentioned “acute” examples of misinformation and used the example of the pandemic. I tried to illustrate that perhaps, with hindsight, what were seen as acute examples of misinformation turned out to be rather more accurate than we were led to believe at the time. So my concern is that there is already an atmosphere of scepticism about official opinion, which is not the same as misinformation, as it is sometimes presented. I used the American example of the Hunter Biden laptop so we could take a step away.

Lord Moylan (Con)

This might be an appropriate moment for me to say—on the back of that—that, although my noble friend explained current government practice, he has not addressed my point on why there should not be an annual report to Parliament that describes what government has done on these various fronts. If the Government regularly meet newspaper publishers to discuss the quality of information in their newspapers, I for one would have entire confidence that the Government were doing so in the public interest, but I would still quite like—I think the Government would agree on this—a report on what was happening, making an exception for national security. That would still be a good thing to do. Will my noble friend explain why we cannot be told?

Lord Parkinson of Whitley Bay (Con)

While I am happy to elaborate on the work of the counter-disinformation unit in the way I just have, the Government cannot share operational details about its work, as that would give malign actors insight into the scope and scale of our capabilities. As my noble friend notes, this is not in the public interest. Moreover, reporting representations made to platforms by the unit would also be unnecessary as this would overlook both the existing processes that govern engagements with external parties and the new protections that are introduced through the Bill.

In the first intervention, the noble Baroness, Lady Fox, gave a number of examples, some of which are debatable, contestable facts. Companies may well choose to keep them on their platforms within their terms of service. We have also seen deliberate misinformation and disinformation during the pandemic, including from foreign actors promoting more harmful disinformation. It is right that we take action against this.

I hope that I have given noble Lords some reassurance on the points raised about the amendments in this group. I invite them not to press the amendments.

Online Safety Bill

We have a very simple remedy here, which goes with the grain of British fair play, the need for justice to be done and a Government who care for the people they govern, look after and make sure do not fall victim unwittingly and unknowingly—unknowingly in the sense that they do not know who is trying to hurt them, but they know what has happened to them because their profits, turnover and ability to feed their families have been grossly affected by these malicious, dishonest people. This amendment needs careful consideration and deserves wholehearted support across the House.
Lord Moylan (Con)

My Lords, as the noble Lord, Lord Clement-Jones, said, this is a very broad group, so I hope noble Lords will forgive me if I do not comment on every amendment in it. However, I have a great deal of sympathy for the case put forward by my noble friend Lady Buscombe and my noble and learned friend Lord Garnier. The addition of the word “financial” to Clause 160 is not only merited on the case made but is a practical and feasible thing to do in a way that the current inclusion of the phrase “non-trivial psychological” is not. After all, a financial loss can be measured and we know how it stands. I will also say that I have a great deal of sympathy with what the noble Lord, Lord Clement-Jones, said about his amendment. In so far as I understand them—I appreciate that they have not yet been spoken to—I am also sympathetic to the amendments in the names of the noble Baroness, Lady Kennedy of The Shaws, and the noble Lord, Lord Allan of Hallam.

I turn to my Amendment 265, which removes the word “psychological” from this clause. We have debated this already, in relation to other amendments, so I am going to be fairly brief about it. Probably through an oversight of mine, this amendment has wandered into the wrong group. I am going to say simply that it is still a very, very good idea and I hope that my noble friend, when he comes to reflect on your Lordships’ Committee as a whole, will take that into account and respond appropriately. Instead, I am going to focus my remarks on the two notices I have given about whether Clauses 160 and 161 should stand part of the Bill; Clause 161 is merely consequential on Clause 160, so the meat is whether Clause 160 should stand part of the Bill.

I was a curious child, and when I was learning the Ten Commandments—I am sorry to see the right reverend Prelate has left because I hoped to impress him with this—I was very curious as to why they were all sins, but some of them were crimes and others were not. I could not quite work out why this was; murder is a crime but lying is not a crime—and I am not sure that at that stage I understood what adultery was. In fact, lying can be a crime, of course, if you undertake deception with intent to defraud, and if you impersonate a policeman, you are lying and committing a crime, as I understand it—there are better-qualified noble Lords than me to comment on that. However, lying in general has never been a crime, until we get to this Bill, because for the first time this Bill makes lying in general—that is, the making of statements you know to be false—a crime. Admittedly, it is a crime dependent on the mode of transmission: it has to be online. It will not be a crime if I simply tell a lie to my noble and learned friend Lord Garnier, for example, but if I do it online, any form of statement which is not true, and I know not to be true, becomes a criminal act. This is really unprecedented and has a potentially chilling effect on free speech. It certainly seems to be right that, in your Lordships’ Committee, the Government should be called to explain what they think they are doing, because this is a very portentous matter.

The Bill states that a person commits the false communications offence if they send a message that they know to be false, if they intend the message to cause a degree of harm of a non-trivial psychological or physical character, and if they have no reasonable excuse for sending the message. Free speech requires that one should be allowed to make false statements, so this needs to be justified. The wording of the offence raises substantial practical issues. How is a court meant to judge what a person knows to be false? How is a committee of the House of Commons meant to judge, uncontroversially, what a person knows to be false at the time they say it? I say again: what is non-trivial psychological harm and what constitutes an excuse? None of these things is actually defined; please do not tell me they are going to be defined by Ofcom—I would not like to hear that. This can lead to astonishing inconsistency in the courts and the misapplication of criminal penalties against people who are expressing views as they might well be entitled to do.

Then there is the question of the audience, because the likely audience is not just the person to whom the false statement is directed but could be anybody who subsequently encounters the message. How on earth is one going to have any control over how that message travels through the byways and highways of the online world and be able to say that one had some sense of who it was going to reach and what non-trivial psychological harm it might cause when it reached them?

We are talking about this as if this criminal matter is going to be dealt with by the courts. What makes this whole clause even more disturbing is that in the vast majority of cases, these offences will never reach the courts, because there is going to be, inevitably, an interaction with the illegal content duties in the Bill. By definition, these statements will be illegal content, and the platforms have obligations under the Bill to remove and take down illegal content when they become aware of it. So the platform is going to have to make some sort of decision about not only the truth of the statement but whether the person knows the statement to be false and what their intention is. Under the existing definition of illegal content, they will be required to remove anything they reasonably believe is likely to be false and to prevent it spreading further, because the consequences of it, in terms of the harm it might do, are incalculable by them at that point.

We are placing a huge power of censorship—and mandating it—on to the platforms, which is one of the things that some of us in this Committee have been very keen to resist. Just exploring those few points, I think my noble friend really has to explain what he thinks this clause is doing, how it is operable and what its consequences are going to be for free speech and censorship. As it stands, it seems to me unworkable and dangerous.

Lord Garnier (Con)

Does my noble friend agree with me that our courts are constantly looking into the state of mind of individuals to see whether they are lying? They look at what they have said, what they have done and what they know. They can draw an inference based on the evidence in front of them about whether the person is dishonest. This is the daily bread and butter of court. I appreciate the points he is making but, if I may say so, he needs to dial back slightly his apoplexy. Underlying this is a case to be made in justice to protect the innocent.

Lord Moylan (Con)

I did not say that it would be impossible for a court to do this; I said it was likely to lead to high levels of inconsistency. We are dealing with what is likely to be very specialist cases. You can imagine this in the context of people feeling non-trivially psychologically harmed by statements about gender, climate, veganism, and so forth. These are the things where you see this happening. The idea that there is going to be consistency across the courts in dealing with these issues is, I think, very unlikely. It will indeed have a chilling effect on people being able to express views that may be controversial but are still valid in an open society.

Lord Allan of Hallam (LD)

My Lords, I want to reflect on the comments that the noble Lord, Lord Moylan, has just put to us. I also have two amendments in the group; they are amendments to the government amendment, and I am looking to the Minister to indicate whether it is helpful for me to explain the rationale of my amendments now or to wait until he has introduced his. I will do them collectively.

First, the point the noble Lord, Lord Moylan, raised is really important. We have reached the end of our consideration of the Bill; we have spent a lot of time on a lot of different issues, but we have not spent very much time on these new criminal offences, and there may be other Members of your Lordships’ House who were also present when we discussed the Communications Act back in 2003, when I was a Member at the other end. At that point, we approved something called Section 127, which we were told was essentially a rollover of the dirty phone call legislation we had had previously, which had been in telecoms legislation for ever to prevent that deep-breathing phone call thing.

--- Later in debate ---
Baroness Finlay of Llandaff (CB)

It needs to be addressed, because these very small websites already alluded to are providing some extremely nasty stuff. They are not providing support to people and helping decrease the amount of harm to those self-harming but seem to be enjoying the spectacle of it. We need to differentiate and make sure that we do not inadvertently let one group get away with disseminating very harmful material simply because it has a small website somewhere else. I hope that will be included in the Minister’s letter; I do not expect him to reply now.

Lord Moylan (Con)

Some of us are slightly disappointed that my noble friend did not respond to my point on the interaction of Clause 160 with the illegal content duty. Essentially, what appears to be creating a criminal offence could simply be a channel for hyperactive censorship on the part of the platforms to prevent the criminal offence taking place. He has not explained that interaction. He may say that there is no interaction and that we would not expect the platforms to take any action against offences under Clause 160, or that we expect a large amount of action, but nothing was said.

Lord Parkinson of Whitley Bay (Con)

If my noble friend will forgive me, I had better refresh my memory of what he said—it was some time ago—and follow up in writing.

Online Safety Bill

The Lord Bishop of Manchester

My Lords, I too support the Minister’s Amendment 1. I remember vividly, at the end of Second Reading, the commitments that we heard from both Front-Benchers to work together on this Bill to produce something that was collaborative, not contested. I and my friends on these Benches have been very touched by how that has worked out in practice and grateful for the way in which we have collaborated across the whole House. My plea is that we can use this way of working on other Bills in the future. This has been exemplary and I am very grateful that we have reached this point.

Lord Moylan (Con)

My Lords, I am grateful to my noble friend the Minister for the meeting that he arranged with me and the noble Baroness, Lady Fox of Buckley, on Monday of this week.

Although we are on Report, I will start with just one preliminary remark of a general character. The more closely one looks at this Bill, the clearer it is that it is the instrument of greatest censorship that we have introduced since the liberalisation of the 1960s. This is the measure with the greatest capacity for reintroducing censorship. It is also the greatest assault on privacy. These principles will inform a number of amendments that will be brought forward on Report.

Turning now to the new clause—I have no particular objection to there being an introductory clause—it is notable that it has been agreed by the Front Benches and by the noble Baroness, Lady Kidron, but that it has not been discussed with those noble Lords who have spoken consistently and attended regularly in Committee to speak up in the interests of free speech and privacy. I simply note that as a fact. There has been no discussion about it with those who have made those arguments.

Now, it is true that the new clause does refer to both free speech and privacy, but it sounds to me very much as though these are written almost as add-ons and afterthoughts. We will be testing, as Report stage continues, through a number of amendments, whether that is in fact the case or whether that commitment to free speech and privacy is actually being articulated and vindicated in the Bill.

Lord Clement-Jones (LD)

My Lords, needless to say, I disagree with what the noble Lord, Lord Moylan, has just been saying precisely because I believe that the new clause that the Minister has put forward, which I have signed and has support across the House, expresses the purpose of the Bill in the way that the original Joint Committee wanted. I pay tribute to the Minister, who I know has worked extremely hard, in co-operation with the noble Lord, Lord Stevenson of Balmacara, to whom I also pay tribute for getting to grips with a purpose clause. The noble Baronesses, Lady Kidron and Lady Harding, have put their finger on it: this is more about activity and design than it is about content, and that is the reason I fundamentally disagree with the noble Lord, Lord Moylan. I do not believe that will be the impact of the Bill; I believe that this is about systemic issues to do with social media, which we are tackling.

I say this slightly tongue-in-cheek, but if the Minister had followed the collective wisdom of the Joint Committee originally, perhaps we would not have worked at such breakneck speed to get everything done for Report stage. I believe that the Bill team and the Minister have worked extremely hard in a very few days to get to where we are on many amendments that we will be talking about in the coming days.

I also want to show my support for the noble Baroness, Lady Merron. I do not believe it is just a matter of the Interpretation Act; I believe this is a fundamental issue and I thank her for raising it, because it was not something that was immediately obvious. The fact is that a combination of characteristics is a particular risk in itself; it is not just about having several different characteristics. I hope the Minister reflects on this and can give a positive response. That will set us off on a very good course for the first day of Report.

Online Safety Bill

Lord Allan of Hallam (LD)

My Lords, as we enter the final stages of consideration of this Bill, it is a good time to focus a little more on what is likely to happen once it becomes law, and my Amendment 28 is very much in that context. We now have a very good idea of what the full set of obligations that in-scope services will have to comply with will look like, even if the detailed guidance is still to come.

With this amendment I want to return to the really important question that I do not believe we answered satisfactorily when we debated it in Committee. That is that there is a material risk that, without further amendment or clarification, Wikipedia and other similar services may feel that they can no longer operate in the United Kingdom.

Wikipedia has already featured prominently in our debates, but there are other major services that might find themselves in a similar position. As I was discussing the definitions in the Bill with my children yesterday—this may seem an unusual dinner conversation with teenagers, but I find mine to be a very useful sounding board—they flagged that OpenStreetMap, to which we all contribute, also seems to be in the scope of how we have defined user-to-user services. I shall start by asking some specific questions so that the Minister has time to find the answers in his briefing or have them magically delivered to him before summing up: I shall ask the questions and then go on to make the argument.

First, is it the Government’s view that Wikipedia and OpenStreetMap fall within the definition of user-to-user services as defined in Clause 2 and the content definition in Clause 211? We need to put all these pieces together to understand the scope. I have chosen these services because each is used by millions of people in the UK and their functionality is very well known, so I trust that the Government had them in mind when they were drafting the legislation, as well as the more obvious services such as Instagram, Facebook et cetera.

Secondly, can the Minister confirm whether any of the existing exemptions in the Bill would apply to Wikipedia and OpenStreetMap such that they would not have to comply with the obligations of a category 1 or 2B user-to-user service?

Thirdly, does the Minister believe that the Bill as drafted allows Ofcom to use its discretion in any other way to exempt Wikipedia and OpenStreetMap, for example through the categorisation regulations in Schedule 11? As a spoiler alert, I expect the answers to be “Yes”, “No” and “Maybe”, but it is really important that we have the definitive government response on the record. My amendment would seek to turn that to “Yes”, “Yes” and therefore the third would be unnecessary because we would have created an exemption.

The reason we need to do this is not in any way to detract from the regulation or undermine its intent but to avoid facing the loss of important services at some future date because of situations we could have avoided. This is not hyperbole or a threat on the part of the services; it is a natural consequence if we impose legal requirements on a responsible organisation that wants to comply with the law but knows it cannot meet them. I know it is not an intended outcome of the Bill that we should drive these services out, but it is certainly one intended outcome that we want other services that cannot meet their duties of care to exit the UK market rather than continue to operate here in defiance of the law and the regulator.

We should remind ourselves that at some point, likely to be towards the end of 2024, letters will start to arrive on the virtual doormats of all the services we have defined as being in scope—these 25,000 services—and their senior management will have a choice. I fully expect that the Metas, the Googles and all such providers will say, “Fine, we will comply. Ofcom has told us what we need to do, and we will do it”. There will be another bunch of services that will say, “Ofcom, who are they? I don’t care”, and the letter will go in the bin. We have a whole series of measures in the Bill by which we will start to make life difficult for them: we will disrupt their businesses and seek to prosecute them and we will shut them out of the market.

However, there is a third category, which is the one I am worried about in this amendment, who will say, “We want to comply, we are responsible, but as senior managers of this organisation”, or as directors of a non-profit foundation, “we cannot accept the risk of non-compliance and we do not have the resources to comply. There is no way that we can build an appeals mechanism, user reporting functions and all these things we never thought we would need to have”. If you are Wikipedia or OpenStreetMap, you do not need to have that infrastructure, yet as I read the Bill, if they are in scope and there is no exemption, then they are going to be required to build all that additional infrastructure.

The Bill already recognises that there are certain classes of services where it would be inappropriate to apply this new regulatory regime, and it describes these in Schedule 1, which I am seeking to amend. My amendment just seeks to add a further class of exempted service and it does this quite carefully so that we would exclude only services that I believe most of us in this House would agree should not be in scope. There are three tests that would be applied.

The first is a limited functionality test—we already have something similar in Schedule 1—so that the user-to-user functions are only those that relate to the production of what I would call a public information resource. In other words, users engage with one another to debate a Wikipedia entry or a particular entry on a map on OpenStreetMap. So, there is limited user-to-user functionality all about this public interest resource. They are not user-to-user services in the classic sense of social media; they are a particular kind of collective endeavour. These are much closer to newspaper publishers, which we have explicitly excluded from the Bill. It is much more like a newspaper; it just happens to be created by users collectively, out of good will, rather than by paid professional journalists. They are very close to that definition, but if you read Schedule 1, I do not think the definition of “provider content” in paragraph 4(2) includes at the moment these collective-user endeavours, so they do not currently have the exemption.

I have also proposed that Ofcom would carry out a harm test to avoid the situation where someone argues that their services are a public information resource, while in practice using it to distribute harmful material. That would be a rare case, but noble Lords can conceive of it happening. Ofcom would have the ability to say that it recognises that Wikipedia does not carry harmful content in any meaningful way, but it would also have the right not to grant the exemption to service B that says it is a new Wikipedia but carries harmful content.

Thirdly, I have suggested that this is limited to non-commercial services. There is an argument for saying any public information resource should benefit, and that may be more in line with the amendment proposed by the noble Lord, Lord Moylan, where it is defined in terms of being encyclopaedic or the nature of the service. I recognise that I have put in “non-commercial” as belt and braces because there is a rationale for saying that, while we do not really want an encyclopaedic resource to be in the 2B service if it has got user-to-user functions, if it is commercial, we could reasonably expect it to find some way to comply. It is different when it is entirely non-commercial and volunteer-led, not least because the Wikimedia Foundation, for example, would struggle to justify spending the money that it has collected from donors on compliance costs with the UK regime, whereas a commercial company could increase its resources from commercial customers to do that.

I hope this is a helpful start to a debate in which we will also consider Amendment 29, which has similar goals. I will close by asking the Minister some additional questions. I have asked him some very specific ones to which I hope he can provide answers, but first I ask: does he acknowledge the genuine risk that services like Wikipedia and OpenStreetMap could find themselves in a position where they have obligations under the Bill that they simply cannot comply with? It is not that they are unwilling, but there is structurally no way for them to do all this.

Secondly, I hope the Minister would agree that it is not in the public interest for Ofcom to spend significant time and effort on the oversight of services like these; rather, it should spend its time and effort on services, such as social media services, that we believe to be creating harms and that are the central focus of the Bill.

Thirdly, will the Minister accept that there is something very uncomfortable about a government regulator interfering with the running of a neutral public resource like Wikipedia, when there is so much benefit from it and little or no demonstrable harm? It is much closer to the model that exists for a newspaper. We have debated endlessly in this House—and I am sure we will come back to it—the fact that there is, rightly, considerable reluctance to have regulators going too far and creating this relationship with neutral public information goods. Wikipedia falls into that category, as do OpenStreetMap and others, and there would be fundamental challenges of principle around that.

I hope the Government will agree that we should be taking steps to make sure that we are not inadvertently creating a situation where, in one or two years’ time, Ofcom comes back to us saying that it wrote to Wikipedia, because the law told it to do so, and told Wikipedia all the things that it had to do; Wikipedia took that to its senior management and then came back saying that it was shutting up shop in the UK. Because it is sensible, Ofcom would come back and say that it did not want that and would ask us to change the law to give it the power to grant an exemption. If such services deserve an exemption, let us make it clear that they should have it now, rather than lead ourselves down a path where we end up creating churn and uncertainty around what is an extraordinarily valuable public resource. I beg to move.

Lord Moylan Portrait Lord Moylan (Con)
- View Speech - Hansard - -

My Lords, Amendments 29 and 30 stand in my name. I fully appreciated, as I prepared my thoughts ahead of this short speech, that a large part of what I was going to say might be rendered redundant by the noble Lord, Lord Allan of Hallam. I have not had a discussion with him about this group at all, but it is clear that his amendment is rather different from mine. Although it addresses the same problem, we are coming at it slightly differently. I actually support his amendment, and if the Government were to adopt it I think the situation would be greatly improved. I do prefer my own, and I think he put his finger on why to some extent: mine is a little broader. His relates specifically to public information, whereas mine relates more to what can be described as the public good. So mine can be broader than information services, and I have not limited it to non-commercial operations, although I fully appreciate that quite a lot of the services we are discussing are, in practice, non-commercial. As I say, if his amendment were to pass, I would be relatively satisfied, but I have a moderate preference for my own.

--- Later in debate ---
Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I shall speak briefly to Amendment 174 in my name and then more broadly to this group—I note that the Minister got his defence in early.

On the question of misinformation and disinformation, I recognise what he said and I suppose that, in my delight at hearing the words “misinformation and disinformation”, I misunderstood to some degree what he was offering at the Dispatch Box, but I make the point that this poses an enormous risk to children. As an example, children are the fastest-growing group of far-right believers/activists online, and there are many areas in which we are going to see an exponential growth in misinformation and disinformation as large language models become the norm. So I ask him, in a tentative manner, to look at that.

On the other issue, I have to push back at the Minister’s explanation. Content classification around sexual content is a well-established norm. The BBFC does it and has done it for a very long time. There is an absolute understanding that what is suitable for a U, a PG, a 12 or a 12A differs, and that as children’s capacities evolve as they get older, there are things that are more suitable for older children, including, indeed, stronger portrayals of sexual behaviour as the age category rises. So I cannot accept that this opens a new can of worms: this is something that we have been doing for many, many years.

I think it is a bit wrongheaded to imagine that if we “solve” the porn problem, we have solved the problem—because there is still sexualisation and the commercialisation of sex. Now, if you say something about feet to a child, they start to giggle uproariously because, in internet language, you get paid for taking pictures of feet and giving them to strange people. There are such detailed and different areas that companies should be looking at. This amendment in my name and the names of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford, should be taken very seriously. It is not new ground, so I would ask the Minister to reconsider it.

More broadly, the Minister will have noticed that I liberally added my name to the amendments he has brought forward to meet some of the issues we raised in Committee, and I have not added my name to the schedule of harms. I want to be nuanced about this and say I am grateful to the Government for putting them in the Bill, I am grateful that the content harms have been discussed in this Chamber and not left for secondary legislation, and I am grateful for all the conversations around this. However, harm cannot be defined only as content, and the last grouping got to the core of the issue in the House. Even when the Minister was setting out this amendment, he acknowledged that the increase in harm to users may be systemic and by design. In his explanation, he used the word “harm”; in the Bill, it always manifests as “harmful content”.

While the systemic risk of increasing the presence of harmful content is consistently within the Bill, which is excellent, the concept that the design of a service may in and of itself be harmful is absent. In failing to capture that, the Government, and therefore the Bill, have missed the bull’s-eye. The bull’s-eye is what is particular about this method of communication that creates harm—and what is particular are the features, functionalities and design. I draw noble Lords back to the debate about Wikipedia. It is not that we all love Wikipedia adoringly; it is that it does not pursue a system of design for commercial purposes that entraps people within its grasp. Those are the harms we are trying to get at. I am grateful for the conversations I have had, and I look forward to some more. I have laid down some other amendments for Monday and beyond that would, I hope, deal with this—but until that time, I am afraid this is an incomplete picture.

Lord Moylan Portrait Lord Moylan (Con)
- Hansard - -

My Lords, I have a comment about Amendment 174 in the name of the noble Baroness, Lady Kidron. I have no objection to the insertion of subsection (9B), but I am concerned about (9A), which deals with misinformation and disinformation. It is far too broad and political, and if we start at this late stage to try to run off into these essentially political categories, we are going to capsize the Bill altogether. So I took some heart from the fact that my noble friend on the Front Bench appeared disinclined to accept at least that limb of the amendment.

I did want to ask briefly some more detailed questions about Amendment 172 and new subsection (2) in particular. This arises from the danger of having clauses added at late stages of the Bill that have not had the benefit of proper discussion and scrutiny in Committee. I think we are all going to recognise the characteristics that are listed in new subsection (2) as mapping on to the Equality Act, which appears to be their source. I note in passing that it refers in that regard to gender reassignment. I would also note that most of the platforms, in their terms and conditions, refer not to gender reassignment but to various other things such as gender identity, which are really very different, or at least different in detail. I would be interested to ask my noble friend how effectively he expects the words used in English statute to be applied by what are, essentially, foreign platforms when they are operating for an audience in the United Kingdom—I am going to come back to this in a further amendment later.

--- Later in debate ---
Lord Moylan Portrait Lord Moylan (Con)
- Hansard - -

I take this opportunity to ask my noble friend the Minister a question; I want some clarity about this. Would an abusive comment about a particular religion—let us say a religion that practised cannibalism or a historical religion that sacrificed babies, as we know was the norm in Carthage—count as “priority harmful content”? I appreciate that we are mapping the language of the Equality Act, but are we creating a new offence of blasphemy in this Bill?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

As was pointed out by others in the debate, the key provision in Amendment 172 is subsection (2) of the proposed new clause, which relates to:

“Content which is abusive and which targets any of the following characteristics”.

It must both be abusive and target the listed characteristics. It does not preclude legitimate debate about those things, but if it were abusive on the basis of those characteristics—rather akin to the debate we had in the previous group and the points raised by the noble Baroness, Lady Kennedy of The Shaws, about people making oblique threats, rather than targeting a particular person, by saying, “People of your characteristic should be abused in the following way”—it would be captured.

--- Later in debate ---
Lord Moylan Portrait Lord Moylan (Con)
- Hansard - -

My noble friend seemed to confirm what I said. If I wish to be abusive—in fact, I do wish to be abusive—about the Carthaginian religious practice of sacrificing babies to Moloch, and I were to do that in a way that came to the attention of children, would I be caught as having created “priority harmful content”? My noble friend appears to be saying yes.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

Does my noble friend wish to do that and direct it at children?

Lord Moylan Portrait Lord Moylan (Con)
- Hansard - -

With respect, it does not say “directed at children”. Of course, I am safe in expressing that abuse in this forum, but if I were to do it, it came to the attention of children and it were abusive—because I do wish to be abusive about that practice—would I have created “priority harmful content”, about which action would have to be taken?

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I will leap to the Minister’s defence on this occasion. I remind noble colleagues that this is not about individual pieces of content; there would have to be a consistent flow of such information being proffered to children before Ofcom would ask for a change.

Lord Moylan Portrait Lord Moylan (Con)
- Hansard - -

My Lords, these words have obviously appeared in the Bill in one of those unverified sections; I have clicked the wrong button, so I cannot see them. Where does it say in Amendment 172 that it has to be a consistent flow?

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

May I attempt to assist the Minister? This is the “amber” point described by the noble Lord, Lord Allan: “priority content” is not the same as “primary priority content”. Priority content is our amber light. Even the most erudite and scholarly description of baby eating is not appropriate for five year-olds. We do not let it go into “Bod” or any of the other programmes we all grew up on. This is about an amber warning: that user-to-user services must have processes that enable them to assess the risk of priority content and primary priority content. It is not black and white, as my noble friend is suggesting; it is genuinely amber.

Online Safety Bill

Lord Moylan Excerpts
Lord Moylan Portrait Lord Moylan (Con)
- View Speech - Hansard - -

My Lords, I speak to Amendments 56, 58, 63 and 183 in my name in this group. I have some complex arguments to make, but time is pressing, so I shall attempt to do so as briefly as possible. I am assisted in that by the fact that my noble friend on the Front Bench very kindly explained that the Government are not going to accept my worthless amendments, without actually waiting to hear what it is I might have said on their behalf.

None the less, I turn briefly to Amendment 183. The Bill has been described, I think justly, as a Twitter-shaped Bill: it does not take proper account of other platforms that operate in different ways. I return to the question of Wikipedia, but also platforms such as Reddit and other community-driven platforms. The requirement for a user-verification tool is of course intended to lead to the possibility that ordinary, unverified users—people like you and me—could have the option to see only that content which comes from those people who are verified.

This is broadly a welcome idea, but when we combine that with the fact that there are community-driven sites such as Wikipedia where there are community contributions, and people who contribute to those sites are not always verified—sometimes there are very good reasons why they would want to preserve their anonymity—we end up with the possibility of whole articles having sentences left out and so on. That is not going to happen; the fact is that a site such as Wikipedia cannot operate like that, so it is another one of those existential questions that the Government have not properly grappled with and really must address before we come to Third Reading, because this will not work the way it is.

As for my other amendments, they are supportive of and consistent with the idea of user verification, and they recognise—as my noble friend said—that user verification is intended to be a substitute for the abandoned “legal but harmful” clause. I welcome the abandonment of that clause and recognise that this provision is more consistent with individual freedom and autonomy and the idea that we can make choices of our own, but it is still open to the possibility of abuse by the platforms themselves. The amendments that I have put forward address, first, the question of what should be the default position. My argument is that the default position should be that filtering is not on and that one has to opt into it, because that seems to me the adult proposition, the adult choice.

The danger is that the platforms will either opt you into filtering automatically as the default, so that you do not see what might be called the full-fat milk available on the internet, or harass you to do so with constant pop-ups, which we already get. If you go on the Nextdoor website, you constantly get a pop-up saying, “You should switch on notifications”. I do not want notifications; I want to look at it when I want to look at it. Yet I am constantly being driven into pressing the button that says, “Switch on notifications”. You could have something similar here—users constantly being driven into switching on the filters—because the platforms themselves will be very worried about the possibility that you might see illegal content. We should guard against that.

Secondly, on Amendment 58, if we are going to have user verification—as I say, there is a lot to be said for that approach—it should be applied consistently. If the platform decides to filter out racist abuse, or some other specified sort of abuse, and you opt in to that filtering, it has to filter all racist abuse, not simply racist abuse that comes from people it does not like; and, with gender reassignment abuse, it cannot filter out material from only one side or other of the argument. The word “consistently” that is included here is intended to address that, and to require policies showing that, if you opt in to having something filtered out, it will be done on a proper, consistent and systematic basis, not influenced by the platform’s own political views.

Finally, we come to Amendment 63 and the question of how this is communicated to users of the internet. This amendment would force the platforms to make these policies about how user verification will operate a part of their terms and conditions in a public and visible way and to ensure that those provisions are applied consistently. It goes a little further than the other amendments—the others could stand on their own—but would also add a little bit more by requiring public and consistent policies that people can see. This works with the grain of what the Government are trying to do; I do not see that the Government can object to any of this. There is nothing wrecking here. It is trying to make everything more workable, more transparent and more obvious.

I hope, given the few minutes or short period of time that will elapse between my sitting down and the Minister returning to the Dispatch Box, that he will have reflected on the negative remarks that he made in his initial speech and will find it possible to accept these amendments now that he has heard the arguments for them.

Online Safety Bill

Lord Moylan Excerpts
Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- View Speech - Hansard - - - Excerpts

My Lords, I am rather disappointed that, while this is a large group on freedom of expression, it is dominated by amendments by myself and the noble Lord, Lord Moylan. I welcome the noble Baroness, Lady Fraser of Craigmaddie, and the noble Lord, Lord Stevenson of Balmacara, dipping their toes in the free-expression water here and I am glad that the Minister has added his name to their amendment, although it is a shame that he did not add his name to one of mine.

Earlier today we heard a lot of congratulations to the Government for listening. I have to say, it depends who you are, because the Government have not listened to all of us. It is notable that, of the hundreds of new government concessions that have taken the form of amendments on Report, none relates to free speech. Before I go through my amendments, I want to note that, when the noble Lord, Lord Moylan, and I raise concerns about free speech, it can be that we get treated as being slightly eccentric. There has been a generally supportive and generous mood from the regulars in this House. I understand that, but I worry that free speech is being seen as peripheral.

This country, our country, that we legislate for and in, has a long history of boasting that it is the home of liberty and adopts the liberal approach that being free is the default position: that free speech and the plurality and diversity of views it engenders are the cornerstone of democracy in a free society and that any deviation from that approach must require extraordinary and special justification. A comprehensive piece of law, such as the one we are dealing with, that challenges many of those norms, deserves thorough scrutiny through the prism of free speech.

When I approached this Bill, which I had been following long before I arrived in this House, I assumed that there would be packed Benches—as there are on the Illegal Migration Bill—and that everybody, including all these Law Lords, would be in, quoting the European Court of Human Rights on Article 8 and Article 10. I assumed there would be complaints about Executive power grabs and so on. But it has been a bit sparse.

That is okay; I can live with that, even if it is a bit dispiriting. But I am concerned when the Government cite that the mood of the Committee has been reflected in their amendments, because it has not been a very large Committee. Many of the amendments that I, the noble Lord, Lord Moylan, and others tabled about free expression represent the concerns of a wide range of policy analysts, civil rights groups, academics, lawyers, free speech campaigners and industry representatives. They have been put forward in good faith—I continue to do that—to suggest ways of mitigating some of the grave threats to free speech in this Bill, with constructive ideas about how to tackle flaws, and raising some of the problems of unintended consequences. I have, at times, felt that those concerns were batted away with a certain indifference. Despite the Minister being very affable and charming, it can none the less be a bit disappointing.

Anyway, I am here to bat again. I hope that the Government now will listen very closely and consider how to avoid the UK ending up with the most restrictive internet speech laws of any western democracy at the end of this. I have a lot of different amendments in my name in this group. I wholeheartedly support the amendments in the name of the noble Lord, Lord Moylan, requiring Ofcom to assess the impact of its codes on free speech, but I will not speak to them.

I will talk about my amendments, starting with Amendments 77, 78, 79, 80 and 81. These require platforms to have particular regard to freedom of expression, not just when implementing safety measures and policies but when writing their terms of service. This is to ensure that freedom of expression is not reduced to an abstract “have regard to” secondary notion but is visible in the drafting of terms of service. This would mean that users know their rights in clear and concrete terms. For example, a platform should be expected to justify how a particular term of service, on something such as religious hatred, will be balanced with consideration of freedom of expression and conscience, in order to allow discussions over different beliefs to take place. Users need to be able to point to specific provisions in the terms of service setting out their free speech protections.

This is all about parity between free speech and safety. Although the Government—and I welcome this—have attempted some balance, via Clause 18, to mitigate the damage to individual rights of free expression from the Bill, it is a rather weak, poor cousin. We need to recognise that, if companies are compelled to prevent and minimise so-called harmful content via operational safety duties, there should be parity: these amendments say that companies should be compelled to do the same for freedom of expression, with a clear and positive duty, rather than Clause 64, which is framed rather negatively.

Amendment 188 takes on the issue of terms of service from a different direction, attempting to ensure that duties with regard to safety must not be allowed to restrict lawful expression or that protected by Article 10 of the European Convention on Human Rights. That states that interference in free speech rights is not lawful unless it is a last resort. I note, in case anyone is reading the amendment carefully, and for Hansard, that the amendment cites Article 8—a rather Freudian slip on my part that was not corrected by the Table Office. That is probably because privacy rights are also threatened by the Bill, but I meant Article 10 of course.

Amendment 188 addresses a genuine dilemma in terms of Ofcom enforcing safety duties via terms and conditions. These will transform private agreements between companies and users into statutory duties under Clause 65. This could mean that big tech companies would be exercising public law functions by state-backed enforcement of the suppression of lawful speech. One worry is that platforms’ terms of service are not neutral; they can change due to external political or commercial pressures. We have all been following with great interest what is happening at Twitter. They are driven by values which can be at odds with UK laws. So I hope the Minister will answer the query that this amendment poses: how is the UK able to uphold its Article 10 obligations if state regulators are legally instructed to enforce terms of service attitudes to free speech, even when they censor far more than UK domestic law requires?

Amendment 162 has a different focus and removes offences under Section 5 of the Public Order Act from the priority offences to be regulated as priority illegal content, as set out in Schedule 7. This amendment is prompted by a concern that the legislation enlists social media companies to act as a private online police force and to adjudicate on the legality of online content. This is especially fraught in terms of the legal limits on speech, where illegality is often contested and contentious—offline as well as online.

The inclusion of Section 5 would place a duty on service providers to take measures to prevent individuals ever encountering content that includes

“threatening or abusive words or behaviour, or disorderly behaviour”

that is likely to cause “harassment, alarm or distress”. It would also require service providers to minimise the length of time such content is present on the service.

I am not sure whether noble Lords have been following the dispute that broke out over the weekend. There is a film doing the rounds on social media of a trans speaker, Sarah Jane Baker, at the Trans Pride event screaming pretty hysterically, “If you see a TERF, punch them in the effing face”—and I am being polite. You would think that that misogynistic threat would be the crime people might be concerned about, yet some apologists for Trans Pride claim that those women—TERFs such as myself—who are outraged, and have treated the speech as exactly that, are the ones who are stirring up hate.

Now, that is a bit of a mess, but asking service providers, or indeed algorithms, to untangle such disputes can surely lead only to the over-removal of online expression, or even more of a muddle. As the rule of law charity Justice points out, this could also catch content that depicts conflict or atrocities, such as those taking place in the Russia-Ukraine war. Justice asks whether the inclusion of Section 5 of the POA could lead to the removal of posts by individuals sharing stories of their own abuse or mistreatment on internet support forums.

Additionally, under Schedule 7 to the Bill, versions of Section 5 could also be regulated as priority illegal conduct, meaning that providers would have to remove or restrict content that, for instance, encourages what is called disorderly behaviour that is likely to cause alarm. Various organisations are concerned that this could mean that content portraying protest activity that might be considered disorderly by some would be removed unless you condemned it, or even that content which encouraged people to attend protests would be in scope.

I am not a fan of Section 5 of the Public Order Act, which criminalises threatening or abusive words or behaviour likely to cause harassment, alarm or distress, at the best of times, but at least those offences have been and are subject to the full rigour of the criminal justice system and case law. Of course, the courts, the CPS and the police are also bound, for example by Article 10, to protect free speech. But that is very different from compelling social media companies, their staff or automated algorithms to make such complex assessments of the Section 5 threshold of illegality. Through no fault of their own, those companies are just not qualified to make such determinations, and it is obvious that legitimate speech could end up being restricted. Dangerously, it also marks a significant departure from the UK’s rule of law in deciding what is legal or illegal speech. It has the potential to limit UK users’ ability to engage in important aspects of public life, and to prevent victims of abuse from sharing their stories, as I have described.

I turn finally to the last amendment, Amendment 275—I will keep this short, for time’s sake. I will not go into detail, but I hope that the Minister will take a look at it, see that there is a loophole, and discuss it with the department. In skeleton form, the Free Speech Union has discovered that the British Board of Film Classification runs a mobile classification network, an agreement with mobile network providers under which it advises them on what content should be filtered because it is considered suitable for adults only. This arrangement is private, not governed by statute, and as such means that even the weak free speech safeguards in this Bill can be sidestepped. This affects not only under-18s but anyone with factory settings on their phone. It led to a particularly bizarre outcome when, last year, readers of the online magazine “The Conservative Woman” reported that the website was inaccessible. This small online magazine was apparently blacklisted by the BBFC because of comments below the line on its articles. The potential for such arbitrary censorship is a real concern, and the magazine cannot even appeal to the BBFC, so I ask the Minister to take this amendment back to the DCMS, which helped set up this mobile classification network, and find out what is going on.

That peculiar tale illustrates my concerns about what happens when free speech is not front and centre, even when you are concerned about safety and harm. I worry that when free speech is casually disregarded, censorship and bans can become the default, and a thoughtless option. That is why I urge the Minister before Third Reading to at least make sure that some of the issues and amendments in this group are responded to positively.

Lord Moylan Portrait Lord Moylan (Con)
- View Speech - Hansard - -

My Lords, my noble friend on the Front Bench said at various points when we were in Committee that the Bill struck an appropriate balance between protecting the rights of children and the rights of those wishing to exercise their freedom of expression. I have always found it very difficult indeed to discern that point of balance in the Bill as originally drafted, but I will say that if there were such a point, it has been swamped by the hundreds of amendments tabled to the Bill by my noble friend since Committee which push the Bill entirely in the opposite direction.

Among those amendments, I cannot find—it may be my fault, because I am just looking by myself; I have no help to find these things—a single one which seeks to redress the balance back in favour of freedom of expression. My Amendments 123, 128, 130, 141, 148 and 244 seek to do that to some extent, and I am grateful to the noble Baroness, Lady Fox of Buckley, for the support she has expressed for them.

--- Later in debate ---
Another thing to recognise—and this is where I perhaps depart from the noble Baroness, Lady Fox, and the noble Lord, Lord Moylan—is that we are in a sense dealing with privately managed public spaces on the internet. There is a lot of debate around this but, for me, they are functionally equivalent to other privately managed public spaces such as pubs, hotels or sports grounds. In none of those contexts do we expect all legal speech to be permissible. Rather, they all have their own norms and they enforce them. I cannot go into a sports ground and say what I like; I will get thrown out if I carry out certain actions within most of those public spaces. We are talking about privately managed public spaces; anyone can go in but, in entering that space, you have to conform to the norms of that space. As I said, I am not aware of many spaces where all legal speech is permitted.

Lord Moylan (Con)

I understand the point the noble Lord is making but, if he were thrown out, sacked or treated in some other way that was incompatible with his rights to freedom of expression under Article 10 of the European convention, he would have cause for complaint and, possibly, cause for legal redress.

Lord Allan of Hallam (LD)

That point is well made. In support of that, if the public space treated me in a discriminatory way, I would expect to have redress, but I do not think I have a right in every public space to say everything I like in the classic Article 10 sense. My right vis-à-vis the state is much broader than my right vis-à-vis any public space that I am operating in where norms apply as well as my basic legal rights. Again, to take the pub example, if I went in and made a racist speech, I may well be thrown out of the pub even though it is sub-criminal and the police are never called; they do not need to be as the space itself organises it.

I am making the point that terms of service are about managing these privately managed public services, and it would be a mistake to equate them entirely with our right to speak or the point at which the state can step in and censor us. I understand the point about state interference but it cuts both ways: both the state interfering in excessively censoring what we can say but also the state potentially interfering in the management of what is, after all, a private space. To refer back to the US First Amendment tradition, a lot of that was about freedom of religion and precisely about enabling heterodoxy. The US did not want an orthodoxy in which one set of rules applied everywhere to everybody. Rather, it wanted people to have the right to dissent, including in ways that were exclusive. You could create your own religious sect and you could not be told not to have those beliefs.

Rolling that power over to the online world, online services, as long as they are non-discriminatory, can have quite different characters. Some will be very restrictive of speech like a restrictive religious sect; some will be very open and catholic, with a small “c”, in the sense of permitting a broad range of speech. I worry about some of the amendments in case there is a suggestion that Ofcom would start to tell a heterodox community of online services that there is an orthodox way to run their terms of service; I would rather allow this to be a more diverse environment.

Having expressed some concerns, I am though very sympathetic to Amendment 162 on Section 5 of the Public Order Act. I have tried in our debates to bring some real experience to this. There are two major concerns about the inclusion of the Public Order Act in the Bill. One is a lack of understanding of what that means. If you look at the face of the language that has been quoted at us, and go back to that small service that does not have a bunch of lawyers on tap, it reads as though it is stopping any kind of abusive content. Maybe you will google it, as I did earlier, and get a little thing back from the West Yorkshire Police. I googled: “Is it illegal to swear in the street?”. West Yorkshire Police said, “Yes, it is”. So if you are sitting somewhere googling to find out what this Public Order Act thing means, you might end up thinking, “Crikey, for UK users, I have to stop them swearing”. There is a real risk of misinterpretation.

The second risk is that of people deliberately gaming the system; again, I have a real-life example from working in one of the platforms. I had people from United Kingdom law enforcement asking us to remove content that was about demonstrations by far-right groups. They were groups I fundamentally disagree with, but their demonstrations did not appear to be illegal. The grounds cited were that, if you allow this content to go ahead and the demonstration happens, there will be a Public Order Act offence. Once you get that on official notepaper, you have to be quite robust to say, “No, I disagree”, which we did on occasion.

I think there will be other services that receive Public Order Act letters from people who seem official and they will be tempted to take down content that is entirely legal. The critical thing here is that that content will often be political. In other parts of the Bill, we are saying that we should protect political speech, yet we have a loophole here that risks that.

I am sure the Minister will not concede these amendments, but I hope he will concede that it is important that platforms are given guidance so that they do not think that somebody getting upset about a political demonstration is sufficient grounds to remove the content as a Public Order Act offence. If you are a local police officer it is much better to get rid of that EDL demonstration, so you write to the platform and it makes your life easier, but I do not think that would be great from a speech point of view.

Finally, I turn to the point made by the noble Lord, Lord Moylan, on Amendment 188 about the ECHR Article 10 exemption. As I read it, if your terms of service are not consistent with ECHR Article 10—and I do not think they will be for most platforms—you then get an exemption from all the other duties around appeals and enforcing them correctly. It is probably a probing amendment but it is a curious way of framing it; it essentially says that, if you are more restrictive, you get more freedom in terms of the Ofcom relationship. I am just curious about the detail of that amendment.

It is important that we have this debate and understand this relationship between the state, platforms and terms of service. I for one am persuaded that the general framework of the Bill makes sense; there are necessary and proportionate restrictions. I am strongly of the view that platforms should be allowed to be heterodox in their terms of service. Ofcom’s job is very much to make sure that they are done correctly but not to interfere with the content of those terms of service beyond that which is illegal. I am persuaded that we need to be extraordinarily careful about including Public Order Act offences; that particular amendment needs a good hearing.

--- Later in debate ---
Amendments 286 and 294 would insert a definition of “freedom of expression” into the Bill. As I mentioned, I am grateful to the noble and learned Lord, Lord Hope, and my noble friend Lady Fraser for proposing these amendments, which align the definition of freedom of expression in the Bill with that in the European Convention on Human Rights. We agree with them that it will increase clarity about freedom of expression in the Bill, which is why I have added my name to their amendments and, when we come to the very end of Report—to which I look forward as well—I will be very glad to support them.
Lord Moylan (Con)

My Lords, before my noble friend sits down, perhaps I could seek a point of clarification. I think I heard him say, at the beginning of his response to this short debate, that providers will be required to have terms of service which respect users’ rights. May I ask him a very straightforward question: do those rights include the rights conferred by Article 10 of the European Convention on Human Rights? Put another way, is it possible for a provider operating in the United Kingdom to have terms and conditions that abridge the rights conferred by Article 10? If it is possible, what is the Government’s defence of that? If it is not possible, what is the mechanism by which the Bill achieves that?

Lord Parkinson of Whitley Bay (Con)

As I set out, I think my noble friend and the noble Baroness, Lady Fox, are not right to point to the European Convention on Human Rights here. That concerns individuals’ and entities’ rights

“to receive and impart ideas without undue interference”

by public authorities, not private entities. We do not see how a service provider deciding not to allow certain types of content on its platform would engage the Article 10 rights of the user, but I would be very happy to discuss this further with my noble friend and the noble Baroness in case we are talking at cross-purposes.

Online Safety Bill

Baroness Stowell of Beeston (Con)

My Lords, I will speak to the amendments in this group in my name: Amendments 139, 140, 144 and 145. I thank the noble Lords, Lord Stevenson and Lord Clement-Jones, and the noble Viscount, Lord Colville, for signing those amendments and for their continued support on this group. I am also grateful to my noble friend the Minister and his team for engaging with me on the issue of Secretary of State powers. He has devoted a lot of time and energy to this, which is reflected in the wide-ranging group of amendments tabled by him.

Before I go any further, it is worth emphasising that the underlying concern here is making sure that we have confidence, through this new regulation regime, that the Bill strikes the right balance of power between government, Parliament, the regulator and big tech firms. The committee that I chair—the Communications and Digital Select Committee of your Lordships’ House—has most focused on that in our consideration of the Bill. I should say also that the amendments I have brought forward in my name very much have the support of the committee as well.

These amendments relate to Clause 39, which is where the main issue lies in the context of Secretary of State powers, and we have three broad concerns. First, as it stood, the Bill handed the Secretary of State unprecedented powers to direct the regulator on pretty much anything. Secondly, these powers allowed the Government to conduct an infinite form of ping-pong with the regulator, enabling the Government to prevail in a dispute. Thirdly, this ping-pong could take place in private with no possibility of parliamentary oversight or being able to intervene, as would be appropriate in the event of a breakdown in the relationship between executive and regulator.

This matters because the Online Safety Bill creates a novel form for regulating the internet and what we can or cannot see online, in particular political speech, and it applies to the future. It is one thing for the current Government, who I support, to say that they would never use the powers in this way. That is great but, as we know, current Governments cannot speak for whoever is in power in the generations to come, so it is important that we get this right.

As my noble friend said, he has brought forward amendments to Clause 39 that help to address this. I support him in and commend him for that. The original laundry list of powers to direct Ofcom has been shortened and now follows the precedent set out in the Communications Act 2003. The government amendments also say that the Secretary of State must now publish their directions to Ofcom, which will improve transparency, and once the code is agreed Ofcom will publish changes so that Parliament can see what changes have been made and why. These are all very welcome and, as I say, they go a long way to addressing some of our concerns, but two critical issues remain.

First, the Government retain an opt-out, which means that they do not have to publish their directions if the Secretary of State believes that doing so would risk

“national security or public safety”,

or international relations. However, those points are now the precise grounds on which the Secretary of State may issue a direction and, if history is any guide, there is a real risk that we will never hear about the directions because the Government have decided that they are a security issue.

My Amendments 139 and 140 would require the Secretary of State to at least notify Parliament of the fact that a direction has been issued and what broad topic it relates to. That would not require any details to be published, so it does not compromise security, but it does give assurance that infinite, secretive ping-pong is not happening behind the scenes. My noble friend spoke so quickly at the beginning that I was not quite sure whether he signalled anything, but I hope that he may be able to respond enthusiastically to Amendments 139 and 140.

Secondly, the Government still have powers for infinite ping-pong. I appreciate that the Government have reservations about capping the number of exchanges between the Secretary of State and Ofcom, but they must also recognise the concern that they appear to be preparing the ground for any future Government to reject infinitely the regulator’s proposals and therefore prevail in a dispute about a politically contentious topic. My Amendments 144 and 145 would clarify that the Government will have a legally binding expectation that they will use no more than the bare minimum number of directions to achieve the intent set out in their first direction.

The Government might think that adding this to the Bill is superfluous, but it is necessary in order to give Parliament and the public confidence about the balance of power in this regime. If Parliament felt that the Secretary of State was acting inappropriately, we would have sufficient grounds to intervene. As I said, the Government acknowledged in our discussions the policy substance of these concerns, and as we heard from my noble friend the Minister in introducing this group, there is an understanding on this. For his part, there is perhaps a belief that what they have done goes far enough. I urge him to reconsider Amendments 144 and 145, and I hope that, when he responds to the debate on this group, he can say something about not only Amendments 139 and 140 but the other two amendments that will give me some grounds for comfort.

Lord Moylan (Con)

My Lords, I realise that I am something of a fish out of water in this House, as I was in Committee, on the Bill, which is fundamentally flawed in a number of respects, including its approach to governance, which we are discussing today. Having said that, I am generally sympathetic to the amendments proposed by my noble friend Lady Stowell of Beeston. If we are to have a flawed approach, her amendments would improve it somewhat.

However, my approach is rather different and is based on the fairly simple but important principle that we live in a free democracy. If we are to introduce a new legislative measure such as this Bill, which has far-reaching powers of censorship taking us back 70 or 80 years in terms of the freedom of expression we have been able to develop since the 1950s and 1960s—to the days of Lady Chatterley’s Lover and the Lord Chamberlain, in equivalent terms, as far as the internet and the online world are concerned—then decisions of such a far-reaching character affecting our lives should be taken by somebody who is democratically accountable.

My approach is utterly different from that which my noble friend on the Front Bench has proposed. He has proposed amendments which limit yet further the Secretary of State’s power to give directions to Ofcom, but the Secretary of State is the only party in that relationship who has a democratic accountability. We are transferring huge powers to a completely unaccountable regulator, and today my noble friend proposes transferring, in effect, even more powers to that unaccountable regulator.

To go back to a point that was discussed in Committee and earlier on Report, if Ofcom takes certain decisions which make it impossible for Wikipedia to operate its current model, such that it has to close down at least its minority language websites—my noble friend said that the Government have no say over that and no idea what Ofcom will do—to whom do members of the public protest? To whom do they offer their objections? There is no point writing to the Secretary of State because, as my noble friend told us, they will not have had any say in the matter and we in this House will have forsworn the opportunity, which I modestly proposed, to take those powers here. There is no point writing to their MP, because all their MP can do is badger the Secretary of State. It is a completely unaccountable structure that is completely indefensible in a modern democratic society. So I object to the amendments proposed by my noble friend, particularly Amendments 136 and 137.

--- Later in debate ---
Baroness Harding of Winscombe (Con)

My Lords, not for the first time I find myself in quite a different place from my noble friend Lord Moylan. Before I go through some detailed comments on the amendments, I want to reflect that at the root of our disagreement is a fundamental view about how serious online safety is. The logical corollary of my noble friend’s argument is that all decisions should be taken by Secretaries of State and scrutinised in Parliament. We do not do that in other technical areas of health and safety in the physical world and we should not do that in the digital world, which is why I take such a different view—

Lord Moylan (Con)

My Lords—

Online Safety Bill

Moved by
186A: Before Clause 64, insert the following new Clause—
“Terms of service as a contract
The terms of service under which a Category 1 service is provided to a person who is a consumer for the purposes of the Consumer Rights Act 2015 must be treated as being a contract for a trader to provide a service to a consumer.”

Member’s explanatory statement
The purpose of this amendment is to ensure that providers’ terms of service are treated as consumer contracts, and to give users recourse to the remedies under the Consumer Rights Act 2015 in the event of breach.
Lord Moylan (Con)

My Lords, in speaking to my Amendment 186A, I hope that noble Lords will forgive me for not speaking in detail to the many other amendments in this group correctly branded “miscellaneous” by those who compile our lists for us. Many of them are minor and technical, especially the government amendments. However, that is not true of all of them: Amendment 253 in the name of the noble Lord, Lord Clement-Jones, is a substantial amendment relating to regulatory co-operation, while Amendment 275A, in the name of the noble Baroness, Lady Finlay of Llandaff, is also of some interest, relating to the reports that Ofcom is being asked to produce on technological developments.

Nor is Amendment 191A lacking in importance and substance, although—I hope I will be forgiven for saying this, not in a snarky sort of way—for those of us who are worried about the enormous powers being given to Ofcom as a result of the Bill, the idea that it should be required by statute to give guidance to coroners, who are part of the courts system, seems to me strange and worth examining more closely. There might be a more seemly way of achieving the effect that the noble Baroness, Lady Kidron, understandably wants to achieve.

I turn to my own Amendment 186A, which, I hope, ought to be relatively straightforward. It concerns the terms of service of a contract with a category 1 service provider, and it is intended to improve the rights that consumers or users of that service have. It is the case that the Government want users of those services to have the ability to enforce their rights under contract against the service providers, as set out in Clause 65, and this is entirely welcome. However, it is well known that bringing claims in contract can be an expensive and onerous business, as I have pointed out in the past, particularly when the service is provided on the one-sided terms of the service provider—often, of course, drafted under the legal system of a foreign jurisdiction.

--- Later in debate ---
Lord Allan of Hallam (LD)

My Lords, I will make some arguments in favour of Amendment 191A, in the name of the noble Baroness, Lady Kidron, and inject some notes of caution around Amendment 186A.

On Amendment 191A, it has been my experience that people who frequently investigate things that have happened on online services do it well, and well-formed requests are critical to making this work effectively. This was the case with law enforcement: when an individual police officer is investigating something online for the first time, they often ask the wrong questions. They do not understand what they can get and what they cannot get. It is like everything in life: the more you do it, the better you get at it.

Fortunately, in a sense, most coroners will only very occasionally have to deal with these awful circumstances where they need data related to the death of a child. At that point, they are going to be very dependent on Ofcom—which will be dealing with the companies day in and day out across a range of issues—for its expertise. Therefore, it makes absolute sense that Ofcom’s expertise should be distributed widely and that coroners—at the point where they need to access this information—should be able to rely on that. So Amendment 191A is very well intended and, from a practical point of view, very necessary if we are going to make this new system work as I know the noble Baroness, Lady Kidron, and I would like to see it work.

On Amendment 186A around consumer law, I can see the attraction of this, as well as some of the read-across from the United States. A lot of the enforcement against online platforms in the US takes place through the Federal Trade Commission precisely in this area of consumer law and looking at unfair and deceptive practices. I can see the attraction of seeking to align with European Union law, as the noble Lord, Lord Moylan, argued we should be doing with respect to consumer law. However, I think this would be much better dealt with in the context of the digital markets Bill and it would be a mistake to squeeze it in here. My reasons for this are about both process and substance.

In terms of process, we have not done the impact assessment on this. It is quite a major change, for two reasons. First, it could potentially have a huge impact in terms of legal costs and the way businesses will have to deal with that—although I know nobody is going to get too upset if the impact assessment says there will be a significant increase in legal costs for category 1 companies. However, we should at least flesh these things out when we are making regulations and have them in an impact assessment before going ahead and doing something that would have a material impact.

Secondly in process terms, there are some really interesting questions about the way this might affect the market. The consumer law we have does exclude services that are offered for free, because so much of consumer law is about saying, “If the goods are not delivered correctly, you get your money back”. With free services, we are clearly dealing with a different model, so the notion that we have a law that is geared towards making sure you either get the goods or you get the money may not be the best fit. To try to shoehorn in these free-at-the-point-of-use services may not be the best way to do it, even from a markets and consumer point of view. Taking our time to think about how to get this right would make sense.

More fundamentally, in terms of the substance, we need to recognise that, as a result of the Online Safety Bill, Ofcom will be requiring regulated services to rewrite their terms of service in quite a lot of detail. We see this throughout the Bill. We are going to have to do all sorts of things—we will debate other amendments in this area today—to make sure that their terms of service are conformant with what we want from them in this Bill. They are going to have to redo their complaints and redress mechanisms. All of this is going to have to change and Ofcom is going to be the regulator that tells them how to do it; that is what we are asking Ofcom to tell them to do.

My fundamental concern here, if we introduce another element, is that there is a whole different structure under consumer law where you might go to local trading standards or the CMA, or you might launch a private action. In many cases, this may overlap. The overlap is where consumer law states that goods must be provided with reasonable care and skill and in a reasonable time. That sounds great, but it is also what the Online Safety Bill is going to be doing. We do not want consumer law saying, “You need to write your terms of service this way and handle complaints this way”, and then Ofcom coming along and saying, “No, you must write your terms of service that way and handle complaints that way”. We will end up in a mess. So I just think that, from a practical point of view, we should be very focused in this Bill on getting all of this right from an Online Safety Bill point of view, and very cautious about introducing another element.

Perhaps one of the attractions of the consumer law point for those who support the amendment is that it says, “Your terms must be fair”. It is the US model; you cannot have unfair terms. Again, I can imagine a scenario in which somebody goes to court and tries to get the terms struck down because they are unfair but the platform says, “They’re the terms Ofcom told me to write. Sort this out, please, because Ofcom is saying I need to do this but the courts are now saying the thing I did was unfair because somebody feels that they were badly treated”.

Lord Moylan (Con)

Does the noble Lord accept that that is already a possibility? You can bring an action in contract law against them on the grounds that it is an unfair contract. This could happen already. It is as if the noble Lord is not aware that the possibility of individual action for breach of contract is already built into Clause 65. This measure simply supplements it.

Lord Allan of Hallam (LD)

I am certainly aware that it is there but, again, the noble Lord has just made the point himself: this supplements it. The intent of the amendment is to give consumers more rights under this additional piece of legislation; otherwise, why bother with the amendment at all? The noble Lord may be arguing against himself in saying that this is unnecessary and, at the same time, that we need to make the change. If we make the change, it is, in a sense, a material change to open the door to more claims being made under consumer law that terms are unfair. As I say, we may want this outcome to happen eventually, but I find it potentially conflicting to do it precisely at a time when we are getting Ofcom to intervene much more closely in setting those terms. I am simply arguing, “Let’s let that regime settle down”.

The net result and rational outcome—again, I am speaking to my noble friend’s Amendment 253 here—may be that other regulators end up deferring to Ofcom. If Ofcom is the primary regulator and we have told it, under the terms of the Online Safety Bill, “You must require platforms to operate in this way, handle complaints in this way and have terms that do these things, such as excluding particular forms of language and in effect outlawing them on platforms”, the other regulators will eventually end up deferring to it. All I am arguing is that, at this stage, it is premature to try to introduce a second, parallel route for people to seek changes to terms or different forms of redress, however tempting that may be. So I am suggesting a note of caution. It is not that we are starting from Ground Zero—people have routes to go forward today—but I worry about introducing something that I think people will see as material at this late stage, having not looked at the full impact of it and potentially running in conflict with everything else that we are trying to do in this legislation.

--- Later in debate ---
Lord Moylan (Con)

My Lords, this has indeed been a wide-ranging and miscellaneous debate. I hope that since we are considering the Bill on Report noble Lords will forgive me if I do not endeavour to summarise all the different speeches and confine myself to one or two points.

The first is to thank the noble Baroness, Lady Kidron, for her support for my amendment but also to say that having heard her argument in favour of her Amendment 191A, I think the difference between us is entirely semantic. Had she worded it so as to say that Ofcom should be under a duty to offer advice to the Chief Coroner, as opposed to guidance to coroners, I would have been very much happier with it. Guidance issued under statute has to carry very considerable weight and, as my noble friend the Minister said, there is a real danger in that case of an arm of the Executive, if you like, or a creature of Parliament—however one wants to regard Ofcom—interfering in the independence of the judiciary. Had she said “advice to the Chief Coroner and whoever is the appropriate officer in Scotland”, that would have been something I could have given wholehearted support to. I hope she will forgive me for raising that quibble at the outset, but I think it is a quibble rather than a substantial disagreement.

On my own amendment, I simply say that I am grateful to my noble friend for the brevity and economy with which he disposed of it. He was of course assisted in that by the remarks and arguments made by many other noble Lords in the House as they expressed their support for it in principle.

I think there is a degree of confusion about what the Bill is doing. There seemed to be a sense that somehow the amendment was giving individuals the right to bring actions in the courts against providers, but of course that already happens because that right exists and is enshrined in Clause 65. All the amendment would do is give some balance so that consumers actually had some protections in what is normally, in essence, an unequal contest, which is trying to ensure that a large company enforces the terms and contracts that it has written.

In particular, my amendment would give, as I think noble Lords know, the right to demand repeat performance—that is, in essence, the right to put things right, not monetary compensation—and it would frustrate any attempts by providers, in drafting their own terms and conditions, to limit their own liability. That is of course what they seek to do but the Consumer Rights Act frustrates them in their ability to do so.

We will say no more about that for now. With that, I beg leave to withdraw my amendment.

Amendment 186A withdrawn.
--- Later in debate ---
Baroness Harding of Winscombe (Con)

My Lords, I associate myself with my noble friend Lady Fraser of Craigmaddie’s incredibly well-made points. I learned a long time ago that, when people speak very softly and say they have a very small point to make, they are often about to deliver a zinger. She really did; it was hugely powerful. I will say no more than that I wholeheartedly agree with her; thank you for helping us to understand the issue properly.

I will speak in more detail about access to data for researchers and in support of my noble friend Lord Bethell’s amendments. I too am extremely grateful to the Minister for bringing forward all the government amendments; the direction of travel is encouraging. I am particularly pleased to see the movement from “may” to “must”, but I am worried that it is Ofcom’s rather than the regulated services’ “may” that moves to “must”. There is no backstop for recalcitrant regulated services that refuse to abide by Ofcom’s guidance. As the noble Baroness, Lady Kidron, said, in other areas of the Bill we have quite reasonably resorted to launching a review, requiring Ofcom to publish its results, requiring the Secretary of State to review the recommendations and then giving the Secretary of State backstop powers, if necessary, to implement regulations that would then require regulated companies to change.

I have a simple question for the Minister: why are we not following the same recipe here? Why does this differ from the other issues, on which the House agrees that there is more work to be done? Why are we not putting backstop powers into the Bill for this specific issue, when it is clear to all of us that it is highly likely that there will be recalcitrant regulated firms that are not willing to grant researchers access to their data?

Before my noble friend the Minister leaps to the hint he gave in his opening remarks—that this should all be picked up in the Data Protection and Digital Information Bill—unlike the group we have just discussed, this issue was discussed at Second Reading and given a really detailed airing in Committee. This is not new news, in the same way that other issues where we have adopted the same recipe that includes a backstop are being dealt with in the Bill. I urge my noble friend the Minister to follow the good progress so far and to complete the package, as we have in other areas.

Lord Moylan (Con)

My Lords, it is valuable to be able to speak immediately after my noble friend Lady Harding of Winscombe, because it gives me an opportunity to address some remarks she made last Wednesday when we were considering the Bill on Report. She suggested that there was a fundamental disagreement between us about our view of how serious online safety is—the suggestion being that somehow I did not think it was terribly important. I take this opportunity to rebut that and to add to it by saying that other things are also important. One of those things is privacy. We have not discussed privacy in relation to the Bill quite as much as we have freedom of expression, but it is tremendously important too.

Government Amendment 247A represents the most astonishing level of intrusion. In fact, I find it very hard to see how the Government think they can get away with saying that it is compatible with the provisions of the European Convention on Human Rights, which we incorporated into law some 20 years ago, thus creating a whole law of privacy that is now vindicated in the courts. It is not enough just to go around saying that it is “proportionate and necessary” as a mantra; it has to be true.

This provision says that an agency has the right to go into a private business with no warrant, and with no let or hindrance, and is able to look at its processes, data and equipment at will. I know of no other business that can be subjected to that without a warrant or some legal process in advance pertinent to that instance, that case or that business.

My noble friend Lord Bethell said that the internet has been abused by people who carry out evil things; he mentioned terrorism, for example, and he could have mentioned others. However, take mobile telephones and Royal Mail—these are also abused by people conducting terrorism, but we do not allow those communications to be intruded into without some sort of warrant or process. It does not seem to me that the fact that the systems can be abused is sufficient to justify what is being proposed.

My noble friend the Minister says that this can happen only offline. Frankly, I did not understand what he meant by that. In fact, I was going to say that I disagreed with him, but I am moving to the point of saying that I think it is almost meaningless to say that it is going to happen offline. He might be able to explain that. He also said that Ofcom will not see individual traffic. However, neither the point about being offline nor the point about not seeing individual traffic is on the face of the Bill.

When we ask ourselves what the purpose of this astonishing power is—this was referred to obliquely to some extent by the noble Baroness, Lady Fox of Buckley—we can find it in Clause 91(1), to which proposed new subsection (2A) is being added or squeezed in subordinate to it. Clause 91(1) talks about

“any information that they”—

that is, Ofcom—

“require for the purpose of exercising, or deciding whether to exercise, any of their online safety functions”.

The power could be used entirely as a fishing expedition. It could be entirely for the purpose of educating Ofcom as to what it should be doing. There is nothing here to say that it can have these powers of intrusion only if it suspects that there is criminality, a breach of the codes of conduct or any other offence. It is a fishing expedition, entirely for the purpose of

“exercising, or deciding whether to exercise”.

Those are the intrusions imposed upon companies. In some ways, I am less concerned about the companies than I am about what I am going to come to next: the intrusion on the privacy of individuals and users. If we sat back and listened to ourselves and what we are saying, could we explain to ordinary people—we are going to come to this when we discuss end-to-end encryption—what exactly can happen?

Two very significant breaches of the protections in place for privacy on the internet arise from what is proposed. First, if you allow someone into a system and into equipment, especially from outside, you increase the risk and the possibility that a further, probably more hostile party that is sufficiently well-equipped with resources—we know state actors with evil intent which are so equipped—can get in through that or similar holes. The privacy of the system itself would be structurally weakened as a result of doing this. Secondly, if Ofcom is able to see what is going on, the system becomes leaky in the direction of Ofcom. It can come into possession of information, some of which could be of an individual character. My noble friend says that it will not be allowed to release any data and that all sorts of protections are in place. We know that, and I fully accept the honesty and integrity of Ofcom as an institution and of its staff. However, we also know that things get leaked and escape. As a result of this provision, very large holes are being built into the protections of privacy that exist, yet there has been no reference at all to privacy in the remarks made so far by my noble friend.

I finish by saying that we are racing ahead and not thinking. Good Lord, my modest amendment in the last group to bring a well-established piece of legislation—the Consumer Rights Act—to bear upon this Bill was challenged on the grounds that there had not been an impact assessment. Where is the impact assessment for this? Where is even the smell test for this in relation to explaining it to the public? If my noble friend is able to expatiate at the end on the implications for privacy and attempt to give us some assurance, that would be some consolation. I doubt that he is going to give way and do the right thing and withdraw this amendment.

Lord Clement-Jones (LD)

My Lords, the debate so far has been—in the words of the noble Baroness, Lady Fox—a Committee debate. That is partly because this set of amendments from the Government has come quite late. If they had been tabled in Committee, I think we would have had a more expansive debate on this issue and could have knocked it about a bit and come back to it on Report. The timing is regrettable in all of this.

That said, the Government have tabled some extremely important amendments, particularly Amendments 196 and 198, which deal with things such as algorithms and functionalities. I very much welcome those important amendments, as I know the noble Baroness, Lady Kidron, did.

I also very much support Amendments 270 and 272 in the name of the noble Baroness, Lady Fraser. I hope the Minister, having been pre-primed, has all the answers to them. It is astonishing that, after all these years, we are so unattuned to the issues of the devolved Administrations and that we are still not in the mindset on things such as research. We are not sufficiently granular, as has been explained—let alone all the other questions that the noble Lord, Lord Stevenson, asked. I hope the Minister can unpack some of that as well.

I want to express some gratitude, too, because the Minister and his officials took the trouble to give us a briefing about remote access issues, alongside Ofcom. Ofcom also sent through its note on algorithmic assessment powers, so an effort has been made to explain some of these powers. Indeed, I can see the practical importance, as explained to us. It is partly the lateness, however, that sets off what my noble friend Lord Allan called “trigger words” and concerns about the remote access provisions. Indeed, I think we have a living and breathing demonstration of the impact of triggers on the noble Lord, Lord Moylan, because these are indeed issues that concern those outside the House to quite a large degree.

--- Later in debate ---
Moved by
225: After Clause 161, insert the following new Clause—
“Transparency of government representations to regulated service providers
(1) The Secretary of State must produce a report setting out any relevant representations His Majesty’s Government have made to providers of Part 3 services to tackle the presence of misinformation and disinformation on Part 3 services.
(2) In this section “relevant representations” are representations that could reasonably be considered to be intended to persuade or encourage a provider of a Part 3 service to—
(a) modify the terms of service of a regulated service in an effort to address misinformation or disinformation,
(b) restrict or remove a particular user’s access to accounts used by them on a regulated service, or
(c) take down, reduce the visibility of, or restrict access to content that is present or may be encountered on a regulated service.
(3) The first report must be laid before both Houses of Parliament within six months of this Act being passed.
(4) Subsequent reports must be laid before both Houses of Parliament at intervals not exceeding six months.
(5) The Secretary of State is not required by this section to include in the report information that the Secretary of State considers would be against the interests of national security.
(6) If the Secretary of State relies upon subsection (5) they must as soon as reasonably practicable send a report containing that information to the Intelligence and Security Committee of Parliament.”
Member’s explanatory statement
This amendment addresses government influence on content moderation, for example by way of initiatives like the Government’s Counter Disinformation Unit.
Lord Moylan (Con)

My Lords, continuing the rather radical approach of debating an amendment that has already been debated in Committee and has not just been introduced, and picking up on the theme of our debate immediately before we adjourned, I move an amendment that seeks to address the question of the Government’s activities in interacting with providers when they seek to influence providers on what is shown on their sites.

It might be a matter of interest that according to the Daily Telegraph, which I implicitly trust, only on Tuesday of last week, a judge in Louisiana in the United States issued an injunction forbidding a lengthy list of White House officials from making contact with social media companies to report misinformation. I say this not because I expect the jurisprudence of the state of Louisiana to have any great influence in your Lordships’ House but simply to show how sensitive and important this issue is. The judge described what he had heard and seen as one of the greatest assaults on free speech in the history of the United States.

We are not necessarily quite in that territory, and nor does my amendment do anything so dramatic as to prevent the Government communicating with providers with a view to influencing their content, but Amendment 225 requires the Secretary of State to produce a report within six months of the passing of the Act, and every six months thereafter, in which he sets out

“any relevant representations His Majesty’s Government have made to providers”

that are

“intended to persuade or encourage a provider”

to do one of three things. One is to

“modify the terms of service of a regulated service in an effort to address misinformation or disinformation”;

one is to

“restrict or remove a particular user’s access to accounts used by them”;

and the third is to

“take down, reduce the visibility of, or restrict access to content that is present or may be encountered on a regulated service”.

None of these things would be prohibited or prevented by this amendment, but it would be required that His Majesty’s Government produce a report saying what they have done every six months.

Very importantly there is an exception, in that there would be no obligation on the Secretary of State to disclose publicly any information that affected national security, but he would be required in that case to make a report to the Intelligence and Security Committee here in Parliament. As I said, this is a very sensitive subject, and remarks made by the noble Baroness, Lady Fox of Buckley, in the previous debate referred in particular to this subject in connection with the pandemic. While that is in the memory, other topics may easily come up and need to be addressed, where the Government feel obliged to move and take action.

We know nothing about those contacts, because they are not instructions or actions taken under law. They are simply nudges, winks and phone conversations with providers that have an effect and, very often, the providers will act on them. Requiring the Government to make a report and say what they have done seems a modest, proportionate and appropriate means to bring transparency to this exercise, so that we all know what is going on.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I have addressed the points made by the noble Baroness and my noble friend already. She asks the same question again and I can give her the same answer. We are operating openly and transparently here, and the Bill sets out further provisions for transparency and accountability.

Lord Moylan (Con)

My Lords, I see what my noble friend did there, and it was very cunning. He gave us a very worthwhile account of the activities of the Counter Disinformation Unit, a body I had not mentioned at all, as if the Counter Disinformation Unit was the sole locus of this sort of activity. I had not restricted it to that. We know, in fact, that other bodies within government have been involved in undertaking this sort of activity, and on those he has given us no answer at all, because he preferred to answer about one particular unit. He referred also to its standardised transparency processes. I can hardly believe that I am reading out words such as those. The standardised transparency process allows us all to know that encounters take place but still refuses to let us know what actually happens in any particular encounter, even though there is a great public interest in doing so. However, I will not press it any further.

My noble friend, who is genuinely a friend, is in danger of putting himself, at the behest of civil servants and his ministerial colleagues, in some danger. We know what happens in these cases. The Minister stands at the Dispatch Box and says “This has never happened; it never normally happens; it will not happen. Individuals are never spoken of, and actions of this character are never taken”. Then of course, a few weeks or months later, out pour the leaked emails showing that all these things have been happening all the time. The Minister then has to resign in disgrace and it is all very sad. His friends, like myself, rally round and buy him a drink, before we never see him again.

Anyway, I think my noble friend must be very careful that he does not put himself in that position. I think he has come close to doing so this evening, through the assurances he has given your Lordships’ House. Although I do not accept those assurances, I will none the less withdraw the amendment, with the leave of the House.

Amendment 225 withdrawn.

Online Safety Bill

Amendment 257D requires Ofcom to consider the impact that the use of a particular technology on a particular service would have on the availability of journalistic content and the confidentiality of journalistic sources when considering whether to issue a notice. It builds on the existing safeguards in Clause 113 regarding freedom of expression and privacy. I am grateful to the noble Lord, Lord Stevenson of Balmacara, for his constructive engagement on this issue. I beg to move.
Lord Moylan (Con)

My Lords, I am conscious of the imprecation earlier from the noble Lord, Lord Stevenson of Balmacara, that we keep our contributions short, but I intend to take no notice of it. That is for the very good reason that I do not think the public would understand why we disposed of such a momentous matter as bringing to an end end-to-end encryption on private messaging services as a mere technicality and a brief debate at the end of Report.

It is my view that end-to-end encryption is assumed nowadays by the vast majority of people using private messaging services such as WhatsApp, iMessage and Signal. They are unaware, I think, of the fact that it is about to be taken from them by Clause 111 of the Bill. My amendment would prevent that. It is fairly plain; it says that

“A notice under subsection (1)”

of Clause 111

“may not impose a requirement relating to a service if the effect of that requirement would be to require the provider of the service to weaken or remove end-to-end encryption applied in relation to the service”.

My noble friend says that there is no threat of ending end-to-end encryption in his proposal, but he achieves that by conflating two things—which I admit my own amendment conflates, but I will come back to that towards the end. They are the encryption of platforms and the encryption of private messaging services. I am much less concerned about the former. I am concerned about private messaging services. If my noble friend was serious in meaning that there was no threat to end-to-end encryption, then I cannot see why he would not embrace my amendment, but the fact that he does not is eloquent proof that it is in fact under threat, as is the fact that the NSPCC and the Internet Watch Foundation are so heavily lobbying against my amendment. They would not be doing that if they did not think it had a serious effect.

I shall not repeat at any length the technical arguments we had in Committee, but the simple fact is that if you open a hole into end-to-end encryption, as would be required by this provision, then other people can get through that hole, and the security of the system is compromised. Those other people may not be very nice; they could be hostile state actors—we know hostile state actors who are well enough resourced to do this—but they could also be our own security services and others, from whom we expect protection. Normally, we do get a degree of protection from those services, because they are required to have some form of warrant or prior approval but, as I have explained previously in debate on this, these powers being given to Ofcom require no warrant or prior approval in order to be exercised. So there is a vulnerability, but there is also a major assault on privacy. That is the point on which I intend to start my conclusion.

If we reflect for a moment, the evolution of this Bill in your Lordships’ House has been characterised and shaped, to a large extent, by the offer made by the noble Lord, Lord Stevenson of Balmacara, when he spoke at Second Reading, to take a collaborative approach. But that collaborative approach has barely extended to those noble Lords concerned about privacy and freedom of expression. As a result, in my view, those noble Lords rightly promoting child protection have been reckless to the point of overreaching themselves.

If we stood back and had to explain to outsiders that we were taking steps today that took end-to-end encryption and the privacy they expect on their private messaging services away from them, together with the security and protection it gives, of course, in relation to scams and frauds and all the other things where it has a public benefit, then I think they would be truly outraged. I do not entirely understand how the Government think they could withstand that outrage, were it expressed publicly. I actually believe that the battle for this Bill—this part of this Bill, certainly—is only just starting. We may be coming to the end here, but I do not think that this Bill is settled, because this issue is such a sensitive one.

Given the manifest and widespread lack of support for my views on this question in your Lordships’ House in Committee, I will not be testing the opinion of the House today. I think I know what the opinion of the House is, but it is wrong, and it will have to be revised. My noble friend simply cannot stand there and claim that what he is proposing is proportionate and necessary, because it blatantly and manifestly is not.

Lord Allan of Hallam (LD)

My Lords, the powers in Clause 111 are perhaps the most controversial outstanding issue in the Bill. I certainly agree with the noble Lord, Lord Moylan, that they deserve some continued scrutiny. I suspect that Members of another place are being lobbied on this extensively right now. Again, it is one of the few issues; they may not have heard of the Online Safety Bill, but they will do in the context of this particular measure.

We debated the rights and wrongs of encryption at some length in Committee, and I will not repeat those points today, not least because the noble Lord, Lord Moylan, has made some of the arguments as to why encryption is important. I will instead today focus on the future process, assuming that the Clause 111 powers will be available to Ofcom as drafted and that we are not going to accept the amendment from the noble Lord, Lord Moylan.

Amendments 258 and 258ZA, in my name and that of my noble friend Lord Clement-Jones, both aim to improve the process of issuing a Clause 111 order by adding in some necessary checks and balances.

As we debate this group, we should remember that the Clause 111 powers are not specific to encrypted services—I think the Minister made this point—and we should have the broader context in mind. I often try to bring some concrete scenarios to our discussions, and it may be helpful to consider three different scenarios in which Ofcom might reach for a Clause 111 notice.

The first is where a provider has no particular objections to using technology to identify and remove child sexual exploitation and abuse material or terrorist material but is just being slow to do this. There are mature systems out there. PhotoDNA is very well known in the industry and effectively has a database with digital signatures of known child sexual exploitation material. All the services we use on a daily basis such as Facebook, Instagram and others will check uploaded photos against that database and, where it is child sexual exploitation material, they will make sure that it does not get shown and that those people are reported to the authorities.

I can imagine scenarios where Ofcom is dealing with a service which has not yet implemented the technology—but does not have a problem doing it—and the material is unencrypted so there is no technical barrier; it is just being a bit slow. In those scenarios, Ofcom will tell the service to get on with it or it will get a Clause 111 notice. In those circumstances, in most cases the service will just get on with it, so Ofcom will be using the threat of the notice as a way to encourage the slow coaches. That is pretty unexceptional; it will work in a pretty straightforward way. I think the most common use of these notices may be to bring outliers into the pack of those who are following best practice. Ofcom may not even need to issue any kind of warning notice at all and will not get past the warning notice period. Waving a warning notice in front of a provider may be sufficient to get it to move.

The second scenario is one where the provider equally does not object to the use of the technology but would prefer to have a notice before it implements it. Outside the world of tech companies, it may seem a little strange why a provider would want to be ordered to do something rather than doing the right thing voluntarily, but we have to remember that the use of this kind of technology is legally fraught in many jurisdictions. There have been court cases in a number of places, not least the European Union, where there are people who will challenge whether you should use this technology on unencrypted services, never mind encrypted ones. In those cases, you can imagine there will be providers, particularly those established outside the United Kingdom, which may say, “Look, we are fine implementing this technology, but Ofcom please can you give us a notice? Then when someone challenges it in court, we can say that the UK regulator made us do it”. That would be helpful to them. This second group will want a notice and here we will get to the point of the notice being issued. They are not going to contest it; they want to have the notice because it gives them some kind of legal protection.

I think those two groups are relatively straightforward: we are dealing with companies which are being slow or are looking for legal cover but do not fundamentally object. The third scenario, though, is the most challenging and it is where I think the Government could get into real trouble. My amendments seek to help the Government in situations where a provider fundamentally objects to being ordered to deploy a particular technology because it believes that that technology will create real privacy threats and risks to the service that it offers. I do not think the provider is being awkward in these circumstances; it has genuine concerns about the implications of the technology being developed or which it is being instructed to deploy.

In these circumstances, Ofcom may have all the reasons in the world to argue why it thinks that what it is asking for is reasonable. However, the affected provider may not accept those reasons and take quite a strong counterview and have all sorts of other arguments as to why what it is being asked to do is unacceptable and too high-risk. This debate has been swirling around at the moment as we think about current models of end-to-end encryption and client-side scanning technology, but we need to recognise that this Bill is going to be around for a while and there may be all sorts of other technologies being ordered to be deployed that we do not even know about and have not even been developed yet. At any point, we may hit this impasse where Ofcom is saying it thinks it is perfectly reasonable to order a company to do it and the service provider is saying, “No, as we look at this, our experts and our lawyers are telling us that this is fundamentally problematic from a privacy point of view”.

--- Later in debate ---
Baroness Stowell of Beeston (Con)

Just to be clear, am I right to understand my noble friend as saying that there is currently no technology that would be technically acceptable for tech companies to do what is being asked of them? Did he say that tech companies should be looking to develop the technology to do what may be required of them but that it is not currently available to them?

Lord Moylan (Con)

For clarification, if the answer to that is that the technology does not exist—which I believe is correct, although there are various snake oil salespeople out there claiming that it does, as the noble Baroness, Lady Fox of Buckley, said—my noble friend seems to be saying that the providers and services should develop it. This seems rather circular, as the Bill says that they must adopt an approved technology, which suggests a technology that has been imposed on them. What if they cannot and still get such a notice? Is it possible that these powers will never be capable of being used, especially if they do not co-operate?

Lord Parkinson of Whitley Bay (Con)

To answer my noble friend Lady Stowell first, it depends on the type of service. It is difficult to give a short answer that covers the range of services that we want to ensure are covered here, but we are seeking to keep this and all other parts of the Bill technology neutral so that, as services develop, technology changes and criminals, unfortunately, seek to exploit that, technology companies can continue to innovate to keep children safe while protecting the privacy of their users. That is a long-winded answer to my noble friend’s short question, but necessarily so. Ofcom will need to make its assessments on a case-by-case basis and can require a company to use its best endeavours to innovate if no effective and accurate technology is currently available.

While I am directing my remarks towards my noble friend, I will also answer a question she raised earlier on general monitoring. General monitoring is not a legally defined concept in UK law; it is a term in European Union law that refers to the generalised monitoring of user activity online, although its parameters are not clearly defined. The use of automated technologies is already fundamental to how many companies protect their users from the most abhorrent harms, including child sexual abuse. It is therefore important that we empower Ofcom to require the use of such technology where it is necessary and proportionate and ensure that the use of these tools is transparent and properly regulated, with clear and appropriate safeguards in place for users’ rights. The UK’s existing intermediary liability regime remains in place.

Amendment 255 from my noble friend Lord Moylan seeks to prevent Ofcom imposing any requirement in a notice that would weaken or remove end-to-end encryption. He is right that end-to-end encryption should not be weakened or removed. The powers in the Bill will not do that. These powers are underpinned by proportionality and technical feasibility; if it is not proportionate or technically feasible for companies to identify child sexual exploitation abuse content on their platform while upholding users’ right to privacy, Ofcom cannot require it.

I agree with my noble friend and the noble Baroness, Lady Fox, that encryption is a very important and popular feature today. However, with technology evolving at a rapid rate, we cannot accept amendments that would risk this legislation quickly becoming out of date. Naming encryption in the Bill would risk that happening. We firmly believe that the best approach is to focus on strong safeguards for upholding users’ rights and ensuring that measures are proportionate to the specific situation, rather than on general features such as encryption.

The Bill already requires Ofcom to consider the risk that technology could result in a breach of any statutory provision or rule of law concerning privacy and whether any alternative measures would significantly reduce the amount of illegal content on a service. As I have said in previous debates, Ofcom is also bound by the Human Rights Act not to act inconsistently with users’ rights.

Online Safety Bill

Our stance on tackling child sexual abuse online remains firm, and we have always been clear that the Bill takes a measured, evidence-based approach to do this. I hope that is useful clarification for those who still had questions on that point.
Lord Moylan (Con)

Will my noble friend draw attention to the part of Clause 122 that says that Ofcom cannot issue a requirement which is not technically feasible, as he has just said? That does not appear in the text of the clause, and it creates a potential conflict. Even if the requirement is not technically feasible—or, at least, if the platform claims that it is not—Ofcom’s power to require it is not mitigated by the clause. It still has the power, which it can exercise, and it can presumably take some form of enforcement action if it decides that the company is not being wholly open or honest. The technical feasibility is not built into the clause, but my noble friend has just added it, as with quite a lot of other stuff in the Bill.

Lord Parkinson of Whitley Bay (Con)

It has to meet minimum standards of accuracy and must have privacy safeguards in place. The clause talks about those in a positive sense, which sets out the expectation. I am happy to make clear, as I have, what that means: if the appropriate technology does not exist that meets these requirements, then Ofcom will not be able to use Clause 122 to require its use. I hope that that satisfies my noble friend.

--- Later in debate ---
Baroness Kidron (CB)

My Lords, I want to thank the Minister and other noble colleagues for such kind words. I really appreciate it.

I want to say very little. It has been an absolute privilege to work with people across both Houses on this. It is not every day that one keeps the faith in the system, but this has been a great pleasure. In these few moments that I am standing, I want to pay tribute to the bereaved parents, the children’s coalition, the NSPCC, my colleagues at 5Rights, Barnardo’s, and the other people out there who listen and care passionately that we get this right. I am not going to go through what we got right and wrong, but I think we got more right than we got wrong, and I invite the Minister to sit with me on Monday in the Gallery to make sure that those last little bits go right—because I will be there. I also remind the House that we have some work in the data Bill vis-à-vis the bereaved parents.

In all the thanks—and I really feel that I have had such tremendous support on my area of this Bill—I pay tribute to the noble Baroness, Lady Benjamin. She was there before many people were and suffered cruelly in the legislative system. Our big job now is to support Ofcom, hold it to account and help it in its task, because that is Herculean. I really thank everyone who has supported me through this.

Lord Moylan (Con)

My Lords, I am sure that your Lordships would not want the Bill to pass without hearing some squeak of protest and dissent from those of us who have spent so many days and weeks arguing for the interests of privacy and free speech, to which the Bill remains a very serious and major threat.

Before I come to those remarks, I associate myself with what other noble Lords have said about what a privilege it has been, for me personally and for many of us, to participate over so many days and weeks in what has been the House of Lords at its deliberative best. I almost wrote down that we have conducted ourselves like an academic seminar, but when you think about what most academic seminars are like—with endless PowerPoint slides and people shuttling around, and no spontaneity whatever—we exceeded that by far. The conversational tone that we had in the discussions, and the way in which people who did not agree were able to engage—indeed, friendships were made—meant that the whole thing was done with a great deal of respect, even for those of us who were in the small minority. At this point, I should perhaps say on behalf of the noble Baroness, Lady Fox of Buckley, who participated fully in all stages of the Bill, that she deeply regrets that she cannot be in her place today.

I am not going to single out anybody except for one person. I made the rather frivolous proposal in Committee that all our debates should begin with the noble Lord, Lord Allan of Hallam; we learned so much from every contribution he made that he really should have kicked them all off. We would all have been a great deal more intelligent about what we were saying, and understood it better, had we heard what he had to say. I certainly have learned a great deal from him, and that was very good.

I will raise two issues only that remain outstanding and are not assuaged by the very odd remarks made by my noble friend as he moved the Third Reading. The first concerns encryption. The fact of the matter is that everybody knows that you cannot do what Ofcom is empowered by the Bill to do without breaching end-to-end encryption. It is as simple as that. My noble friend may say that that is not the Government’s intention and that it cannot be forced to do it if the technology is not there. None of that is in the Bill, by the way. He may say that at the Dispatch Box but it does not address the fact that end-to-end encryption will be breached if Ofcom finds a way of doing what the Bill empowers it to do, so why have we empowered it to do that? How do we envisage that Ofcom will reconcile those circumstances where platforms say that they have given their best endeavours to doing something and Ofcom simply does not believe that they have? Of course, it might end up in the courts, but the crucial point is that that decision, which affects so many people—and so many people nowadays regard it as a right to have privacy in their communications—might be made by Ofcom or by the courts but will not be made in this Parliament. We have given it away to an unaccountable process and democracy has been taken out of it. In my view, that is a great shame.

I come back to my second issue—I will not be very long. I constantly ask about Wikipedia. Is Wikipedia in scope of the Bill? If it is, is it going to have to do prior checking of what is posted? That would destroy its business model and make many minority language sites—I instanced Welsh—totally unviable. My noble friend said at the Dispatch Box that, in his opinion, Wikipedia was not going to be in scope of the Bill. But when I asked why we could not put that in the Bill, he said it was not for him to decide whether it was in scope and that the Government had set up this wonderful structure whereby Ofcom will tell us whether it is—almost without appeal, and again without any real democratic scrutiny. Oh yes, and we might have a Select Committee, which might write a very good, highly regarded report, which might be debated some time within the ensuing 12 months on the Floor of your Lordships’ House. However, we will have no say in that matter; we have given it away.

I said at an earlier stage of the Bill that, for privacy and censorship, this represents the closest thing to a move back to the Lord Chamberlain and Lady Chatterley’s Lover that you could imagine but applied to the internet. That is bad, but what is almost worse is this bizarre governance structure where decisions of crucial political sensitivity are being outsourced to an unaccountable regulator. I am very sad to say that I think that, at first contact with reality, a large part of this is going to collapse, and with it a lot of good will be lost.