I hope that the Minister will accept that a number of these amendments are particularly helpful in strengthening the Bill, and that he will find a way to accept that form of strengthening.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

I am very grateful to the noble Lords who have spoken on the amendments in this group, both this afternoon and last Tuesday evening. As this is a continuation of that debate, I think my noble friend Lord Moylan is technically correct still to wish the noble Baroness, Lady Kidron, a happy birthday, at least in procedural terms.

We have had a very valuable debate over both days on the Bill’s approach to holding platforms accountable to their users. Amendments 33B, 41A, 43ZA, 138A and 194A in the names of the noble Lords, Lord Lipsey and Lord McNally, and Amendment 154 in the name of the noble Lord, Lord Stevenson of Balmacara, seek to bring back the concept of legal but harmful content and related adult risk assessments. They reintroduce obligations for companies to consider the risk of harm associated with legal content accessed by adults. As noble Lords have noted, the provisions in the Bill to this effect were removed in another place, after careful consideration, to protect freedom of expression online. In particular, the Government listened to concerns that the previous legal but harmful provisions could create incentives for companies to remove legal content from their services.

In place of adult risk assessments, we introduced new duties on category 1 services to enable users themselves to understand how these platforms treat different types of content, as set out in Clauses 64 and 65. In particular, this will allow Ofcom to hold them to account when they do not follow through on their promises regarding content they say that they prohibit or to which they say that they restrict access. Major platforms already prohibit much of the content listed in Clause 12, but these terms of service are often opaque and not consistently enforced. The Bill will address and change that.

I would also like to respond to concerns raised through Amendments 41A and 43ZA, which seek to ensure that the user empowerment categories cover the most harmful categories of content to adults. I reassure noble Lords that the user empowerment list reflects input from a wide range of interested parties about the areas of greatest concern to users. Platforms already have strong commercial incentives to tackle harmful content. The major technology companies already prohibit most types of harmful and abusive content. It is clear that most users do not want to see that sort of content and most advertisers do not want their products advertised alongside it. Clause 12 sets out that providers must offer user empowerment tools with a specified list of content to the extent that it is proportionate to do so. This will be based on the size or capacity of the service as well as the likelihood that adult users will encounter the listed content. Providers will therefore need internally to assess the likelihood that users will encounter the content. If Ofcom disagrees with the assessment that a provider has made, it will have the ability to request information from providers for the purpose of assessing compliance.

Amendments 44 and 158, tabled by the right reverend Prelate the Bishop of Oxford, seek to place new duties on providers of category 1 services to produce an assessment of their compliance with the transparency, accountability, freedom of expression and user empowerment duties as set out in Clauses 12, 64 and 65 and to share their assessments with Ofcom. I am sympathetic to the aim of ensuring that Ofcom can effectively assess companies’ compliance with these duties. But these amendments would enable providers to mark their own homework when it comes to their compliance with the duties in question. The Bill has been designed to ensure that Ofcom has responsibility for assessing compliance and that it can obtain sufficient information from all regulated services to make judgments about compliance with their duties. The noble Baroness, Lady Kidron, asked about this—and I think the noble Lord, Lord Clement-Jones, is about to.

Lord Clement-Jones (LD)

I hope the Minister will forgive me for interrupting, but would it not be much easier for Ofcom to assess compliance if a risk assessment had been carried out?

Lord Parkinson of Whitley Bay (Con)

I will come on to say a bit more about how Ofcom goes about that work.

The Bill will ensure that providers have the information they need to understand whether they are in compliance with their duties under the Bill. Ofcom will set out how providers can comply in codes of practice and guidance that it publishes. That information will help providers to comply, although they can take alternative action if they wish to do so.

The right reverend Prelate’s amendments also seek to provide greater transparency to Ofcom. The Bill’s existing duties already account for this. Indeed, the transparency reporting duties set out in Schedule 8 already enable Ofcom to require category 1, 2A and 2B services to publish annual transparency reports with relevant information, including about the effectiveness of the user empowerment tools, as well as detailed information about any content that platforms prohibit or restrict, and the application of their terms of service.

Amendments 159, 160 and 218, tabled by the noble Lord, Lord Stevenson, seek to require user-to-user services to create and abide by minimum terms of service recommended by Ofcom. The Bill already sets detailed and binding requirements on companies to achieve certain outcomes. Ofcom will set out more detail in codes of practice about the steps providers can take to comply with their safety duties. Platforms’ terms of service will need to provide information to users about how they are protecting users from illegal content, and children from harmful content.

These duties, and Ofcom’s codes of practice, ensure that providers take action to protect users from illegal content and content that is harmful to children. As such, an additional duty to have adequate and appropriate terms of service, as envisaged in the amendments, is not necessary and may undermine the illegal and child safety duties.

I have previously set out why we do not agree with requiring platforms to set terms of service for legal content. In addition, it would be inappropriate to delegate this much power to Ofcom, which would in effect be able to decide what legal content adult users can and cannot see.

Amendment 155, tabled by my noble friend Lord Moylan, seeks to clarify whether and how the Bill makes the terms of service of foreign-run platforms enforceable by Ofcom. Platforms’ duties under Clause 65 apply only to the design, operation and use of the service in the United Kingdom and to UK users, as set out in Clause 65(11). Parts or versions of the service which are used in foreign jurisdictions—

Baroness Fox of Buckley (Non-Afl)

On that, in an earlier reply the Minister explained that platforms already remove harmful content because it is harmful and because advertisers and users do not like it, but could he tell me what definition of “harmful” he thinks he is using? Different companies will presumably have a different interpretation of “harmful”. How will that work? It would mean that UK law will require the removal of legal speech based on a definition of harmful speech designed by who—will it be Silicon Valley executives? This is the problem: UK law is being used to implement the removal of content based on decisions that are not part of UK law but with implications for UK citizens who are doing nothing unlawful.

Lord Parkinson of Whitley Bay (Con)

The noble Baroness’s point gets to the heart of the debate that we have had. I talked earlier about the commercial incentive that there is for companies to take action against harmful content that is legal which users do not want to see or advertisers do not want their products to be advertised alongside, but there is also a commercial incentive to ensure that they are upholding free speech and that there are platforms on which people can interact in a less popular manner, where advertisers that want to advertise products legally alongside that are able to do so. As with anything that involves the market, the majority has a louder voice, but there is room for innovation for companies to provide products that cater to minority tastes within the law.

Lord Moylan (Con)

My Lords, my noble friend has explained clearly how terms of service would normally work, which is that, as I said myself, a business might write its own terms of service to its own advantage but it cannot do so too egregiously or it will lose customers, and businesses may aim themselves at different customers. All this is part of normal commercial life, and that is understood. What my noble friend has not really addressed is the question of why uniquely and specifically in this case, especially given the egregious history of censorship by Silicon Valley, he has chosen to put that into statute rather than leave it as a commercial arrangement, and to make it enforceable by Ofcom. For example, when my right honourable friend David Davis was removed from YouTube for his remarks about Covid passes, it would have been Ofcom’s obligation not to vindicate his right to free speech but to cheer on YouTube and say how well it had done for its terms of service.

Lord Parkinson of Whitley Bay (Con)

Our right honourable friend’s content was reuploaded. This makes the point that the problem at the moment is the opacity of these terms and conditions; what platforms say they do and what they do does not always align. The Bill makes sure that users can hold them to account for the terms of service that they publish, so that people can know what to expect on platforms and have some form of redress when their experience does not match their expectations.

I was coming on to say a bit more about that after making some points about foreign jurisdictions and my noble friend’s Amendment 155. As I say, parts or versions of the service that are used in foreign jurisdictions but not in the UK are not covered by the duties in Clause 65. As such, the Bill does not require a provider to have systems and processes designed to enforce any terms of service not applicable in the UK.

In addition, the duties do not give powers to Ofcom to enforce a provider’s terms of service directly. Ofcom’s role will be focused on ensuring that platforms have systems and processes in place to enforce their own terms of service consistently rather than assessing individual pieces of content.

Requiring providers to set terms of service for specific types of content suggests that the Government view that type of content as harmful or risky. That would encourage providers to prohibit such content, which of course would have a negative impact on freedom of expression, which I am sure is not what my noble friend wants to see. Freedom of expression is essential to a democratic society. Throughout the passage of the Bill, the Government have always committed to ensuring that people can speak freely online. We are not in the business of indirectly telling companies what legal content they can and cannot allow online. Instead, the approach that we have taken will ensure that platforms are transparent and accountable to their users about what they will and will not allow on their services.

Clause 65 recognises that companies, as private entities, have the right to remove content that is legal from their services if they choose to do so. To prevent them doing so, by requiring them to balance this against other priorities, would have perverse consequences for their freedom of action and expression. It is right that people should know what to expect on platforms and that they are able to hold platforms to account when that does not happen. On that basis, I invite the noble Lords who have amendments in this group not to press them.

Lord McNally (LD)

My Lords, in his opening remarks, the Minister referred to the fact that this debate began last Tuesday. Well, it did, in that I made a 10-minute opening speech and the noble Baroness, Lady Stowell, rather elegantly hopped out of this group of amendments; perhaps she saw what was coming.

How that made me feel is perhaps best summed up by what the noble Earl, Lord Howe, said earlier when he was justifying the business for tomorrow. He said that adjournments were never satisfactory. In that spirit, I wrote to the Leader of the House, expressing the grumbles I made in my opening remarks. He has written back in a very constructive and thoughtful way. I will not delay the Committee any longer, other than to say that I hope the Leader of the House would agree to make his reply available for other Members to read. It says some interesting things about how we manage business. It sounds like a small matter but if what happened on Tuesday had happened in other circumstances in the other place, business would probably have been delayed for at least an hour while the usual suspects picked holes in it. If the usual channels would look at this, we could avoid some car crashes in future.

I am pleased that this group of amendments has elicited such an interesting debate, with fire coming from all sides. In introducing the debate, I said that probably the only real advice I could give the Committee came from my experience of being on the pre-legislative scrutiny committee in 2003. That showed just how little we were prepared for the tsunami of new technology that was about to engulf us. My one pleasure was that we were part of forming Ofcom. I am pleased that the chairman of Ofcom, the noble Lord, Lord Grade, has assiduously sat through our debates. I suspect he is thinking that he had better hire some more lawyers.

We are trying to get this right. I have no doubt that all sides of the House want to get this legislation through in good shape and for it to play an important role. I am sure that the noble Lord, Lord Grade, never imagined that he would become a state regulator in the kind of ominous way in which the noble Baroness, Lady Fox, said it. Ofcom has done a good job and will do so in future.

There is a problem of getting definitions right. When I was at the Ministry of Justice, I once had to entertain a very distinguished American lawyer. As I usually did, I explained that I was not a lawyer. He looked at me and said, “Then I will speak very slowly”. There is a danger, particularly in this part of the Bill, of wandering into a kind of lawyer-fest. It is important that we are precise about what powers we are giving to whom. Just to chill the Minister’s soul, I remember being warned as well about Pepper v Hart. What he says at the Dispatch Box will be used to interpret what Parliament meant when it gave this or that power.

The debate we have had thus far has been fully justified in sending a few warning signals to the Minister that it is perhaps not quite right yet. It needs further work. There is a lot of good will on all sides of the House to get it right. For the moment, I beg leave to withdraw my amendment.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

As ever, the noble Baroness is an important voice in bursting our bubble in the Chamber. I continue to respect her for that. It will not be perfect; there is no perfect answer to all this. I am siding with safety and caution rather than a bit of a free-for-all. Sometimes there might be overcaution and aspects of debate where the platforms, the regulator, the media, and discussion and debate in this Chamber would say, “The toggles have got it wrong”, but we just have to make a judgment about which side we are on. That is what I am looking forward to hearing from the Minister.

These amendments are supported on all sides and by a long list of organisations, as listed by the noble Baroness, Lady Morgan, and the noble Lord, Lord Clement-Jones. The Minister has not conceded very much at all so far to this Committee. We have heard compelling speeches, such as those from the noble Baroness, Lady Parminter, that have reinforced my sense that he needs to give in on this when we come to Report.

I will also speak to my Amendment 38A. I pay tribute to John Penrose MP, who was mentioned by the noble Baroness, Lady Harding, and his work in raising concerns about misinformation and in stimulating discussion outside the Chambers among parliamentarians and others. Following discussions with him and others in the other place, I propose that users of social media should have the option to filter out content the provenance of which cannot be authenticated.

As we know, social media platforms are often awash with content that is unverified, misleading or downright false. This can be particularly problematic when it comes to sensitive or controversial topics such as elections, health or public safety. In these instances, it can be difficult for users to know whether the information presented to them is accurate. Many noble Lords will be familiar with the deep-fake photograph of the Pope in a white puffa jacket that recently went viral, or the use of imagery for propaganda purposes following the Russian invasion of Ukraine.

The Content Authenticity Initiative has created an open industry standard for content authenticity and provenance. Right now, tools such as Adobe Photoshop allow users to turn on content credentials to securely attach provenance data to images and any edits then made to those images. That technology has now been adopted by camera manufacturers such as Leica and Nikon, so the technology is there to do some of this to help give us some reassurance.

Amendment 38A would allow users to filter out unverified content and is designed to flag posts or articles that do not come from a reliable source or have not been independently verified by a reputable third party. Users could then choose to ignore or filter out such content, ensuring that they are exposed only to information that has been vetted and verified. This would not only help users to make more informed decisions but help to combat the spread of false information on social media platforms. By giving users the power to filter out unverified content, we can help to ensure that social media platforms are not used to spread harmful disinformation or misinformation.

Amendments 42 and 45, in the name of my noble friend Lord Stevenson, are good and straightforward improvements to the Bill, suggesting that Clause 12(11) should include all the protected characteristics in the Equality Act 2010. I listened closely to the argument of the noble Baroness, Lady Buscombe, about listing the various characteristics, but I think it makes much more sense to do what my noble friend is suggesting and just reference the Equality Act. If we chose separately to amend the Equality Act to change protected characteristics, that change would then flow through into this Bill. That seems to be a much more sensible and pragmatic way to proceed. Similarly, in Amendment 45 my noble friend is suggesting that consideration should be given by the Secretary of State as to whether any of these tools should also be available to child users, and we have heard good arguments in favour of that when it comes to dealing with the cliff-edge effect, as described by the noble Baroness, Lady Kidron.

Lastly—and, perhaps the Committee will forgive me, slightly bizarrely—Amendment 283ZA, in the name of the noble Baroness, Lady Fox, to Clause 170(7) in Part 11 invites us to consider, among other things, whether a bot can act reasonably. Given that self-coding bots are now technically possible, I would be fascinated to hear the Minister’s view on whether an autonomous bot can have reason. I asked ChatGPT this question. For the help of the Committee, it replied:

“As an artificial intelligence language model, I do not have consciousness or emotions, and therefore, I do not have a capacity for ‘reason’ in the way that humans understand it. However, I am programmed to use advanced algorithms that allow me to process and analyze large amounts of data, recognize patterns, and provide reasoned responses based on the information available to me. This allows me to simulate a form of reasoning, but it is important to note that it is not the same as human reasoning, as I do not have subjective experiences or personal biases. Ultimately, my abilities are limited to the algorithms and data that have been programmed into my system, and I cannot generate my own subjective experiences or judgments.”
That is the view of the algorithm as to whether or not bots can have reason. I look forward to the Minister’s response.

Lord Parkinson of Whitley Bay (Con)

My Lords, the Government recognise the objectives of the amendments in this group: to strengthen protections for adults online. I hope noble Lords will agree that the Bill will indeed significantly improve the safety of all adult users, particularly those who are more vulnerable.

The user empowerment content features will not be the only measures in the Bill that will protect adults. They will act as a final layer of protection, coming after the duties on illegal content and the requirement on category 1 providers to uphold their terms of service. However, as the Clause 12 duties apply to legal content, we need to tread carefully and not inadvertently restrict free expression.

Amendments 34 and 35 in the name of my noble friend Lady Morgan of Cotes and Amendments 36 and 37 in the name of the noble Lord, Lord Clement-Jones, seek to require category 1 services to have their user empowerment content features in operation by default for adult users. The Government share concerns about users who experience disproportionate levels of abuse online or those who are more susceptible to suicide, self-harm or eating disorder content, but these amendments encroach on users’ rights in two ways.

First, the amendments intend to make the decision on behalf of users about whether to have these features turned on. That is aimed especially at those who might not otherwise choose to use those features. The Government do not consider it appropriate to take that choice away from adults, who must be allowed to decide for themselves what legal content they see online. That debate was distilled in the exchange just now between the noble Lord, Lord Knight, and the noble Baroness, Lady Fox, when the noble Lord said he would err on the side of caution, even overcaution, while he characterised the other side as a free-for-all. I might say that it was erring on the side of freedom. That is the debate that we are having, and should have, when looking at these parts of the Bill.

Secondly, the amendments would amount to a government requirement to limit adults’ access to legal content. That presents real concerns about freedom of expression, which the Government cannot accept.

Baroness Kidron (CB)

Does the Minister therefore think that the Government condone the current system, where we are inundated algorithmically with material that we do not want? Are the Government condoning that behaviour, in the way that he is saying they would condone a safety measure?

Lord Parkinson of Whitley Bay (Con)

We will come to talk about algorithms and their risks later on. There is an important balance to strike here that we have debated, rightly, in this group. I remind noble Lords that there are a range of measures that providers can put in place—

Lord Stevenson of Balmacara (Lab)

Because of the importance of that point in relation to what the Minister is about to say, we should be clear about this point: is he ruling out the ability to prioritise the needs and requirements of those who are effectively unable to take the decisions themselves in favour of a broader consideration of freedom of expression? It would be helpful for the future of this debate to be clear on that point.

Lord Parkinson of Whitley Bay (Con)

We will come in a moment to the provisions that are in the Bill to make sure that decisions can be taken by adults, including vulnerable adults, easily and clearly. If the noble Lord will allow, I will cover that point.

I was in the middle of reminding noble Lords that there are a range of measures that providers can put in place under these duties, some of which might have an impact on a user’s experience if they were required to be switched on by default. That may include, for example, restricting a user’s news feed to content from connected users, adding to the echo chamber and silos of social media, which I know many noble Lords would join me in decrying. We think it is right that that decision is for individual users to make.

The Bill sets out that the user empowerment content tools must be offered to all adult users and must be easy to access—to go the point raised just now as well as by my noble friend Lady Harding, and the noble Baroness, Lady Burt, and, as noble Lords were right to remind us, pushed by the noble Baroness, Lady Campbell of Surbiton, who I am pleased to say I have been able to have discussions with separately from this Committee.

Providers will also be required to have clear and accessible terms of service about what tools are offered on their service and how users might take advantage of them. Ofcom will be able to require category 1 services to report on user empowerment tools in use through transparency reports. Ofcom is also bound by the Communications Act 2003 and the public sector equality duty, so it will need to take into account the ways that people with certain characteristics, including people with disabilities, may be affected when performing its duties, such as writing the codes of practice for the user empowerment duties.

Lord Knight of Weymouth (Lab)

I think the Minister is trying to answer the point raised by my noble friend about vulnerable adults. I am interested in the extent to which he is relying on the Equality Act duty on Ofcom then to impact the behaviour of the platforms that it is regulating in respect of how they are protecting vulnerable adults. My understanding is that the Equality Act duty will apply not to the platforms but only to Ofcom in the way that it regulates them. I am unclear how that is going to provide the protection that we want.

Lord Parkinson of Whitley Bay (Con)

That is right. Platforms are not in the public sector, so the public sector equality duty does not apply to them. However, that duty applies to Ofcom, taking into account the ways in which people with certain characteristics can be affected through the codes of practice and the user empowerment duties that it is enforcing. So it suffuses the thinking there, but the duty is on Ofcom as a public sector body.

We talk later in Clause 12(11) of some of the characteristics that are similar in approach to the protected characteristics in the Equality Act 2010. I will come to that again shortly in response to points made by noble Lords.

I want to say a bit about the idea of there being a cliff edge at the age of 18. This was raised by a number of noble Lords, including the noble Lord, Lord Griffiths, my noble friends Lady Morgan and Lady Harding and the noble Baroness, Lady Kidron. The Bill’s protections recognise that, in law, people become adults when they turn 18—but it is not right to say that there are no protections for young adults. As noble Lords know, the Bill will provide a triple shield of protection, of which the user empowerment duties are the final element.

The Bill already protects young adults from illegal content and content that is prohibited in terms and conditions. As we discussed in the last group, platforms have strong commercial incentives to prohibit content that the majority of their users do not want to see. Our terms of service duties will make sure that they are transparent about and accountable for how they treat this type of content.

Lord Clement-Jones (LD)

My Lords, what distinguishes young adults from older adults in what the Minister is saying?

Lord Parkinson of Whitley Bay (Con)

In law, there is nothing. I am engaging with the point that there is no cliff edge. There are protections for people once they turn 18. People’s tastes and risk appetites may change over time, but there are protections in the Bill for people of all ages.

Lord Clement-Jones (LD)

Surely, this is precisely the point that the noble Baroness, Lady Kidron, was making. As soon as you reach 18, there is no graduation at all. There is no accounting for vulnerable adults.

Lord Parkinson of Whitley Bay (Con)

There is not this cliff edge which noble Lords have feared—that there are protections for children and then, at 18, a free-for-all. There are protections for adult users—young adults, older adults, adults of any age—through the means which I have just set out: namely, the triple shield and the illegal content provisions. I may have confused the noble Lord in my attempt to address the point. The protections are there.

Lord Clement-Jones (LD)

There is an element of circularity to what the Minister is saying. This is precisely why we are arguing for the default option. It allows this vulnerability to be taken account of.

Lord Knight of Weymouth (Lab)

Perhaps it would help if the Minister wanted to just set out the difference for us. Clearly, this Committee has spent some time debating the protection for children, which has a higher bar than protection for adults. It is not possible to argue that there will be no difference at the age of 18, however effective the first two elements of the triple shield are. Maybe the Minister needs to think about coming at it from the point of view of a child becoming an adult, and talk us through what the difference will be.

Lord Parkinson of Whitley Bay (Con)

Once somebody becomes an adult in law at the age of 18, they are protected through the triple shield in the Bill. The user empowerment duties are one element of this, along with the illegal content duties and the protection against content prohibited in terms and conditions and the redress through Ofcom.

The legislation delivers protection for adults in a way that preserves their choice. That is important. At the age of 18, you can choose to go into a bookshop and to encounter this content online if you want. It is not right for the Government to make decisions on behalf of adults about the legal content that they see. The Bill does not set a definition of a vulnerable adult because this would risk treating particular adults differently, or unfairly restricting their access to legal content or their ability to express themselves. There is no established basis on which to do that in relation to vulnerability.

Finally, we remain committed to introducing a new criminal offence to capture communications that intentionally encourage or assist serious self-harm, including eating disorders. This will provide another layer of protection on top of the regulatory framework for both adults and children.

Lord Knight of Weymouth (Lab)

I understand all of that—I think—but that is not the regime being applied to children. It is really clear that children have a safer, better experience. The difference between those experiences suddenly happening on an 18th birthday is what we are concerned about.

Lord Clement-Jones (LD)

Before the Minister stands up—a new phrase—can he confirm that it is perfectly valid to have a choice to lift the user empowerment tool, just as it is to impose it? Choice would still be there if our amendments were accepted.

Lord Parkinson of Whitley Bay (Con)

It would be, but we fear the chilling effect of having the choice imposed on people. As the noble Baroness, Lady Fox, rightly put it, one does not know what one has not encountered until one has engaged with the idea. At the age of 18, people are given the choice to decide what they encounter online. They are given the tools to ensure that they do not encounter it if they do not wish to do so. As the noble Lord has heard me say many times, the strongest protections in the Bill are for children. We have been very clear that the Bill has extra protections for people under the age of 18, and it preserves choice and freedom of expression online for adult users—young and old adults.

My noble friend Lady Buscombe asked about the list in Clause 12(11). We will keep it under constant review and may consider updating it should compelling evidence emerge. As the list covers content that is legal and designed for adults, it is right that it should be updated by primary legislation after a period of parliamentary scrutiny.

Amendments 42 and 38A, tabled by the noble Lords, Lord Stevenson of Balmacara and Lord Knight of Weymouth, respectively, seek to change the scope of user empowerment content features. Amendment 38A seeks to expand the user empowerment content features to include the restriction of content the provenance of which cannot be authenticated. Amendment 42 would apply features to content that is abusive on the basis of characteristics protected under the Equality Act 2010.

The user empowerment content list reflects areas where there is the greatest need for users to be offered choice about reducing their exposure to types of content. While I am sympathetic to the intention behind the amendments, I fear they risk unintended consequences for users’ rights online. The Government’s approach recognises the importance of having clear, enforceable and technically feasible duties that do not infringe users’ rights to free expression. These amendments risk undermining this. For instance, Amendment 38A would require the authentication of the provenance of every piece of content present on a service. This could have severe implications for freedom of expression, given its all-encompassing scope. Faced with that burden, companies might choose not to host any content at all.

Lord Knight of Weymouth (Lab)

I will try to help the Minister. If the amendment has been poorly drafted, I apologise. It does not seek to require a platform to check the provenance of every piece of content, but content that is certified as having good provenance would have priority for me to be able to see it. In the Bill, I can see or not see verified users. In the same way, I could choose to see or not see verified content.

Lord Parkinson of Whitley Bay (Con)

Thank you. I may be reading the noble Lord’s Amendment 38A excessively critically. I will look at it again. To try to reassure the noble Lord, the Bill already ensures that all services take steps to remove illegal manufactured or manipulated content when they become aware of it. Harmful and illegal misinformation and disinformation is covered in that way.

Amendment 42 would require providers to try to establish on a large scale what is a genuinely held belief that is more than an opinion. In response, I fear that providers would excessively apply the user empowerment features to manage that burden.

A number of noble Lords referred to the discrepancy between the list—

Lord Stevenson of Balmacara (Lab)

Several times in the Bill—but this is a clear example—the drafters have chosen to impose a different sequence of words from that which exists in statute. The obvious one here is the Equality Act, which we have touched on before. The noble Baroness, Lady Buscombe, made a number of serious points about that. Why have the Government chosen to list, separately and distinctively, the characteristics which we have also heard, through a different route, the regulator will be required to uphold in respect of the statute, while the companies will be looking to the text of the Bill, when enacted? Is that not just going to cause chaos?

Lord Parkinson of Whitley Bay (Con)

The discrepancy comes from the point we touched on earlier. Ofcom, as a public body, is subject to the public sector equality duty and therefore the list set out in the Equality Act 2010. The list at Clause 12(11) relates to content which is abusive, and is therefore for providers to look at. While the Equality Act has established an understanding of characteristics which should be given special protection in law, it is not necessarily desirable to transpose those across. They too are susceptible to the point made by my noble friend Lady Buscombe about lists set out in statute. If I remember rightly, the Equality Act was part of a wash-up at the end of that Parliament, and whether Parliament debated that Bill as thoroughly as it is debating this one is a moot point.

Lord Stevenson of Balmacara (Lab)

The noble Lord made that point before, and I was going to pick him up on it. It really is not right to classify our legislation by whether it came through in a short or long period. We are spending an awfully long time on this but that is not going to make it any better. I was involved in the Equality Act, and I have the scars on my back to prove it. It is jolly good legislation and has stood the test of time. I do not think the point is answered properly by simply saying that this is a better way of doing it. The Minister said that Clause 12(11) was about abuse targets, but Clause 12(12) is about “hatred against people” and Clause 12(13) is a series of explanatory points. These provisions are all grist to the lawyers. They are not trying to clarify the way we operate this legislation, in my view, to the best benefit of those affected by it.

Lord Parkinson of Whitley Bay (Con)

The content which we have added to Clause 12 is a targeted approach. It reflects input from a wide range of interested parties, with whom we have discussed this, on the areas of content that users are most concerned about. The other protected characteristics that do not appear are, for instance, somebody’s marriage or civil partnership status or whether they are pregnant. We have focused on the areas where there is the greatest need for users to be offered the choice about reducing their exposure to types of content because of the abuse they may get from it. This recognises the importance of clear, enforceable and technically feasible duties. As I said a moment ago in relation to the point made by my noble friend Lady Buscombe, we will keep it under review but it is right that these provisions be debated at length—greater length than I think the Equality Bill was, but that was long before my time in your Lordships’ House, so I defer to the noble Lord’s experience and I am grateful that we are debating them thoroughly today.

I will move now, if I may, to discuss Amendments 43 and 283ZA, tabled by the noble Baroness, Lady Fox of Buckley. Amendment 43 aims to ensure that the user empowerment content features do not capture legitimate debate and discussion, specifically relating to the characteristics set out in subsections (11) and (12). Similarly, her Amendment 283ZA aims to ensure that category 1 services apply the features to content only when they have reasonable grounds to infer that it is user empowerment content.

With regard to both amendments, I can reassure the noble Baroness that upholding users’ rights to free expression is an integral principle of the Bill and it has been accounted for in drafting these duties. We have taken steps to ensure that legitimate online discussion or criticism will not be affected, and that companies make an appropriate judgment on the nature of the content in question. We have done this by setting high thresholds for inclusion in the content categories and through further clarification in the Bill’s Explanatory Notes, which I know she has consulted as well. However, the definition here deliberately sets a high threshold. By targeting only abuse and incitement to hatred, it will avoid capturing content which is merely challenging or robust discussion on controversial topics. Further clarity on definitions will be provided by Ofcom through regulatory guidance, on which it will be required to consult. That will sit alongside Ofcom’s code of practice, which will set out the steps companies can take to fulfil their duties.

Baroness Fox of Buckley (Non-Afl)

I appreciate the Minister’s comments but, as I have tried to indicate, incitement to hatred and abuse, despite people thinking they know what those words mean, is causing huge difficulty legally and in institutions throughout the land. Ofcom will have its work cut out, but it was entirely for that reason that I tabled this amendment. There needs to be an even higher threshold, and this needs to be carefully thought through.

Lord Parkinson of Whitley Bay (Con)

But as I think the noble Baroness understands from that reference, this is a definition already in statute, and with which Parliament and the courts are already engaged.

The Bill’s overarching freedom of expression duties also apply to Clause 12. Subsections (4) to (7) of Clause 18 stipulate that category 1 service providers are required to assess the impact on free expression from their safety policies, including the user empowerment features. This is in addition to the duties in Clause 18(2), which requires all user-to-user services to have particular regard to the importance of protecting freedom of expression when complying with their duties. The noble Baroness’s Amendment 283ZA would require category 1 providers to make judgments on user empowerment content to a similar standard required for illegal content. That would be disproportionate. Clause 170 already specifies how providers must make judgments about whether content is of a particular kind, and therefore in scope of the user empowerment duties. This includes making their judgment based on “all relevant information”. As such, the Bill already ensures that the user empowerment content features will be applied in a proportionate way that will not undermine free speech or hinder legitimate debate online.

Amendment 45, tabled by the noble Lord, Lord Stevenson of Balmacara, would require the Secretary of State to lay a Statement before Parliament outlining whether any of the user empowerment duties should be applied to children. I recognise the significant interest that noble Lords have in applying the Clause 12 duties to children. The Bill already places comprehensive requirements on Part 3 services which children are likely to access. This includes undertaking regular risk assessments of such services, protecting children from harmful content and activity, and putting in place age-appropriate protections. If there is a risk that children will encounter harm, such as self-harm content or through unknown or unverified users contacting them, service providers will need to put in place age-appropriate safety measures. Applying the user empowerment duties for child users runs counter to the Bill’s child safety objectives and may weaken the protections for children—for instance, by giving children an option to see content which is harmful to them or to engage with unknown, unverified users. While we recognise the concerns in this area, for the reasons I have set out, the Government do not agree with the need for this amendment.

I will resist the challenge of the noble Lord, Lord Knight, to talk about bots because I look forward to returning to that in discussing the amendments on future-proofing. With that, I invite noble Lords—

Baroness Kidron (CB)

I noted the points made about the way information is pushed and, in particular, the speech of the right reverend Prelate. Nothing in the Government’s response has really dealt with that concern. Can the Minister say a few words about not the content but the way in which users are enveloped? On the idea that companies always act because they have a commercial imperative not to expose users to harmful material, actually, they have a commercial imperative to spread material and engage users. It is well recorded that a lot of that is in fact harmful material. Can the Minister speak a little more about the features rather than the content?

Lord Parkinson of Whitley Bay (Con)

We will discuss this when it comes to the definition of content in the Bill, which covers features. I was struck by the speech by the right reverend Prelate about the difference between what people encounter online, and the analogy used by the noble Baroness, Lady Fox, about a bookshop. Social media is of a different scale and has different features which make that analogy not a clean or easy one. We will debate in other groups the accumulated threat of features such as algorithms, if the noble Baroness, Lady Kidron, will allow me to go into greater detail then, but I certainly take the points made by both the right reverend Prelate and the noble Baroness, Lady Fox, in their contributions.

Baroness Morgan of Cotes (Con)

My Lords, I thank my noble friend very much indeed, and thank all noble Lords who have taken part. As the noble Lord, Lord Knight, said, this has been an important debate—they are all important, of course—but I think this has really got to the heart of parts of the Bill, parts of why it has been proposed in the first place, and some choices the Government made in their drafting and the changes they have made to the Bill. The right reverend Prelate reminded us, as Bishops always do, of the bigger picture, and he was quite right to do so. There is no equality of arms, as he put it, between most of us as internet users and these enormous companies that are changing, and have changed, our society. My noble friend was right—and I was going to pick up on it too—that the bookshop example given by the noble Baroness, Lady Fox, is, I am afraid, totally misguided. I love bookshops; the point is that I can choose to walk into one or not. If I do not walk into a bookshop, I do not see the books promoting some of the content we have discussed today. If they spill out on to the street where I trip over them, I cannot ignore them. This would be even harder if I were a vulnerable person, as we are going to discuss.

Noble Lords said that this is not a debate about content or freedom of expression, but that it is about features; I think that is right. However, it is a debate about choice, as the noble Lord, Lord Clement-Jones, said. I am grateful to each of those noble Lords who supported my amendments; we have had a good debate on both sets of amendments, which are similar. But as the noble Lord, Lord Griffiths, said, some of the content we are discussing, particularly in subsection (10), relating to suicide, pro-self-harm and pro-anorexia content, has literal life or death repercussions. To those noble Lords, and those outside this House, who seem to think we should not worry and should allow a total free-for-all, I say that we are doing so, in that the Government, in choosing not to adopt such amendments, are making an active choice. I am afraid the Government are condoning the serving up of insidious, deliberately harmful and deliberately dangerous content to our society, to younger people and vulnerable adults. The Minister and the Government would be better off if they said, “That is the choice that we have made”. I find it a really troubling choice because, as many noble Lords will know, I was involved in this Bill a number of years ago—there has been a certain turnover of Culture Secretaries in the last couple of years, and I was one of them. I find the Government’s choice troubling, but it has been made. As the noble Lord, Lord Knight, said, we are treating children differently from how we are treating adults. As drafted, there is a cliff edge at the age of 18. As a society, we should say that there are vulnerabilities among adults, as we do in many walks of life; and exactly as the noble Baroness, Lady Parminter, so powerfully said, there are times when we as a House, as a Parliament, as a society and as a state, should say we want to protect people. There is an offer here in both sets of amendments—I am not precious about which ones we choose—to have that protection.

I will of course withdraw the amendment today, because that is the convention of the House, but I ask my noble friend to reflect on the strength of feeling expressed by the House on this today; I think the Whip on the Bench will report as well. I am certain we will return to this on Report, probably with a unified set of amendments. In the algorithmic debate we will return to, the Government will have to explain, in words of one syllable, to those outside this House who worry about the vulnerable they work with or look after, about the choice that the Government have made in not offering protections when they could have done, in relation to these enormously powerful platforms and the insidious content they serve up repeatedly.