Moved by
32: Clause 6, page 5, line 29, at end insert—
“(ba) the duties about assessments related to adult user empowerment set out in section (Assessment duties: user empowerment),”
Member’s explanatory statement
This amendment ensures that the new duties in the new Clause proposed after Clause 11 in my name are imposed on providers of Category 1 services.
--- Later in debate ---
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, as noble Lords will be aware, the Government removed the legal but harmful provisions from the Bill in another place, given concerns about freedom of expression. I know that many noble Lords would not have taken that approach, but I am grateful for their recognition of the will of the elected House in this regard as well as for their constructive contributions about ways of strengthening the Bill while continuing to respect that.

I am therefore glad to bring forward a package of amendments tabled in my name relating to adult safety. Among other things, these strengthen our existing approach to user empowerment and terms of service by rebalancing the power over the content adults see and interact with online, moving the choice away from unaccountable technology companies and towards individual users.

First, we are introducing a number of amendments, which I am pleased to say have the support of the Opposition Front Bench, that will place a comprehensive duty on category 1 providers to carry out a full assessment of the incidence of user empowerment content on their services. The amendments will mean that platforms can be held to account by Ofcom and their users when they fail to assess the incidence of this kind of content on their services or when they fail to offer their users an appropriate ability to control whether or not they view it.

Amendments 19 to 21 and 26—I am grateful to noble Lords opposite for putting their names to them—will strengthen the user empowerment content duty. Category 1 providers will now need proactively to ask their registered adult users how they would like the control features to be applied. We believe that these amendments achieve two important aims that your Lordships have been seeking from these duties: first, they make the control features more visible to registered adult users; and, secondly, they offer better protection for young adult users.

Amendments 55 and 56, tabled by the noble Lord, Lord Clement-Jones, my noble friend Lord Moylan and the noble Baroness, Lady Fox of Buckley, seek to provide users with a choice over how the tools are applied for each category of content set out in Clause 12(10), (11) and (12). The legislation gives platforms the flexibility to decide what tools they offer in compliance with Clause 12(2). A blanket approach is unlikely to be consistent with the duty on category 1 services to have particular regard to the importance of protecting users’ freedom of expression when putting these features in place. Additionally, the measures that Ofcom will recommend in its code of practice must consider the impact on freedom of expression, so they too are unlikely to amount to a blanket approach.

Amendments 58 and 63 would require providers to set and enforce consistent terms of service on how they identify the categories of content to which Clause 12(2) applies; and to apply the features to content only when they have reasonable grounds to infer that it is user empowerment content. I assure noble Lords that the Bill’s freedom of expression duties will prevent providers overapplying the features or adopting an inconsistent or capricious approach. If they do, Ofcom can take enforcement action.

Amendments 59, 64 and 181, tabled by the noble Lord, Lord Clement-Jones, seek to require that the user empowerment and user verification features are provided at no cost. I reassure the noble Lord that the effect of these amendments is already achieved by the drafting of Clause 12. Category 1 providers will be compliant with their duties only if they proactively ask all registered users whether or not they want to use the user empowerment content features, which would not be possible with a paywall. Amendment 181 is similar and applies to user verification. While the Bill does not specify that verification must be free of charge, category 1 providers can meet the duties in the Bill only by offering all adult users the option to verify themselves.

Turning to Amendment 204, tabled by the noble Baroness, Lady Finlay of Llandaff, I share her concern about the impact that self-harm and suicide content can have. However, as I said in Committee, the Bill goes a long way to provide protections for both children and adults from this content. First, it includes the new criminal offence of encouraging or assisting self-harm. This then feeds through into the Bill’s illegal content duties. Companies will be required to take down such content when it is reported to them by users.

Beyond the illegal content duties, there are specific protections in place for children. The Government have tabled amendments designating content that encourages, promotes or provides instructions for suicide, self-harm and eating disorders as a category of primary priority content, meaning that services will have to prevent children of all ages from encountering it. For adults, the Government listened to concerns and, as mentioned, have strengthened the user empowerment duties to make it easier for adult users to opt in to using them by offering a forced choice. We have made a careful decision, however, to balance these protections with users’ right to freedom of expression and therefore cannot require platforms to treat legal content accessed by adults in a prescribed way. That is why, although I share the noble Baroness’s concerns about the type of content that she mentions, I cannot accept her amendment and hope that she will agree.

The Bill’s existing duties require category 1 platforms to offer users the ability to verify their identity. Clause 12 requires category 1 platforms to offer users the ability to filter out users who have not verified their identity. Amendment 183 from my noble friend Lord Moylan seeks to give Ofcom the discretion to decide when it is and is not proportionate for category 1 services to offer users the ability to verify their identity. We do not believe that these duties will be excessively burdensome, given that they will apply only to category 1 companies, which have the resource and capacity to offer such tools.

Amendment 182 would require platforms to offer users the option to make their verification status visible. The existing duty in Clause 57, in combination with the duty in Clause 12, will already provide significant protections for adults from anonymous abuse. Adult users will now be able to verify their own status and decide to interact only with other verified users, whether or not their status is visible. We do not believe that this amendment would provide additional protections.

The Government carefully considered mandating that all users display their verification status. This might improve safety for some users, but it would be detrimental to vulnerable users, who may need to remain anonymous for perfectly justifiable reasons. Further government amendments in my name will expand the types of information that Ofcom can require category 1, 2A and 2B providers to publish in their transparency reports in relation to user empowerment content.

Separately, but also related to transparency, government Amendments 189 and 202 make changes to Clause 67 and Schedule 8. These relate to category 1 providers’ duties to create clear and accessible terms of service and apply them consistently and transparently. Our amendments tighten these parts of the Bill so that all the terms through which providers might indicate that a certain type of content is not allowed on their service are captured by these duties.

I hope that noble Lords will therefore accept the Government amendments in this group and that my anticipatory remarks about their amendments will give them some food for thought as they make their contributions. I beg to move.

--- Later in debate ---
Baroness Fox of Buckley (Non-Afl)

My Lords, I am happy to acknowledge and recognise what the Government did when they created user empowerment duties to replace “legal but harmful”. I think they were trying to counter the dangers of over-paternalism and illiberalism in provisions that obliged providers to protect adult users from content that would allegedly cause them harm.

At least the new provisions brought into the Bill have a completely different philosophy. They enhance users’ freedom as individuals, allowing them to apply voluntary content filters and exercise freedom of choice, on the principle that adults can make decisions for themselves.

In case anyone panics, I am not making a philosophical speech. I am reminding the Government that that is what they said to us—to everybody: “We are getting rid of legal but harmful because we believe in this principle”. I am worried that some of the amendments seem to be trying to backtrack from that different basis of the Bill—and that more liberal philosophy—and go back to the old “legal but harmful” approach. I say to the noble Lord, Lord Allan of Hallam, that the cat is distinctly not dead.

The purpose of Amendment 56 is to ensure that providers likewise cannot thwart the purpose of Clause 12 by making it more censorious and paternalistic. I am not convinced that the Government needed to compromise on this, as I think Amendment 60 just muddies the waters and fudges the important principle that the Government themselves originally established.

Amendment 56 says that the default must be no filtering at all. Then users have to make an active decision to switch on the filtering. The default is that you should be exposed to a full flow of ideas and, if you do not want that, you have to actively decide not to and say that you want a bowdlerised or sanitised version.

Amendment 56 takes it a bit further, in paragraph (b), applying different levels of filtering to content of democratic importance and journalistic content. In the Bill itself, the Government accept the exceptional nature of those categories of content, and this just allows users to do the same and say, “No; I might want to filter some things out but bear in mind the exceptional importance of democratic and journalistic content”. I worry that the government amendments signal to users that certain ideas are dangerous and must be hidden. That is my big concern. In other words, the signal is that they might be legal but they are harmful: that is what I think these amendments try to counter.

One of the things that worries me about the Bill is the danger of echo chambers. I know we are concentrating on harms, but I think echo chambers are harmful. I started today quite early at Blue Orchid at 55 Broadway with a big crowd of sixth formers involved in debating matters. I complimented Keir Starmer on his speech on the importance of oracy and encouraging young people to speak. I stressed to all the year 12 and year 13 young people that the important thing was that they spoke out but also that they listened to contrary opinions and got out of their safe spaces and echo chambers. They were debating very difficult topics such as commercial surrogacy, cancel culture and the risks of contact sports. I am saying all that to them and then I am thinking, “We have now got a piece of legislation that says you can filter out all the stuff you do not want to hear and create your own safe space”. So I just get anxious that we do not inadvertently encourage in the young—I know this is for all adults—that antidemocratic tendency to not want to hear what you do not want to hear, even when it would be good to hear as many opinions as possible.

I also want to press the Minister on the problem of filtering material that targets race, religion, sex, sexual orientation, disability and gender reassignment. I keep trying to raise the problem that overzealous filtering could lead to diverse philosophical views around those subjects also being removed. You might think that you know what you are asking to be filtered out. If you say you want to filter out material that is anti-religion, you might not mean that you want no debates on religious tolerance. For example, there was the major controversy over the film “The Lady of Heaven”. I know the Minister was interested, as I was, in the dangers of censorship in relation to that. You would not want, having said “Don’t target me for my religion”, to find that you could not access that debate.

I think there is a danger that we are handing a lot of power to filterers to make decisions based on their values, when we are not clear what those values are. Look at what has happened with the banks in the last few days: they have closed down people’s bank accounts because they disagree with their customers’ values. Again, we say “Don’t target on race”, but I have been having lots of arguments with people recently who have accused the Government, through their Illegal Migration Bill, of being racist. I think we just need to be sure that we are not accepting an ideological filtering of what we see.

Amendment 63 is key because it requires providers’ terms of service to include provisions about how content to which Clause 12(2) applies is identified, precisely to try to counter these problems. It imposes a duty on providers to apply those provisions consistently, as the noble Lord, Lord Moylan, explained. The point that providers have to set out how they identify content that is, for example, allegedly hostile to religion or racially abusive is important because this is about empowering users. Users need to know whether this will be done by machine learning or by a human. Do they look for red flags and, if so, what are the red flags? How are these things decided? That means that providers have to state clearly, and be accountable for, their definition of any criteria that could justify them filtering out and disturbing the flow of democratic information. It is all about transparency and accountability in that sense.

Finally, in relation to Amendment 183, I am worried about the notion of filtering out content from unverified users, for a range of reasons. It somehow indicates that there is a direct link between being unverified or anonymous and being harmful or dodgy, which I think is illegitimate. It has already been explained that there will be a detrimental impact on certain organisations—we have talked about Reddit, but I like to remember Mumsnet. There are quite a lot of organisations with community-centred models, where the structure is that influencers broadcast to their followers and where there are pseudonymous users. Is the requirement to filter out those contributors likely to lead to those models collapsing? I need to be reassured on this because I am not convinced at all. As has been pointed out, there would be a two-tier internet, because those who are unable or unwilling to disclose their identity online or to be verified could be shut out from public discussions. That is a very dangerous place to end up, even though I am sure it is not what the Government intend.

Lord Parkinson of Whitley Bay (Con)

My Lords, I am grateful for the broad, if not universal, support for the amendments that we have brought forward following the points raised in Committee. I apologise for anticipating noble Lords’ arguments, but I am happy to expand on my remarks in light of what they have said.

My noble friend Lord Moylan raised the question of non-verified user duties and crowdsourced platforms. The Government recognise concerns about how the non-verified user duties will work with different functionalities and platforms, and we have engaged extensively on this issue. These duties are applicable only to category 1 platforms, those with the largest reach and influence over public discourse. It is therefore right that such platforms have additional duties to empower their adult users. We anticipate that these features will be used in circumstances where vulnerable adults wish to shield themselves from anonymous abuse. If users feel that these features restrict their experience on a particular platform, they can simply choose not to use them. In addition, before these duties come into force, Ofcom will be required to consult affected providers regarding the codes of practice, at which point it will consider how these duties might interact with various functionalities.

My noble friend and the noble Lord, Lord Allan of Hallam, raised the potential for being bombarded with pop-ups because of the forced-choice approach that we have taken. These amendments have been carefully drafted to minimise unnecessary prompts or pop-ups. That is why we have specified that the requirement to proactively ask users how they want these tools to be applied is applicable only to registered users. This approach ensures that users will be prompted to make a decision only once, unless they choose to ignore it. After a decision has been made, the provider should save this preference and the user should not be prompted to make the choice again.

The noble Lord, Lord Clement-Jones, talked further about his amendments on the cost of user empowerment tools as a core safety duty in the Bill. Category 1 providers will not be able to put the user empowerment tools in Clause 12 behind a paywall and still be compliant with their duties. That is because they will need to offer them to users at the first possible opportunity, which they will be unable to do if the tools are behind a paywall. The wording of Clause 12(2) makes it clear that providers have a duty to include user empowerment features that an adult user may use or apply.

Lord Allan of Hallam (LD)

The Minister may not have the information today, but I would be happy to get it in writing. Can he clarify exactly what will be expected of a service that already prohibits all the Clause 12 bad stuff in their terms of service?

Lord Parkinson of Whitley Bay

I will happily write to the noble Lord on that.

Clause 12(4) further sets out that all such user empowerment content tools must be made available to all adult users and be easy to access.

The noble Lord, Lord Clement-Jones, on behalf of the noble Baroness, Lady Finlay, talked about people who will seek out suicide, self-harm or eating-disorder content. While the Bill will not prevent adults from seeking out legal content, it will introduce significant protections for adults from some of the most harmful content. The duties relating to category 1 services’ terms of service are expected hugely to improve companies’ own policing of their sites. Where this content is legal and in breach of the company’s terms of service, the Bill will force the company to take it down.

We are going even further by introducing a new user empowerment content-assessment duty. This will mean that where content relates to eating disorders, for instance, but is not illegal, category 1 providers will need fully to assess the incidence of this content on their service. They will need to publish this information clearly in accessible terms of service, so users will be able to find out what they can expect on a particular service. Alternatively, if they choose to allow suicide, self-harm or eating disorder content which falls within the definition set out in Clause 12, they will need proactively to ask users how they would like the user empowerment content features to be applied.

My noble friend Lady Morgan was right to raise the impact on vulnerable people and people with disabilities. While we anticipate that the changes we have made will benefit all adult users, we expect them particularly to benefit those who may otherwise have found it difficult to find and use the user empowerment content features independently—for instance, some users with certain types of disability. That is because the onus will now be on category 1 providers proactively to ask their registered adult users whether they would like these tools to be applied at the first possible opportunity. The requirement also remains to ensure that the tools are easy to access and to set out clearly what tools are on offer and how users can take advantage of them.

--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, does the Minister have any more to say on identity verification?

Lord Parkinson of Whitley Bay (Con)

I am being encouraged to be brief so, if I may, I will write to the noble Lord on that point.

Amendment 32 agreed.
--- Later in debate ---
Moved by
33: Clause 6, page 5, line 37, leave out “duty about record-keeping set out in section 19(9)” and insert “duties about record-keeping set out in section 19(8A) and (9)”
Member’s explanatory statement
This amendment ensures that the new duties in Clause 19 proposed by amendments in my name to that clause are imposed on providers of Category 1 services.
--- Later in debate ---
Moved by
34: Clause 10, page 9, line 13, after “8” insert “and, in the case of services likely to be accessed by children which are Category 1 services, the duties about assessments set out in section (Assessment duties: user empowerment)”
Member’s explanatory statement
This amendment inserts a signpost to the new duties imposed on providers of Category 1 services by the new Clause proposed after Clause 11 in my name.
Lord Parkinson of Whitley Bay (Con)

My Lords, I will speak to the government amendments now but not anticipate the non-government amendments in this group.

As noble Lords know, protecting children is a key priority for this Bill. We have listened to concerns raised across your Lordships’ House about ensuring that it includes the most robust protections for children, particularly from harmful content such as pornography. We also recognise the strength of feeling about ensuring the effective use of age-assurance measures, by which we mean age verification and age estimation, given the important role they will have in keeping children safe online.

I thank the noble Baroness, Lady Kidron, and my noble friends Lady Harding of Winscombe and Lord Bethell in particular for their continued collaboration over the past few months on these issues. I am very glad to have tabled a significant package of amendments on age assurance. These are designed to ensure that children are prevented from accessing pornography, whether it is published by providers in scope of the Part 5 duties or allowed by user-to-user services that are subject to Part 3 duties. The Bill will be explicit that services will need to use highly effective age verification or age estimation to meet these new duties.

These amendments will also ensure that there is a clear, privacy-preserving and future-proof framework governing the use of age assurance, which will be overseen by Ofcom. Our amendments will, for the first time, explicitly require relevant providers to use age verification or age estimation to protect children from pornography. Publishers of pornographic content, which are regulated in Part 5, will need to use age verification or age estimation to ensure that children are not normally able to encounter content which is regulated provider pornographic content on their service.

Further amendments will ensure that, where such tools are proactive technology, Ofcom may also require their use for Part 5 providers to ensure compliance. Amendments 279 and 280 make further definitional changes to proactive technology to ensure that it can be recommended or required for this purpose. To ensure parity across all regulated pornographic content in the Bill, user-to-user providers which allow pornography under their terms of service will also need to use age verification or age estimation to prevent children encountering pornography where they identify such content on their service. Providers covered by the new duties will also need to ensure that their use of these measures meets a clear, objective and high bar for effectiveness. They will need to be highly effective at correctly determining whether a particular user is a child. This new bar will achieve the intended outcome behind the amendments which we looked at in Committee, seeking to introduce a standard of “beyond reasonable doubt” for age assurance for pornography, while avoiding the risk of legal challenge or inadvertent loopholes.

To ensure that providers are using measures which meet this new bar, the amendments will also require Ofcom to set out, in its guidance for Part 5 providers, examples of age-verification and age-estimation measures which are highly effective in determining whether a particular user is a child. Similarly, in codes of practice for Part 3 providers, Ofcom will need to recommend age-verification or age-estimation measures which can be used to meet the new duty to use highly effective age assurance. This will meet the intent of amendments tabled in Committee seeking to require providers to use measures in a manner approved by Ofcom.

I confirm that the new requirement for Part 3 providers will apply to all categories of primary priority content that is harmful to children, not just pornography. This will mean that providers which allow content promoting or glorifying suicide, self-harm and eating disorders will also be required to use age verification or age estimation to protect children where they identify such content on their service.

Further amendments clarify that a provider can conclude that children cannot access a service—and therefore that the service is not subject to the relevant children’s safety duty—only if it uses age verification or age estimation to ensure that children are not normally able to access the service. This will ensure consistency with the new duties on Part 3 providers to use these measures to prevent children’s access to primary priority content. Amendment 34 inserts a reference to the new user empowerment duties imposed on category 1 providers in the child safety duties.

Amendment 214 will require Part 5 providers to publish a publicly available summary of the age-verification or age-estimation measures that they are using to ensure that children are not normally able to encounter content that is regulated provider pornographic content on their service. This will increase transparency for users on the measures that providers are using to protect children. It also aligns the duties on Part 5 providers with the existing duties on Part 3 providers to include clear information in terms of service on child protection measures or, for search engines, a publicly available statement on such measures.

I thank the noble Baroness, Lady Kidron, for her tireless work relating to Amendment 124, which sets out a list of age-assurance principles. This amendment clearly sets out the important considerations around the use of age-assurance technologies, which Ofcom must have regard to when producing its codes of practice. Amendment 216 sets out the subset of principles which apply to Part 5 guidance. Together, these amendments ensure that providers are deploying age-assurance technologies in an appropriate manner. These principles appear as a full list in Schedule 4. This ensures that the principles can be found together in one place in the Bill. The wider duties set out in the Bill ensure that the same high standards apply to both Part 3 and Part 5 providers. These principles have been carefully drafted to avoid restating existing duties in the Bill. In accordance with good legislative drafting practice, the principles also do not include reference to other legislation which already directly applies to providers. In its relevant guidance and codes, however, Ofcom may include such references as it deems appropriate.

Finally, I highlight the critical importance of ensuring that users’ privacy is protected throughout the age-assurance processes. I make it clear that privacy has been represented in these principles to the furthest degree possible, by referring to the strong safeguards for user privacy already set out in the Bill.

In recognition of these new principles and to avoid duplication, Amendment 127 requires Ofcom to refer to the age-assurance principles, rather than to the proactive technology principles, when recommending age-assurance technologies that are also proactive technology.

We have listened to the points raised by noble Lords about the importance of having clear and robust definitions in the Bill for age assurance, age verification and age estimation. Amendment 277 brings forward those definitions. We have also made it clear that self-declared age, without additional, more robust measures, is not to be regarded as age verification or age estimation for compliance with duties set out in the Bill. Amendment 278 aligns the definition of proactive technology with these new definitions.

The Government are clear that the Bill’s protections must be implemented as quickly as is feasible. This entails a complex programme of work for the Government and Ofcom, as well as robust parliamentary scrutiny of many parts of the regime. All of this will take time to deliver. It is right, however, that we set clear expectations for when the most pressing parts of the regulation—those targeting illegal content and protecting children—should be in place. These amendments create an 18-month statutory deadline from the day the Bill is passed for Ofcom’s implementation of those areas. By this point, Ofcom must submit draft codes of practice to the Secretary of State to be laid in Parliament and publish its final guidance relating to illegal content duties, duties about content harmful to children and duties about pornography content in Part 5. This also includes relevant cross-cutting duties, such as content reporting procedures, which are relevant to illegal content and content harmful to children.

In line with convention, most of the Bill’s substantive provisions will be commenced two months after Royal Assent. These amendments ensure that a set of specific clauses will commence earlier—on the day of Royal Assent—allowing Ofcom to begin vital implementation work sooner than it otherwise would have done. Commencing these clauses early will enable Ofcom to launch its consultation on draft codes of practice for illegal content duties shortly after Royal Assent.

Amendment 271 introduces a new duty on Ofcom to produce and publish a report on in-scope providers’ use of age-assurance technologies, and for this to be done within 18 months of the first date on which both Clauses 11 and 72(2), on pornography duties, are in force. I thank the noble Lord, Lord Allan of Hallam, for the amendment he proposed in Committee, to which this amendment responds. We believe that this amendment will improve transparency in how age-assurance solutions are being deployed by providers, and the effectiveness of those solutions.

Finally, we are also making a number of consequential and technical amendments to the Bill to split Clauses 11 and 25 into two parts. This is to ensure these do not become unwieldy and that the duties are clear for providers and for Ofcom. I beg to move.

Debate on Amendment 34 adjourned.