Lord Stevenson of Balmacara (Lab)

My Lords, I am grateful to hear what the Minister has just announced. The scheme originally prefigured in the pre-legislative scrutiny report now has some chance of being delivered. I think the process and procedures are quite appropriate; it does need review and thought. Account needs to be taken of practice on the ground, of how people have found the new system working, and of whether there are gaps that can be filled this way. I give my full support to the proposal, and I am very glad to see it.

Having got to the Dispatch Box early, I will just appeal to our small but very important group. We are on the last day on Report. We are reaching a number of issues where lots of debate has taken place in Committee. I think it would be quite a nice surprise for us all if we were to get through this quickly. The only way to do that is by restricting our contributions.

Baroness Morgan of Cotes (Con)

My Lords, I will speak briefly to Amendments 272AA and 274AA, only because at the previous stage of the Bill I tabled amendments relating to the reporting of illegal content and fraudulent advertisements, covering reporting, complaints and transparency. I have not re-tabled them here, but I have had conversations with my noble friend the Minister. It is still unclear to those in the House and outside why the provisions relating to that type of reporting would not apply to fraudulent advertisements, particularly given that the more information that can be filed about these scams and fraudulent advertisements, the easier it would be for the platforms to gather information and help users and others start to crack down on them. I wonder whether, when he sums up, my noble friend could say something about the reporting provisions relating to fraudulent advertisements generally, and in particular about general reporting and reporting relating to complaints by users.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

My Lords, as I was eagerly anticipating, government Amendments 238A and 238D seek to change the parliamentary process for laying the first regulations specifying the category 1 threshold conditions from the negative to the affirmative procedure. I am pleased to bring forward this change in response to the recommendation of your Lordships’ Delegated Powers and Regulatory Reform Committee.

The change will ensure that there are adequate levels of parliamentary scrutiny of the first regulations specifying the category 1 threshold conditions. This is appropriate given that the categorisation of category 1 services will lead to the most substantial duties on the largest and most influential services. As noble Lords are aware, these include the duties on user empowerment, user identity verification, journalistic and news publisher content, content of democratic importance, and fraudulent advertising.

Category 2A services will have only additional transparency and fraudulent advertising duties, and category 2B services will be subject only to additional transparency reporting duties. The burden of these duties is significantly less than the additional category 1 duties, and we have therefore retained the use of the negative resolution procedure for these regulations, as they require less parliamentary scrutiny.

Future changes to the category 1 threshold conditions will also use the negative procedure. This will ensure that the regime remains agile in responding to change, which I know was of particular concern to noble Lords when we debated the categorisation group in Committee. Keeping the negative procedure for such subsequent uses will avoid the risk of future changes being subject to delays because of parliamentary scheduling. I beg to move.

Baroness Morgan of Cotes (Con)

My Lords, I shall speak to Amendment 245. I would like to thank my noble friend the Minister, and also the Minister on leave, for the conversations that I have had with them about this amendment and related issues. As we have already heard, the platform categorisation is extremely important. So far, much of it is unknown, including which sites are actually going to be in which categories. For example, we have not yet seen any proposed secondary regulations. As my noble friend has just outlined, special duties apply, especially for those sites within category 1—user empowerment in particular, but also other duties relating to content and fraudulent advertisements.

Clause 85 and Schedule 11 set out the thresholds for determining which sites will be in category 1, category 2A or category 2B. I am very mindful of the exhortation of the noble Lord, Lord Stevenson, about being brief, but it is amazing how much you have to say about one word to explain this amendment. This amendment proposes to change an “and” to an “or” in relation to determining which sites would fall within category 1. It would move from a test of size “and” functionality to a test of size “or” functionality. This would give Ofcom more flexibility to decide which platforms really need category 1 designation. Category 1 should not be decided just on size; it should also be possible to determine it on the basis of functionality.

Functionality is defined in the Bill in Clause 208. We will get to those amendments shortly, but there is no doubt from what the Government have already conceded, or agreed with those of us who have been campaigning passionately on the Bill for a number of years, that functionality can make a platform harmful. It is perfectly possible to have small platforms that both carry highly harmful content and themselves become harmful in the way that they are designed. We have heard many examples and I will not detain the House with them, but I draw attention to two particular sites which capture how broad this is. The perpetrators of offline hate crimes are often linked to these small platforms. For example, the perpetrator of the 2018 Tree of Life synagogue mass shooting had an online presence on the right-wing extremist social network Gab. In the UK, Jake Davison, the self-proclaimed incel who killed five people in Plymouth in 2021, frequented smaller incel forums after he was banned from Reddit in the days leading up to the mass shooting.

I also want to share with noble Lords an email that I received just this week from a family who had been to see their Member of Parliament, Matt Rodda MP, and the noble Baroness, Lady Kidron, who I know is very regretful that she cannot be here today. I thank Victoria and Jean Eustace for sharing the story of their sister and daughter. Victoria wrote: “I am writing to you regarding the Online Safety Bill, as my family and I are concerned it will not sufficiently protect vulnerable adults from harm. My sister, Zoe Lyalle, killed herself on 26 May 2020, having been pointed towards a method using an online forum called Sanctioned Suicide. Zoe was 18 years old at the time of her death and as such technically an adult, but she was autistic, so she was emotionally less mature than many 18-year-olds. She found it difficult to critically analyse written content”. She says that “The forum in question is not large and states on its face that it does not encourage suicide, although its content does just that”. I was even more shocked by the next part: “Since Zoe’s death, we have accessed her email account. The forum continues to email Zoe, providing her with updates on content she may have missed while away from the site, as well as requesting donations. One recent email included a link to a thread on the forum containing tips on how best to use the precise method that Zoe had employed”.

In her note to me, the Minister on leave said that she wanted to catch some of the platforms we are talking about with outsized influence. In my reply, I said that those sites on which people are encouraged to take their own lives or become radicalised and therefore take the harms they are seeing online into the real world undoubtedly exercise influence and should be tackled.

It is also perfectly possible for us to have large but safe platforms. I know that my noble friend Lord Moylan may want to discuss this in relation to sites that he has talked about already on this Bill. The risk of the current drafting is a flight of users from these large platforms, newly categorised as category 1, to the small, non-category 1 platforms. What if a platform becomes extremely harmful very quickly? How will it be recategorised speedily but fairly and involving parliamentary oversight?

The Government have run a variety of arguments as to why the “and” in the Bill should not become an “or”. They say that it creates legal uncertainty. Every Bill creates legal uncertainty; that is why we have an army of extremely highly paid lawyers, not just in this country but around the world. They say that what we are talking about is broader than illegal content or content related to children’s safety, but they have already accepted an earlier amendment on safety by design and, in subsections (10) to (12) of Clause 12, that specific extra protections should be available for content related to

“suicide or an act of deliberate self-injury, or … an eating disorder or behaviours associated with an eating disorder”

or abusive content relating to race, religion, sex, sexual orientation, disability or gender reassignment and that:

“Content is within this subsection if it incites hatred against people”.


The Government have therefore already breached their own limit of addressing only content that is illegal or relates to child safety duties. In fact, they have agreed that that content should have enhanced triple-shield protection.

The Government have also said that they want to avoid burdens on small but low-harm platforms. I agree with that, but with an “or” it would be perfectly possible for Ofcom to decide by looking at size or functionality and to exclude those smaller platforms that do not present the harm we all care about. The Minister may also offer me a review of categorisation; however, it is a review of the tiers of categorisation and not the sites within the categories, which I think many of us will have views on over the years.

I come to what we should do on this final day of Report. I am very thankful to those who have had many conversations on this, but there is a fundamental difference of opinion in this House on these matters. We will talk about functionality shortly and I am mindful of the pre-legislative scrutiny committee’s recommendation that this legislation should adopt

“a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”.

There should be other factors. Ofcom should have the ability to decide whether it takes one factor or another, rather than requiring a whole series of thresholds to be passed, so that it has maximum flexibility. I will listen very carefully to what my noble friend the Minister and other noble Lords say, but at this moment I intend to test the opinion of the House on this amendment.

Baroness Finlay of Llandaff (CB)

My Lords, I strongly support Amendment 245. The noble Baroness, Lady Morgan of Cotes, has explained the nub of the problem we are facing—that size and functionality are quite separate. You can have large sites that perform a major social function and are extremely useful across society. Counter to that, you can have a small site focused on being very harmful to a small group of people. The problem is that, without providing the flexibility to Ofcom to determine how the risk assessment should be conducted, the Bill would lock it into leaving these small, very harmful platforms able to pursue their potentially ever-increasingly harmful activities almost out of sight. It does nothing to make sure that their risk assessments are appropriate.

--- Later in debate ---
Moved by
245: Schedule 11, page 223, line 32, leave out “and” and insert “or”
Baroness Morgan of Cotes (Con)

My Lords, I wish to test the opinion of the House and I beg to move.

--- Later in debate ---
It is late, and it has been a long Report stage. I will listen very carefully to what my noble friend the Minister has to say. I really hope that the Bill can continue to progress in this collaborative way. It would be an awful shame if, at the end of a long Report stage, we did not recognise that we are trying to solve the same problem and find a way through. I beg to move.
Baroness Morgan of Cotes (Con)

My Lords, the hour is late and I will not detain the House for long. However, I hope that the fact that we are all still sitting here at the end of a long Report stage, because we care very much about the Bill and what we are trying to achieve, will be noted by my noble friend the Minister, his officials and others who are watching. I thank my noble friend Lady Harding for so ably introducing the amendments, which I absolutely support. I was, perhaps for the first time, going to agree with something the noble Baroness, Lady Fox, said a day or so ago: that one thing we and Ofcom need to do much better is to understand the transparency of the algorithms. It is not just algorithms—this is where my knowledge ends—but other design features that make these sites addictive and harmful, and which are outside content. The Bill will not be capable of addressing even the next five years, let alone beyond that, if we do not reflect the fact that, as my noble friend Lady Harding said, it has already been amended so that one way its objectives are to be achieved is by services being required to focus on safety by design.

I hope very much that my noble friend will take up the invitation, because everybody is tired and has been looking at this Bill for so many hours and months that we are probably all word-blind. We could all do with standing back and thinking, “With the amendments made, how does it all hang together so that ultimately, we keep those we want to keep safe as safe as we possibly can?” On that basis, I support these amendments and look forward to hearing further from the Government about how they hope to keep safe those we all wish to keep safe.

Baroness Benjamin (LD)

My Lords, I rise to support the amendment in the name of the noble Baroness, Lady Kidron. She has been such a forceful voice throughout the passage of this Bill, driven by her passion to protect children, and no more so than with the amendment in her name. That is why I feel compelled to speak up to support her. So far, we have all worked with the Government to see the safe passage of the Online Safety Bill, with strong protections for children. These amendments would be yet another excellent and unique opportunity to protect children. This is what we have been fighting for for years, and it is so uplifting that the Government have listened to us throughout the passage of this Bill—so why stop now? If the Government are saying that the Bill is being clear about harms, they should have no objection to making it explicit.

These amendments press for safety by design to be embedded in later clauses of the Bill and go hand in hand with the earlier amendment that the House so clearly supported. It is clear that the design of services and algorithms is responsible for orchestrating and manipulating the behaviour, feelings, emotions and thoughts of children who, because they are at a vulnerable stage in their development, are easily influenced. We have all witnessed the disastrous impact of the new technology which is fast encroaching upon us, and our children will not be spared from it. So it is imperative that Ofcom have the tools with which to consider and interrogate system design separately from content because, as has been said, it is not only content that is harmful: design is too. We therefore need to take a holistic approach and leave nowhere to hide for the tech companies when it comes to harms affecting our children.

As I have said before, these amendments would send a loud and clear message to the industry that it is responsible for the design of its products and has to think of the consequences for our children’s mental health and well-being when considering design. What better way to do that than for the Government to accept these amendments, in order to show that they are on the side of our children, not the global tech companies, when it comes to protecting them from harm? They need to put measures in place to ensure that the way a service is designed is subject to the online safety regime we have all fought for over the years and during the passage of this Bill.

If the Government do not accept the amendment, perhaps the issue of harmful design could be included in the welcome proposed review of pornography. It would be good to hear the Minister’s thoughts on this idea—but I am not giving him a let-off. I hope he will listen to the strength of feeling and that the Government will reconsider their position, support the amendment and complete the one main task they set out to complete with this Bill, which is to protect children from harm no matter where it rears its ugly head online.