Baroness Morgan of Cotes debates involving the Department for Digital, Culture, Media & Sport during the 2019 Parliament

Wed 6th Sep 2023
Wed 19th Jul 2023
Wed 12th Jul 2023
Mon 10th Jul 2023: Online Safety Bill, Lords Chamber (Report stage: Part 1)
Thu 6th Jul 2023: Online Safety Bill, Lords Chamber (Report stage: Part 3)
Thu 6th Jul 2023: Online Safety Bill, Lords Chamber (Report stage: Part 1 & Report stage: Minutes of Proceedings)
Thu 22nd Jun 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Thu 25th May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Tue 16th May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)

Online Safety Bill

Viscount Colville of Culross (CB)

My Lords, I, too, thank the Minister for the great improvements that the Government have made to the Secretary of State’s powers in the Bill during its passage through this House. I rise to speak briefly today to praise the Government’s new Amendments 1 and 2 to Clause 44. As a journalist, I was worried by the lack of transparency around these powers in the clause; I am glad that the lessons of Section 94 of the Telecommunications Act 1984, which had to be rescinded, have been learned. In a world of conspiracy theories that can damage public trust in government and in regulatory processes, it has never been more important that Parliament and the public are informed about the actions of government when it gives directions to Ofcom about the draft codes of practice. So I am glad that these new amendments resolve those concerns.

Baroness Morgan of Cotes (Con)

My Lords, I welcome Amendments 5 and 6, as well as the amendments that reflect the work done and comments made in earlier stages of this debate by the noble Baroness, Lady Kennedy. Of course, we are not quite there yet with this Bill, but we are well on the way as this is the Bill’s last formal stage in this Chamber before it goes back to the House of Commons.

Amendments 5 and 6 relate to the categorisation of platforms. I do not want to steal my noble friend’s thunder, but I echo the comments made about the engagement both from my noble friend the Minister and from the Secretary of State. I am delighted that the indications I have received are that they will accept the amendment to Schedule 11, which this House voted on just before the Recess; that is a significant and extremely welcome change.

When commentators outside talk about the work of a revising Chamber, I hope that this Bill will be used as a model for cross-party, non-partisan engagement in how we make a Bill as good as it possibly can be—particularly when it is as ground-breaking and novel as this one is. My noble friend the Minister said in a letter to all of us that this Bill had been strengthened in this Chamber, and I think that is absolutely right.

I also want to echo thanks to the Bill team, some of whom I was working with four years ago when we first talked about this Bill. They have stuck with it through thick and thin. I thank not only noble Lords across the House for their support for the amendments but also all those outside this House who have committed such time, effort, support and expertise to making sure this Bill is as good as possible. I wish it well in its final stages. I think we all look forward both to Royal Assent and to the next big challenge, which is implementation.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for his introduction today and for his letter, which set out the reasons for the very welcome amendments that he has tabled today. First, I must congratulate the noble Baroness, Lady Stowell, on her persistence in pushing amendments of this kind to Clause 45, which will considerably increase the transparency of the Secretary of State’s directions, if they take place. The amendments are extremely welcome.

Of course, there is always a “but”—by the way, I am delighted that the Minister took the advice of the House and clearly spent his summer reading through the Bill in great detail, or we would not have seen these amendments, I am sure—but I am just sorry that he did not take the opportunity also to address Clause 176, in terms of the threshold for powers to direct Ofcom in special circumstances, and the rather burdensome powers in relation to the Secretary of State’s guidance on Ofcom’s exercise of its functions under the Bill as a whole. No doubt we will see how that works out in practice and whether those powers are used frequently.

My noble friend Lord Allan—and I must congratulate both him and the noble Lord, Lord Knight, on addressing this very important issue—has set out five assurances that he is seeking from the Minister. I very much hope that the Minister can give those today, if possible.

Congratulations are also due to the noble Baroness, Lady Kennedy, for finding a real loophole in the offence, which has now been amended. We are all delighted to see that the point has been well taken.

Finally, on the point raised by the noble Lord, Lord Rooker, clearly it is up to the Minister to respond to the points made by the committee. All of us would have preferred to see a comprehensive scheme in the primary legislation, but we are where we are. We wanted to see action on apps; they are to some degree circumscribed by the terms of the Bill. The terms of the Bill—as we have discussed—particularly after the removal of “legal but harmful”, do not give a huge amount of leeway, so this is perhaps not as skeletal a provision as one might otherwise have thought. Those are my reflections on what the committee has said.

Online Safety Bill

Lord Stevenson of Balmacara (Lab)

My Lords, I am grateful to hear what the Minister has just announced. The scheme that was originally prefigured in the pre-legislative scrutiny report now has some chance of being delivered. I think the process and procedures are quite appropriate; it does need review and thought. Account needs to be taken of practice on the ground, of how people have found the new system working, and of whether there are gaps that can be filled this way. I give my full support to the proposal, and I am very glad to see it.

Having got to the Dispatch Box early, I will just appeal to our small but very important group. We are on the last day on Report. We are reaching a number of issues where lots of debate has taken place in Committee. I think it would be quite a nice surprise for us all if we were to get through this quickly. The only way to do that is by restricting our contributions.

Baroness Morgan of Cotes (Con)

My Lords, I will speak briefly to Amendments 272AA and 274AA, only because at the previous stage of the Bill I tabled amendments relating to the reporting of illegal content and fraudulent advertisements, covering reporting, complaints and transparency. I have not re-tabled them here, but I have had conversations with my noble friend the Minister. It is still unclear, to those in the House and outside, why the provisions relating to that type of reporting would not apply to fraudulent advertisements, particularly given that the more information that can be filed about those types of scams and fraudulent advertisements, the easier it would be for the platforms to gather information and help users and others start to crack down on them. I wonder whether, when he sums up, my noble friend could say something about the reporting provisions relating to fraudulent advertisements generally, and in particular about general reporting and reporting relating to complaints by users.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

My Lords, as I was eagerly anticipating, government Amendments 238A and 238D seek to change the parliamentary process for laying the first regulations specifying the category 1 threshold conditions from the negative to the affirmative procedure. I am pleased to bring forward this change in response to the recommendation of your Lordships’ Delegated Powers and Regulatory Reform Committee.

The change will ensure that there are adequate levels of parliamentary scrutiny of the first regulations specifying the category 1 threshold conditions. This is appropriate given that the categorisation of category 1 services will lead to the most substantial duties on the largest and most influential services. As noble Lords are aware, these include the duties on user empowerment, user identity verification, journalistic and news publisher content, content of democratic importance, and fraudulent advertising.

Category 2A services will have only additional transparency and fraudulent advertising duties, and category 2B services will be subject only to additional transparency reporting duties. The burden of these duties is significantly less than the additional category 1 duties, and we have therefore retained the use of the negative resolution procedure for these regulations, as they require less parliamentary scrutiny.

Future changes to the category 1 threshold conditions will also use the negative procedure. This will ensure that the regime remains agile in responding to change, which I know was of particular concern to noble Lords when we debated the categorisation group in Committee. Keeping the negative procedure for such subsequent uses will avoid the risk of future changes being subject to delays because of parliamentary scheduling. I beg to move.

Baroness Morgan of Cotes (Con)

My Lords, I shall speak to Amendment 245. I would like to thank my noble friend the Minister, and also the Minister on leave, for the conversations that I have had with them about this amendment and related issues. As we have already heard, the platform categorisation is extremely important. So far, much of it is unknown, including which sites are actually going to be in which categories. For example, we have not yet seen any proposed secondary regulations. As my noble friend has just outlined, special duties apply, especially for those sites within category 1—user empowerment in particular, but also other duties relating to content and fraudulent advertisements.

Clause 85 and Schedule 11 set out the thresholds for determining which sites will be in category 1, category 2A or category 2B. I am very mindful of the exhortation of the noble Lord, Lord Stevenson, about being brief, but it is amazing how much you have to say about one word to explain this amendment. This amendment proposes to change an “and” to an “or” in relation to determining which sites would fall within category 1. It would move from a test of size “and” functionality to a test of size “or” functionality. This would give Ofcom more flexibility to decide which platforms really need category 1 designation. Category 1 should not be decided just on size; it should also be possible to determine it on the basis of functionality.

Functionality is defined in the Bill in Clause 208. We will get to those amendments shortly, but there is no doubt from what the Government have already conceded, or agreed with those of us who have been campaigning passionately on the Bill for a number of years, that functionality can make a platform harmful. It is perfectly possible to have small platforms that both carry highly harmful content and themselves become harmful in the way that they are designed. We have heard many examples and I will not detain the House with them, but I draw attention to two particular sites which capture how broad this is. The perpetrators of offline hate crimes are often linked to these small platforms. For example, the perpetrator of the 2018 Tree of Life synagogue mass shooting had an online presence on the right-wing extremist social network Gab. In the UK, Jake Davison, the self-proclaimed incel who killed five people in Plymouth in 2021, frequented smaller incel forums after he was banned from Reddit in the days leading up to the mass shooting.

I also want to share with noble Lords an email that I received just this week from a family who had been to see their Member of Parliament, Matt Rodda MP, and also the noble Baroness, Lady Kidron, who I know is very regretful that she cannot be here today. I thank Victoria and Jean Eustace for sharing the story of their sister and daughter. Victoria wrote: “I am writing to you regarding the Online Safety Bill, as my family and I are concerned it will not sufficiently protect vulnerable adults from harm. My sister, Zoe Lyalle, killed herself on 26 May 2020, having been pointed towards a method using an online forum called Sanctioned Suicide. Zoe was 18 years old at the time of her death and as such technically an adult, but she was autistic, so she was emotionally less mature than many 18-year-olds. She found it difficult to critically analyse written content”. She says that “The forum in question is not large and states on its face that it does not encourage suicide, although its content does just that”. The next part I was even more shocked about: “Since Zoe’s death, we have accessed her email account. The forum continues to email Zoe, providing her with updates on content she may have missed while away from the site, as well as requesting donations. One recent email included a link to a thread on the forum containing tips on how best to use the precise method that Zoe had employed”.

In her note to me, the Minister on leave said that she wanted to catch some of the platforms we are talking about with outsized influence. In my reply, I said that those sites on which people are encouraged to take their own lives or become radicalised and therefore take the harms they are seeing online into the real world undoubtedly exercise influence and should be tackled.

It is also perfectly possible for us to have large but safe platforms. I know that my noble friend Lord Moylan may want to discuss this in relation to sites that he has talked about already on this Bill. The risk of the current drafting is a flight of users from these large platforms, newly categorised as category 1, to the small, non-category 1 platforms. What if a platform becomes extremely harmful very quickly? How will it be recategorised speedily but fairly, and with parliamentary oversight?

The Government have run a variety of arguments as to why the “and” in the Bill should not become an “or”. They say that it creates legal uncertainty. Every Bill creates legal uncertainty; that is why we have an army of extremely highly paid lawyers, not just in this country but around the world. They say that what we are talking about is broader than illegal content or content related to children’s safety, but they have already accepted an earlier amendment on safety by design and, in subsections (10) to (12) of Clause 12, that specific extra protections should be available for content related to

“suicide or an act of deliberate self-injury, or … an eating disorder or behaviours associated with an eating disorder”

or abusive content relating to race, religion, sex, sexual orientation, disability or gender reassignment and that:

“Content is within this subsection if it incites hatred against people”.


The Government have already breached some of their own limits on content that is not just illegal or relates to child safety duties. In fact, they have agreed that that content should have enhanced triple-shield protection.

The Government have also said that they want to avoid burdens on small but low-harm platforms. I agree with that, but with an “or” it would be perfectly possible for Ofcom to decide by looking at size or functionality and to exclude those smaller platforms that do not present the harm we all care about. The Minister may also offer me a review of categorisation; however, it is a review of the tiers of categorisation and not the sites within the categories, which I think many of us will have views on over the years.

I come to what we should do on this final day of Report. I am very thankful to those who have had many conversations on this, but there is a fundamental difference of opinion in this House on these matters. We will talk about functionality shortly and I am mindful of the pre-legislative scrutiny committee’s recommendation that this legislation should adopt

“a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”.

There should be other factors. Ofcom should have the ability to decide whether it takes one factor or another, and not have a series of all the thresholds to be passed, to give it the maximum flexibility. I will listen very carefully to what my noble friend the Minister and other noble Lords say, but at this moment I intend to test the opinion of the House on this amendment.

Baroness Finlay of Llandaff (CB)

My Lords, I strongly support Amendment 245. The noble Baroness, Lady Morgan of Cotes, has explained the nub of the problem we are facing—that size and functionality are quite separate. You can have large sites that perform a major social function and are extremely useful across society. Counter to that, you can have a small site focused on being very harmful to a small group of people. The problem is that, without providing the flexibility to Ofcom to determine how the risk assessment should be conducted, the Bill would lock it into leaving these small, very harmful platforms able to pursue their potentially ever-increasingly harmful activities almost out of sight. It does nothing to make sure that their risk assessments are appropriate.

--- Later in debate ---
Moved by
245: Schedule 11, page 223, line 32, leave out “and” and insert “or”
Baroness Morgan of Cotes (Con)

My Lords, I wish to test the opinion of the House and I beg to move.

--- Later in debate ---
It is late, and it has been a long Report stage. I will listen very carefully to what my noble friend the Minister has to say. I really hope that the Bill can continue to progress in this collaborative way. It would be an awful shame if, at the end of a long Report stage, we did not recognise that we are trying to solve the same problem and find a way through. I beg to move.
Baroness Morgan of Cotes (Con)

My Lords, the hour is late and I will not detain the House for long. However, I hope that the fact that we are all still sitting here at the end of a long Report stage, because we care very much about the Bill and what we are trying to achieve, will be noted by my noble friend the Minister, his officials and others who are watching. I thank my noble friend Lady Harding for so ably introducing the amendments, which I absolutely support. I was, perhaps for the first time, going to agree with something the noble Baroness, Lady Fox, said a day or so ago: that one thing we and Ofcom need to do much better is to understand the transparency of the algorithms. It is not just algorithms—this is where my knowledge ends—but other design features that make these sites addictive and harmful, and which are outside content. The Bill will not be capable of addressing even the next five years, let alone beyond that, if we do not reflect the fact that, as my noble friend Lady Harding said, it has already been amended so that one way its objectives are to be achieved is by services being required to focus on safety by design.

I hope very much that my noble friend will take up the invitation, because everybody is tired and has been looking at this Bill for so many hours and months that we are probably all word-blind. We could all do with standing back and thinking, “With the amendments made, how does it all hang together so that ultimately, we keep those we want to keep safe as safe as we possibly can?” On that basis, I support these amendments and look forward to hearing further from the Government about how they hope to keep safe those we all wish to keep safe.

Baroness Benjamin (LD)

My Lords, I rise to support the amendment in the name of the noble Baroness, Lady Kidron. She has been such a forceful voice throughout the passage of this Bill, driven by her passion to protect children, and no more so than with the amendment in her name. That is why I feel compelled to speak up to support her. So far, we have all worked with the Government to see the safe passage of the Online Safety Bill, with strong protections for children. These amendments would be yet another excellent and unique opportunity to protect children. This is what we have been fighting for for years, and it is so uplifting that the Government have listened to us throughout the passage of this Bill—so why stop now? If the Government are saying that the Bill is being clear about harms, they should have no objection to making it explicit.

These amendments press for safety by design to be embedded in later clauses of the Bill and go hand in hand with the earlier amendment that the House so clearly supported. It is clear that the design of services and algorithms is responsible for orchestrating and manipulating the behaviour, feelings, emotions and thoughts of children who, because they are at a vulnerable stage in their development, are easily influenced. We have all witnessed the disastrous impact of the new technology which is fast encroaching upon us, and our children will not be spared from it. So it is imperative that Ofcom have the tools with which to consider and interrogate system design separately from content because, as has been said, it is not only content that is harmful: design is too. We therefore need to take a holistic approach and leave nowhere to hide for the tech companies when it comes to harms affecting our children.

As I have said before, these amendments would send a loud and clear message to the industry that it is responsible for the design of its products and has to think of the consequences for our children’s mental health and well-being when considering design. What better way to do that than for the Government to accept these amendments, in order to show that they are on the side of our children, not the global tech companies, when it comes to protecting them from harm? They need to put measures in place to ensure that the way a service is designed is subject to the online safety regime we have all fought for over the years and during the passage of this Bill.

If the Government do not accept the amendment, perhaps the issue of harmful design could be included in the welcome proposed review of pornography. It would be good to hear the Minister’s thoughts on this idea—but I am not giving him a let-off. I hope he will listen to the strength of feeling and that the Government will reconsider their position, support the amendment and complete the one main task they set out to complete with this Bill, which is to protect children from harm no matter where it rears its ugly head online.

Online Safety Bill

Lord Parkinson of Whitley Bay (Con)

My Lords, as we discussed in Committee, the Bill contains strong protection for women and girls and places duties on services to tackle and limit the kinds of offences and online abuse that we know disproportionately affect them. His Majesty’s Government are committed to ensuring that women and girls are protected online as well as offline. I am particularly grateful to my noble friend Lady Morgan of Cotes for the thoughtful and constructive way in which she has approached ensuring that the provisions in the Bill are as robust as possible.

It is with my noble friend’s support that I am therefore pleased to move government Amendment 152. This will create a new clause requiring Ofcom to produce guidance that summarises, in one clear place, measures that can be taken to tackle the abuse that women and girls disproportionately face online. This guidance will relate to regulated user-to-user and search services and will cover content regulated under the Bill’s framework. Crucially, it will summarise the measures in the Clause 36 codes for Part 3 duties, namely the illegal and child safety duties. It will also include a summary of platforms’ relevant Part 4 duties—for example, relevant terms of service and reporting provisions. This will provide a one-stop shop for providers.

Providers that adhere to the codes of practice will continue to be compliant with the duties. However, this guidance will ensure that it is easy and clear for platforms to implement holistic and effective protections for women and girls across their various duties. Any company that says it is serious about protecting women and girls online will, I am sure, refer to this guidance when implementing protections for its users.

Ofcom will have the flexibility to shape the guidance in the way it deems most effective in protecting women and girls online. However, as outlined in this amendment, we expect that it will include examples of best practice for assessing the risks of harm to women and girls from content and activity, and for how providers can reduce those risks, and that it will emphasise the provisions in the codes of practice that are particularly relevant to the protection of women and girls.

To ensure that this guidance is effective and makes a difference, the amendment creates a requirement on Ofcom to consult the Domestic Abuse Commissioner and the Victims’ Commissioner, among other people or organisations it considers appropriate, when it creates this guidance. Much like the codes of practice, this will ensure that the views and voices of experts on the issue, and of women, girls and victims, are reflected. This amendment will also require Ofcom to publish this guidance.

I am grateful to all the organisations that have worked with us and with my noble friend Lady Morgan to get to this point. I hope your Lordships will accept the amendment. I beg to move.

Baroness Morgan of Cotes (Con)

My Lords, I will speak very briefly to this amendment; I know that the House is keen to get on to other business today. I very much welcome the amendment that the Government have tabled. My noble friend the Minister has always said that they want to keep women and girls safe online. As has been referred to elsewhere, the importance of making our digital streets safer cannot be overestimated.

As my noble friend said, women and girls experience a disproportionate level of abuse online. That is now recognised in this amendment, although this is only the start, not the end, of the matter. I thank my noble friend and the Secretary of State for their engagement on this issue. I thank the chief executive and the chair of Ofcom. I also thank the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Gloucester, who I know cannot be here today, and the noble Lord, Lord Knight, who signed the original amendment that we discussed in Committee.

My noble friend has already talked about the campaigners outside the Chamber who wanted there to be specific mention of women and girls in the Bill. I thank Refuge, the 100,000 people who signed the End Violence Against Women coalition’s petition, BT, Glitch, Carnegie UK, Professor Lorna Woods, the NSPCC and many others who made the case for this amendment.

As my noble friend said, this is Ofcom guidance. It is not necessarily a code of practice, but it is still very welcome because it is broader than just the specific offences that the Government have legislated on, which I also welcome. As he said, this puts all the things that companies, platforms and search engines should be doing to protect women and girls online in one specific place. My noble friend mentioned holistic protection, which is very important.

There is no offline/online distinction these days. Women and girls should feel safe everywhere. I also want to say, because I know that my noble friend has had a letter, that this is not about saying that men and boys should not be safe online; it is about recognising the disproportionate levels of abuse that women and girls suffer.

I welcome the fact that, in producing this guidance, Ofcom will have to consult with the Domestic Abuse Commissioner and the Victims’ Commissioner and more widely. I look forward, as I am sure do all the organisations I just mentioned, to working with Ofcom on the first set of guidance that it will produce. It gives me great pleasure to have signed the amendment and to support its introduction.

Baroness Fox of Buckley (Non-Afl)

My Lords, I know that we do not have long and I do not want to be churlish. I am not that keen on this amendment, but I want to ask a question in relation to it.

I am concerned that the best practice guidance should not conflate the real, practical problems of, for example, victims of domestic abuse being stalked online or threatened with physical violence—which are threats to their safety, and I understand that—with abuse. Abuse is horrible to be on the receiving end of, but it is important for freedom of thought and freedom of speech that we maintain a distinction between words and action. It is important not to overreact or to frighten young women by suggesting that being shouted at is the same as being physically abused.

Online Safety Bill

Lord Bethell (Con)

My Lords, I also support the amendments from the noble Baroness, Lady Kidron. It is relatively easy to stand here and make the case for age verification for porn: it is such a black and white subject and it is disgusting pornography, so of course children should be protected from it. Making the case for the design of the attention economy is more subtle and complex—but it is incredibly important, because it is the attention economy that is driving our children to extreme behaviours.

I know this from my own personal life; I enjoy incredibly lovely online content about wild-water swimming, and I have been taken down a death spiral towards ice swimming and have become a compulsive swimmer in extreme temperatures, partly because of the addiction generated by online algorithms. This is a lovely and heart-warming anecdote to give noble Lords a sense of the impact of algorithms on my own imagination, but my children are prone to much more dangerous experiences. The plasticity of their brains is so much more subtle and malleable; they are, like other children, open to all sorts of addiction, depression, sleeplessness and danger from predators. That is the economy that we are looking at.

I point noble Lords to the intervention from the Surgeon General in America, Admiral Vivek Murthy—an incredibly impressive individual whom I came across during the pandemic. His 25-page report on the impact of social media on the young of America makes incredibly eye-opening reading. Some 95% of American children have come across social media, and one-third of them see it almost constantly, he says. He attributes depression, anxiety, compulsive behaviours and sleeplessness to the impact of social media, as well as what he calls the severe impact on the neurological development of a generation. He calls for a complete bar on all social media for the under-13s and says that his own children will never get anywhere near a mobile phone until they are 16. That is the state of the attention economy that the noble Baroness, Lady Kidron, talks about, and that is the state of the design of our online applications. It is not the content itself but the way in which it is presented to our children, and it traps their imagination in the kind of destructive content that can lead them into all kinds of harms.

Admiral Murthy calls on legislators to act today—and that was followed on the same day by a commitment from the White House to look into this and table legislation to address the kind of design features that the noble Baroness, Lady Kidron, is looking at. I think that we should listen to the surgeon general in America and step up to the challenge that he has given to American legislators. I am enormously grateful to my noble friend the Minister for the incredible amount of work that he has already done to try to bridge the gap in this matter, but there is a way to go. Like my noble friend Lady Harding, I hope very much indeed that he will be able to tell us that he has been able to find a way across the gap, or else I shall be supporting the noble Baroness, Lady Kidron, in her amendment.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Parliament Live - Hansard - -

I rise briefly to speak to this group of amendments. I want to pick up where my noble friend Lord Bethell has just finished. The Government have listened hugely on this Bill and, by and large, the Bill, and the way in which Ministers have engaged, is a model of how the public want to see their Parliament acting: collaboratively and collegiately, listening to each other and with a clear sense of purpose that almost all of us want to see the Bill on the statute book as soon as possible. So I urge my noble friend the Minister to do so again. I know that there have been many conversations and I think that many of us will be listening with great care to what he is about to say.

There are two other points that I wanted to mention. The first is that safety by design was always going to be a critical feature of the Bill. I have been reminding myself of the discussions that I had as Culture Secretary. Surely, in general, we want to prevent our young people in particular from encountering harms in the first place, rather than always having to think about the moderation of harmful content once it has been posted.

Secondly, I would be interested to hear what the Minister has to say about why the Government find it so difficult to accept these amendments. Has there been some pushback from those who are going to be regulated? That would suggest that, while they can cope with the regulation of content, there is still secrecy surrounding the algorithms, functionalities and behaviours. I speak as the parent of a teenager who, if he could, would sit there quite happily looking at YouTube. In fact, he may well be doing that now—he certainly will not be watching his mother speaking in this House. He may well be sitting there and looking at YouTube and the content that is served up automatically, time after time.

I wonder whether this is, as other noble Lords have said, an opportunity. If we are to do the Bill properly and to regulate the platforms—and we have decided we need to do that—we should do the job properly and not limit ourselves to content. I shall listen very carefully to what my noble friend says but, with regret, if there is a Division, I will have to support the indomitable noble Baroness, Lady Kidron, as I think she was called.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Parliament Live - Hansard - - - Excerpts

My Lords, I very strongly support the noble Baroness, Lady Kidron, in her Amendments 35, 36 and 281F and in spirit very much support what the noble Lord, Lord Russell, said in respect of his amendments. We have heard some very powerful speeches from the noble Baroness, Lady Kidron, herself, from the noble Baronesses, Lady Harding and Lady Morgan, from the right reverend Prelate the Bishop of Oxford, from my noble friend Lady Benjamin and from the noble Lords, Lord Russell and Lord Bethell. There is little that I can add to the colour and the passion that they brought to the debate today.

As the noble Baroness, Lady Kidron, said at the outset, it is not just about content; it is about functionalities, features and behaviours. It is all about platform design. I think the Government had pretty fair warning throughout the progress of the Bill that we would be keen to probe this. If the Minister looks back to the Joint Committee report, he will see that there was a whole chapter titled “Societal harm and the role of platform design”. I do not think we could have been clearer about what we wanted from this legislation. One paragraph says:

“We heard throughout our inquiry that there are design features specific to online services that create and exacerbate risks of harm. Those risks are always present, regardless of the content involved, but only materialise when the content concerned is harmful”.


It goes on to give various examples and says:

“Tackling these design risks is more effective than just trying to take down individual pieces of content (though that is necessary in the worst cases). Online services should be identifying these design risks and putting in place systems and processes to mitigate them before people are harmed”.


That is the kind of test that the committee put. It is still valid today. As the noble Baroness said, platforms are benefiting from the network effect, and the Threads platform is an absolutely clear example of how that is possible.

The noble Lord, Lord Russell, gave us a very chilling example of the way that infinite scrolling worked for Milly. A noble Lord on the Opposition Bench, a former Home Secretary whose name I momentarily forget, talked about the lack of empathy of AI in these circumstances. The algorithms can be quite relentless in pushing this content; they lack human qualities. It may sound over the top to say that, but that is exactly what we are trying to legislate for. As the noble Lord, Lord Russell, says, just because we cannot always anticipate what the future holds, there is no reason why we should not try. We are trying to future-proof ourselves as far as possible, and it is not just the future but the present that we are trying to proof against through these amendments. We know that AI and the metaverse are coming down the track, but there are present harms that we are trying to legislate for as well. The noble Baroness, Lady Kidron, was absolutely right to keep reminding us about Molly Russell. It is this kind of algorithmic amplification that is so dangerous to our young people.

The Minister has a chance, still, to accede to these amendments. He has heard the opinion all around the House. It is rather difficult to understand what the Government’s motives are. The noble Baroness, Lady Morgan, put her finger on it: why is it so difficult to accede to these? We have congratulated the Government, the Minister and the Secretary of State throughout these groups over the last day and a bit; they have been extremely consensual and have worked very hard at trying to get agreement on a huge range of issues. Most noble Lords have never seen so many government amendments in their life. So far, so good; why ruin it?

Online Safety Bill

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Parliament Live - Hansard - - - Excerpts

Is he not outrageous, trying to make appeals to one’s good humour and good sense? But I support him.

I will say only three things about this brief but very useful debate. First, I welcome the toggle-on, toggle-off resolution: that is a good move. It makes sure that people make a choice and that it is made at an appropriate time, when they are using the service. That seems to be the right way forward, so I am glad that that has come through.

Secondly, I still worry that terms of service, even though there are improved transparency measures in these amendments, will eventually need some form of power for Ofcom to set de minimis standards. So much depends on the ability of the terms of service to carry people’s engagement with the social media companies, including the decisions about what to see and not to see, and about whether they want to stay on or keep off. Without some power behind that, I do not think that the transparency will take it. However, we will leave it as it is; it is better than it was before.

Thirdly, user ID is another issue that will come back. I agree entirely with what the noble Lord, Lord Clement-Jones, said: this is at the heart of so much of what is wrong with what we see and perceive as happening on the internet. To reduce scams, to be more aware of trolls and to be aware of misinformation and disinformation, you need some sense of who you are talking to, or who is talking to you. There is a case for having that information verified, whether or not it is done on a limited basis, because we need to protect those who need to have their identities concealed for very good reason—we know all about that. As the noble Lord said, it is popular to think that you would be a safer person on the internet if you were able to identify who you were talking to. I look forward to hearing the Minister’s response.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Parliament Live - Hansard - -

My Lords, I will speak very briefly to Amendments 55 and 182. We are now at the stage of taking the lead entirely from the Minister and the noble Lords opposite—the noble Lords, Lord Stevenson and Lord Clement-Jones—and accepting these amendments, because we now need to see how this will work in practice. That is why we all think that we will be back here talking about these issues in the not too distant future.

My noble friend the Minister rightly said that, as we debated in Committee, the Government made a choice in taking out “legal but harmful”. Many of us disagree with that, but that is the choice that has been made. So I welcome the changes that have been made by the Government in these amendments to at least allow there to be more empowerment of users, particularly in relation to the most harmful content and, as we debated, in relation to adult users who are more vulnerable.

It is worth reminding the House that we heard very powerful testimony during the previous stage from noble Lords with personal experience of family members who struggle with eating disorders, and how difficult these people would find it to self-regulate the content they were looking at.

In Committee, I proposed an amendment about “toggle on”. Anyone listening to this debate outside who does not know what we are talking about will think we have gone mad, talking about toggle on and toggle off, but I proposed an amendment for toggle on by default. Again, I take the Government’s point, and I know my noble friend has put a lot of work into this, with Ministers and others, in trying to come up with a sensible compromise.

I draw attention to Amendment 55. I wonder if my noble friend the Minister is able to say anything about whether users will be able to have specific empowerment in relation to specific types of content, where they are perhaps more vulnerable if they see it. For example, the needs of a user might be quite different between those relating to self-harm and those relating to eating disorder content or other types of content that we would deem harmful.

On Amendment 182, my noble friend leapt immediately to abusive content coming from unverified users, but, as we have heard, and as I know, having led the House’s inquiry into fraud and digital fraud last year, there will be, and already is, a prevalence of scams. The Bill is cracking down on fraudulent advertisements but, as an anti-fraud measure, being able to see whether an account has been verified would be extremely useful. The view now is that, if this Bill is successful—and we hope it is—in cracking down on fraudulent advertising, then there will be even more reliance on what is called organic reach, which is the use of fake accounts, where verification therefore becomes more important. We have heard from opinion polling that the public want to see which accounts are or are not verified. We have also heard that Amendment 182 is about giving users choice, in making clear whether their accounts are verified; it is not about compelling people to say whether they are verified or not.

As we have heard, this is a direction of travel. I understand that the Government will not want to accept these amendments at this stage, but it is useful to have this debate to see where we are going and what Ofcom will be looking at in relation to these matters. I look forward to hearing what my noble friend the Minister has to say about these amendments.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Parliament Live - Hansard - - - Excerpts

My Lords, I speak to Amendment 53, on the assessment duties, and Amendment 60, on requiring services to provide a choice screen. It is the first time we have seen these developments. We are in something of a see-saw process over legal but harmful. I agree with my noble friend Lord Clement-Jones when he says he regrets that it is no longer in the Bill, although that may not be a consistent view everywhere. We have been see-sawing backwards and forwards, and now, like the Schrödinger’s cat of legal but harmful, it is both dead and alive at the same time. Amendments that we are dealing with today make it a little more alive than it was previously.

In this latest incarnation, we will insist that category 1 services carry out an assessment of how they will comply with their user-empowerment responsibility. Certainly, this part seems reasonable to me, given that it is limited to category 1 providers, which we assume will have significant resources. Crucially, that will depend on the categorisations—so we are back to our previous debate. If we imagine category 1 being the Meta services and Twitter, et cetera, that is one thing, but if we are going to move others into category 1 who would really struggle to do a user empowerment tool assessment—I have to use the right words; it is not a risk assessment—then it is a different debate. Assuming that we are sticking to those major services, asking them to do an assessment seems reasonable. From working on the inside, I know that even if it were not formalised in the Bill, they would end up having to do it as part of their compliance responsibilities. As part of the Clause 8 illegal content risk assessment, they would inevitably end up doing that.

That is because the categories of content that we are talking about in Clauses 12(10) to (12) are all types of content that might sometimes be illegal and sometimes not illegal. Therefore, if you were doing an illegal content risk assessment, you would have to look at it, and you would end up looking at types of content and putting them into three buckets. The first bucket is that it is likely illegal in the UK, and we know what we have to do there under the terms of the Bill. The second is that it is likely to be against your terms of service, in which case you would deal with it there. The third is that it is neither against your terms of service nor against UK law, and you would make a choice about that.

I want to focus on what happens once you have done the risk assessment and you have to have the choice screen. I particularly want to focus on services where all the content in Clause 12 is already against their terms of service, so there is no gap. The whole point of this discussion about legal but harmful is imagining that there is going to be a mixed economy of services and, in that mixed economy, there will be different standards. Some will wish to allow the content listed in Clause 12—self-harm-type content, eating disorder content and various forms of sub-criminal hate speech. Some will choose to do that—that is going to be their choice—and they will have to provide the user empowerment tools and options. I believe that many category 1 providers will not want to; they will just want to prohibit all that stuff under their terms of service and, in that case, offering a choice is meaningless. That will not make the noble Lord, Lord Moylan, or the noble Baroness, Lady Fox, very happy, but that is the reality.

Most services will just say that they do not want that stuff on their platform. In those cases, I hope that what we are going to say is that, in their terms of service, when a user joins a service, they can say that they have banned all that stuff anyway, so they are not going to give the user a user empowerment tool and, if the user sees that stuff, they should just report it and it will be taken down under the terms of service. Throughout this debate I have said, “No more cookie banners, please”. I hope that we are not going to require people, in order for them to comply with this law, to offer a screen that people then click through. It is completely meaningless and ineffective. For those services that have chosen under their terms of service to restrict all the content in Clause 12, I hope that we will be saying that their version of the user empowerment tool is not to make people click anything but to provide education and information and tell them where they can report the content and have it taken down.

Then there are those who will choose to protect that content and allow it on their service. I agree with the noble Lord, Lord Moylan, that this is, in some sense, Twitter-focused or Twitter-driven legislation, because Twitter tends to be more in the freedom of speech camp and to allow hate speech and some of that stuff. It will be more permissive than Facebook or Instagram in its terms, and it may choose to maintain that content and it will have to offer that screen. That is fine, but we should not be making services do so when they have already prohibited such content.

The noble Lord, Lord Moylan, mentioned services that use community moderators to moderate part of the service and how this would apply there. Reddit is the obvious example, but there are others. If you are going to have user empowerment—and Reddit is more at the freedom of expression end of things—then if there are some subreddits, or spaces within Reddit that allow hate speech or the kind of speech that is in Clause 12, it would be rational to say that user empowerment in the context of Reddit is to be told that you can join these subreddits and you are fine or you can join those subreddits and you are allowing yourself to be exposed to this kind of content. What would not make sense would be for Reddit to do it individual content item by content item. When we are thinking about this, I hope that the implementation would say that, for a service with community-moderated spaces, and subspaces within the larger community, user empowerment means choosing which subspaces you enter, and you would be given information about them. Reddit would say to the moderators of the subreddits, “You need to tell us whether you have any Clause 12-type content”—I shall keep using that language—“and, if you are allowing it, you need to make sure that you are restricted”. But we should not expect Reddit to restrict every individual content item.

Finally, as a general note of caution, noble Lords may have detected that I am not entirely convinced that these will be hugely beneficial tools, perhaps other than for a small subset of Twitter users, for whom they are useful. There is an issue around particular kinds of content on Twitter, and particular Twitter users, including people in prominent positions in public life, for whom these tools make sense. For a lot of other people, they will not be particularly meaningful. I hope that we are going to keep focused on outcomes and not waste effort on things that are not effective.

As I say, many companies, when they are faced with this, will look at it and say, “I have limited engineering time. I could build all these user empowerment tools or I could just ban the Clause 12 stuff in my terms of service”. That would not be a great outcome for freedom of expression; it might be a good outcome for the people who wanted to prohibit legal but harmful in the first place. You are going to do that as a really hard business decision. It is much more expensive to try to maintain these different regimes and flag all this content and so on. It is simpler to have one set of standards.

Online Safety Bill

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Parliament Live - Hansard - - - Excerpts

My Lords, first, I welcome the amendment from the noble Lord, Lord Allan, and his motivation, because I am concerned that, throughout the Bill, the wrong targets are being caught up. I was grateful to hear his recognition that people who talk about their problems with self-harm could end up being targeted, which nobody would ever intend. These things need to be taken seriously.

In that sense, I was slightly concerned about the motivation of the noble Baroness, Lady Burt of Solihull, in the “reckless” amendment. The argument was that the recklessness standard is easier to prove. I am always worried about things that make it easier to prosecute someone, rather than there being a just reason for that prosecution. As we know, those involved in sending these images are often immature and very foolish young men. I am concerned about lowering the threshold at which we criminalise them—potentially destroying their lives, by the way, because if you have a criminal record it is not good—even though I in no way tolerate what they are doing and it is obviously important that we take that on.

There is a danger that this law will become a mechanism through which people try to resolve a whole range of social problems—which brings me on to responding to the speech just made by the noble Baroness, Lady Kennedy of The Shaws. I continue to be concerned about the question of trying to criminalise indirect threats. The point about somebody who sends a direct threat is that we can at least see the connection between that direct threat and the possibility of action. It is the same sort of thing that we have historically considered in relation to incitement. I understand that, where your physical being is threatened by words, physically a practical thing can happen, and that is to be taken very seriously. The problem I have is with the indirect threat from somebody who says, for example, “That smile should be taken off your face. It can be arranged”, or other indirect but incredibly unpleasant comments. There is clearly no link between that and a specific action. It might use violent language but it is indirect: “It could be arranged”, or “I wish it would happen”.

Anyone on social media—I am sure your Lordships all are—will know that I follow very carefully what people from different political parties say about each other. I do not know if you have ever followed the kind of things that are said about the Government and their Ministers, but the threats are not indirect and are often named. In that instance, it is nothing to do with women, but it is pretty violent and vile. By the way, I have also followed what is said about the Opposition Benches, and that can be pretty violent and vile, including language that implies that they wish those people were the subject of quite intense violence—without going into detail. That happens, and I do not approve of it—obviously. I also do not think that pile-ons are pleasant to be on the receiving end of, and I understand how they happen. However, if we criminalise pile-ons on social media, we are openly imposing censorship.

What is worse in my mind is that we are allowing the conflation of words and actions, where what people say or think is the same as acting on it, as the criminal law would see it. We have seen a very dangerous trend recently, which is particularly popular in the endless arguments and disputes over identity politics, where people will say that speech is violence. This has happened to a number of gender-critical feminists, in this instance women, who have gone in good faith to speak at universities, having been invited. They have been told that their speech was indistinguishable from violence and that it made students at the university feel under threat and unsafe and that it was the equivalent of being attacked. But guess what? Once you remove that distinction, the response to that speech can be to use violence, because you cannot tell the difference between them. That has happened around a number of university actions, where speakers and their supporters were physically assaulted by people who said that they were using self-defence against speech that was violent. I get nervous that this is a slippery slope, and we certainly should not go anywhere near it in legislation.

Finally, I agree that we should tackle the culture of people piling on and using this kind of language, but it is a cultural and social question. What we require is moral leadership and courage in the face of it—calling it out, arguing against it and so on. It is wrong to use the law to send messages; it is an abdication of moral leadership and a cop-out, let alone dangerous in what is criminalised. I urge your Lordships to reject those amendments.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Parliament Live - Hansard - -

My Lords, I will speak briefly to Amendments 5C and 7A in this group. I welcome the Government’s moves to criminalise cyberflashing. It is something that many have campaigned for in both Houses and outside for many years. I will not repeat the issues so nobly introduced by the noble Baroness, Lady Burt, and I say yet again that I suspect that the noble Baroness, Lady Featherstone, is watching, frustrated that she is still not able to take part in these proceedings.

It is worth making the point that, if actions are deemed to be serious enough to require criminalisation and for people potentially to be prosecuted for them, I very much hope that my noble friend the Minister will be able to say in his remarks that this whole area of the law will be kept under review. There is no doubt that women and girls’ faith in the criminal justice system, both law enforcement and the Crown Prosecution Service, is already very low. If we trumpet the fact that this offence has been introduced, and then there are no prosecutions because the hurdles have not been reached, that is even worse than not introducing the offence in the first place. So I hope very much that this will be kept under review, and no doubt there will be opportunities to return to it in the future.

I do not want to get into the broader debate that we have just heard, because we could be here for a very long time, but I would just say to the noble Baronesses, Lady Kennedy and Lady Fox, that we will debate this in future days on Report and there will be specific protection and mention of women and girls on the face of the Bill—assuming, of course, that Amendment 152 is approved by this House. The guidance might not use the words that have been talked about, but the point is that that is the place to have the debate—led by the regulator with appropriate public consultation—about the gendered nature of abuse that the noble Baroness, Lady Kennedy, has so eloquently set out. I hope that will also be a big step forward in these matters.

I look forward to hearing from the Minister about how this area of law will be kept under review.

Baroness Kidron Portrait Baroness Kidron (CB)
- Parliament Live - Hansard - - - Excerpts

My Lords, I understand that, as this is a new stage of the Bill, I have to declare my interests: I am the chair of 5Rights Foundation, a charity that works around technology and children; I am a fellow at the computer science department at Oxford University; I run the Digital Futures Commission, in conjunction with the 5Rights Foundation and the London School of Economics; I am a commissioner on the Broadband Commission; I am an adviser for the AI ethics institute; and I am involved in Born in Bradford and the Lancet commission, and I work with a broad number of civil society organisations.

--- Later in debate ---
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I also welcome these amendments and want to pay tribute to Maria Miller in the other place for her work on this issue. It has been extraordinary. I too was going to raise the issue of the definition of “photograph”, so perhaps the Minister could confirm, or, even better, put it in the Bill, that it does extend to those other contexts.

My main point is about children. We do not want to criminalise children, but this is pervasive among under-18s. I do want to make the distinction between those under-18s who intentionally harm another under-18 and have to be responsible for what they have done in the meaning of the law as the Minister set it out, and those who are under the incredible pressure—I do not mean coercion, because that is another out-clause—of oversharing that is inherent in the design of many of these services. That is an issue I am sure we are going to come back to later today. I would love to hear the Minister say something about the Government’s intention from the Dispatch Box: that it is preventive first and there is a balance between education and punishment for under-18s who find themselves unavoidably in this situation.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Hansard - -

Very briefly, before I speak to these amendments, I want to welcome them. Having spoken to, and introduced, the offence of threatening to share intimate images under the Domestic Abuse Act 2021, I think it is really welcome that everything has been brought together in one place. Again, I pay tribute to the work of Dame Maria Miller and many others outside who have raised these as issues. I also want to pay tribute to the Ministry of Justice Minister Edward Argar, who has also worked with my noble friend the Minister on this.

I have one specific question. The Minister did mention this in his remarks, but could he be absolutely clear? These amendments do not specifically mention the lifetime anonymity of claimants or the special measures in relation to giving evidence that apply to witnesses. That came up in the last group of amendments as well. Because they are not actually in this drafting, it would be helpful if he could put on record the relationship with the provisions in the Sexual Offences Act 2003. I know that would be appreciated by campaigners.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I have very little to add to the wise words that we have heard from my noble friend and from the noble Baronesses, Lady Kidron and Lady Morgan. We should thank all those who have got us to this place, including the Law Commission. It was a separate report. In that context, I would be very interested to hear a little more from the Minister about the programme of further offences that he mentioned. The communication offences that we have talked about so far are either the intimate images offences, which there was a separate report on, or other communications offences, which are also being dealt with as part of the Bill. I am not clear what other offences are in the programme.

Finally, the Minister himself raised the question of deepfakes. I have rustled through the amendments to see exactly how they are caught. The question asked by the noble Baroness, Lady Kidron, is more or less the same but put a different way. How are these deepfakes caught in the wording that is now being included in the Bill? This is becoming a big issue and we must be absolutely certain that it is captured.

Online Safety Bill

I repeat my offer to help if there is anything I can do to try to unblock some of this, work on the detail and make sure that this is effective. This is an area where we could make significant progress and certainly move on from a situation that has frustrated everybody and been unacceptable to date.
Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Parliament Live - Hansard - -

My Lords, I am very pleased to support the noble Baroness, Lady Kidron, with these amendments. I also welcome the fact that we have, I hope, reached the final day of this stage of the Bill, which means that it is getting closer to becoming an Act of Parliament. The amendments to these clauses are a very good example of why the Bill needs to become an Act sooner rather than later.

As we heard during our earlier debates, social media platforms have for far too long avoided taking responsibility for the countless harms that children face on their services. We have, of course, heard about Molly Russell’s tragic death and heard from the coroner’s inquest report that it was on Instagram that Molly viewed some of the most disturbing posts. Despite this, at the inquest Meta’s head of health and well-being policy shied away from taking blame and claimed that the posts which the coroner said contributed to Molly’s death

“in a more than minimal way”

were, in Meta’s words, “safe”. Molly’s family and others have to go through the unthinkable when they lose their child in such a manner. Their lives can be made so much harder when they attempt to access their child’s social media accounts and activities only to be denied by the platforms.

The noble Baroness’s various amendments are not only sensible but absolutely the right thing to do. In many ways, it is a great tragedy that we have had to wait for this piece of primary legislation before these companies could be compelled to act. I understand what the noble Lord, Lord Allan, very rationally said—companies should very much welcome these amendments—but it is a great shame that they have often not behaved better in these circumstances previously.

There is perhaps no point going into the details, because we want to hear from the Minister about what the Government will propose. I welcome the fact that the Government have engaged early-ish on these amendments and on these matters.

The amendments would force platforms to comply with coroners in investigations into the death of a child, have a named senior manager in relation to inquests and allow easier access to a child’s social media account for bereaved families. We will have to see how the Government’s amendments reflect that. One of the areas that the noble Baroness said had perhaps not been buttoned down is the responsibility for a named senior manager in relation to an inquest. The requirement is that:

“If Ofcom has issued a notice to a service provider they must name a senior manager responsible for providing material on behalf of the service and to inform that individual of the consequences for not complying”.


The noble Lord, Lord Allan, set out very clearly why having a named contact in these companies is important. Bereaved families find it difficult, if not impossible, to make contact with tech companies: they get lost in the automated systems and, if they are able to access a human being, they are told that the company cannot or will not give that information. We know that different coroners have had widely differing experiences getting information from the social media platforms, some refusing altogether and others obfuscating. Only a couple of companies have co-operated fully, and in only one or two instances. Creating a single point of contact, who understands the law—which, as we have just heard, is not necessarily always straightforward, particularly if it involves different jurisdictions—understands what is technically feasible and has the authority and powers afforded to the regulator will ensure a swifter, more equitable and less distressing process.

I have really set this out because we will obviously hear what the Minister will set out, but if it does not reflect having a named senior manager, then I hope very much that we are able to discuss that between this and the next stage.

Social media platforms have a responsibility to keep their users safe. When they fail, they should be obligated to co-operate with families and investigations, rather than seeking to evade them. Seeing what their child was viewing online before their death will not bring that child back, but it will help families on their journey towards understanding what their young person was going through, and towards seeking justice. Likewise, ensuring that platforms comply with inquests will help to ease the considerable strain on bereaved families. I urge noble Lords to support these amendments or to listen to what the Government say. Hopefully, we can come up with a combined effort to put an end to the agony that these families have been through.

Baroness Healy of Primrose Hill Portrait Baroness Healy of Primrose Hill (Lab)
- Parliament Live - Hansard - - - Excerpts

My Lords, I strongly support this group of amendments in the name of the noble Baroness, Lady Kidron, and other noble Lords. I, too, acknowledge the campaign group Bereaved Families for Online Safety, which has worked so closely with the noble Baroness, Lady Kidron, 5Rights and the NSPCC to bring these essential changes forward.

Where a child has died, sadly, and social media is thought to have played a part, families and coroners have faced years of stonewalling, often never managing to access data or information relevant to that death; this adds greatly to their grief and delays the finding of some kind of closure. We must never again see a family treated as Molly Russell’s family was treated, when it took five years of campaigning to get partial sight of material that the coroner found so distressing that he concluded that it contributed to her death in a more than minimal way; nor can it be acceptable for a company to refuse to co-operate, as in the case of Frankie Thomas, where Wattpad failed to provide the material requested by the coroner on the grounds that it was not based within the UK’s jurisdiction. With the threat of a fine of only £1,000 to face, companies feel little need to comply. These amendments would mean that tech companies now had to comply with Ofcom’s information notices or face a fine of up to 10% of their global revenue.

Coroners’ powers must be strengthened by giving Ofcom the duty and power to require relevant information from companies in cases where there is reason to suspect that a regulated service provider may hold information relevant to a child’s death. Companies may not want to face up to the role they have played in the death of a child by their irresponsible recommending and pushing of violent, sexual, depressive and pro-suicide material through algorithmic design, but they need to be made to answer when requested by a coroner on behalf of a bereaved family.

Amendment 215 requires a named senior manager, a concept that I am thankful is already enshrined in the Bill, to receive and respond to an information notice from Ofcom to ensure that a child’s information, including their interactions and behaviour and the actions of the regulated service provider, is preserved and made available. This could make a profound difference to how families will be treated by these platforms in future. Too often in the past, they have been evasive and unco-operative, adding greatly to the inconsolable grief of such bereaved parents. As Molly Russell’s father Ian said:

“Having lived through Molly’s extended inquest, we think it is important that in future, after the death of a child, authorities’ access to data becomes … a matter of course”


and

“A more compassionate, efficient and speedy process”.


I was going to ask the Government to accept these amendments but, having listened to the noble Baroness, Lady Kidron, I am looking forward to their proposals. We must ensure that a more humane route for families and coroners to access data relating to the death of a child is at last available in law.

--- Later in debate ---
We have made immense progress in the development of this Bill in ensuring that children will be protected from pornographic and inappropriate content. We now have the responsibility to ensure that those who fail to comply with these measures face proportionate consequences. As regulator and sole enforcer of the Bill, Ofcom must be empowered to protect users online. In the spirit of willingness to respond positively, which the Minister has demonstrated already this afternoon, I hope that he will also do so with these amendments.
Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Parliament Live - Hansard - -

My Lords, I will speak briefly to Amendment 218JA, spoken to by the noble Lord, Lord Allan. My name is attached to it online but has not made it on to the printed version. He introduced it so ably and comprehensively that I will not say much more, but I will be more direct with my noble friend the Minister.

This amendment would remove Clause 133(11). The noble Lord, Lord Allan, mentioned that BT has raised with us—I am sure that others have too—that the subsection gives examples of access facilities, such as ISPs and application stores. However, as the noble Lord said, there are other ways that services could use operating systems, browsers and VPNs to evade these access restriction orders. While it is convention for me to say that I would support this amendment should it be moved at a later stage, this is one of those issues that my noble friend the Minister could take off the table this afternoon—he has had letters about it to which there have not necessarily been replies—just by saying that subsection (11) does not give the whole picture, that there are other services and that it is misleading to give just these examples. Will he clarify at the Dispatch Box and on the record, for the benefit of everyone using the Bill now and in future, what broader services are caught? We could then take the issue off the table on this 10th day of Committee.

Baroness Kidron Portrait Baroness Kidron (CB)
- Parliament Live - Hansard - - - Excerpts

My Lords, I will be even more direct than the noble Baroness, Lady Morgan, and seek some confirmation. I understood from our various briefings in Committee that, where content is illegal, it is illegal anywhere in the digital world—it is not restricted simply to user to user, search and Part 5. Can the Minister say whether I have understood that correctly? If I have, will he confirm that Ofcom will be able to use its disruption powers on a service out of scope, as it were, such as a blog or a game with no user-to-user aspect, if it were found to be persistently hosting illegal content?

Moved by
192: Schedule 11, page 216, line 30, after “service” insert “, including significant risk of harm,”
Member’s explanatory statement
There are some platforms which, whilst attracting small user numbers, are hubs for extreme hateful content and should be regulated as larger user-to-user services.
--- Later in debate ---
Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Hansard - -

My Lords, I am very grateful to the noble Baronesses, Lady Parminter and Lady Deech, and the noble Lord, Lord Mann, for their support. After a miscellaneous selection of amendments, we now come back to a group of quite tight amendments. Given the hour, those scheduling the groupings should be very pleased because for the first time we have done all the groups that we set out to do this afternoon. I do not want to tempt fate, but I think we will have a good debate before we head off for a little break from the Bill for a while.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Parliament Live - Hansard - - - Excerpts

In the response I gave, I said that we are making the risk assessment that the riskiest behaviour is illegal content and content which presents a harm to children. That is the assessment and the approach taken in the Bill. In relation to other content which is legal and for adults to choose how they encounter it, there are protections in the Bill to enforce terms of service and empower users to curate their own experience online, but that assessment is made by adult users within the law.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Parliament Live - Hansard - -

I thank all noble Lords who spoke in this short but important debate. As we heard, some issues relating to risk and harm have been returned to and will no doubt be again, and we note the impact of the absence of legal but harmful as a concept. As the noble Baroness, Lady Bull, said, I know that the noble Baroness, Lady Parminter, was very sad that she could not be here this afternoon due to another engagement.

I will not keep the House much longer. I particularly noted the noble Baroness’s point that there should not be, and is not, a direct relationship between the size of the platform and its ability to cause harm. There is a balance to be struck between the regulatory burden placed on platforms versus the health and well-being of those who are using them. As I have said before, I am not sure that we have always got that particular balance right in the Bill.

The noble Lord, Lord Allan, was very constructive: it has to be a good thing if we are now beginning to think about the Bill’s implementation, although we have not quite reached the end and I do not want to prejudge any further stages, in the sense that we are now thinking about how this would work. Of course, he is right to say that some of these platforms have no intention of complying with these rules at all. Ofcom and the Government will have to work out what to do about that.

Ultimately, the Government of the day—whoever it might be—will want the powers to be able to say that a small platform is deeply harmful in terms of its content and reach. When the Bill has been passed, there will be pressure at some point in the future on a platform that is broadcasting, distributing or amplifying content that is deeply harmful. Although I will withdraw the amendment today, my noble friend’s offer of further conversations, and of more detail on categorisation and on any review of platforms categorised as category 1, category 2 and beyond, would be very helpful in due course. I beg leave to withdraw.

Amendment 192 withdrawn.

Online Safety Bill

Thursday 25th May 2023

Lords Chamber
Baroness Berridge Portrait Baroness Berridge (Con)
- Parliament Live - Hansard - - - Excerpts

My Lords, I am grateful to noble Lords who have added their name to my Amendment 271, which arose out of concerns that there are now seemingly several offences that laudably aim to protect women but are not being enforced effectively. The most notable in this category is the low rate of rape cases that are prosecuted and lead to convictions. In theory, the amendment is not affected by the definition of cyberflashing, whether it takes the specific-intent form recommended by the Law Commission or is instead based on consent. In practice, however, if the offence remains in that specific-intent form, the victim will not be required to go to court, and the amendment would therefore be more effective if the offence remained on that basis. Even if the victim does not need to go to court on that basis, someone who has been cyberflashed is, as other noble Lords have mentioned, unlikely to go to the police station to report what has happened.

This amendment is designed to put an obligation on the providers of technology to provide a reporting mechanism on phones and to collate that information before passing it to the prosecuting authorities. The Minister said that there are various issues with how the amendment is currently drafted, such as “the Crown Prosecution Service” rather than “the police”, and perhaps the definition of “providers of internet services” as it may be a different part of the tech industry that is required to collate this information.

Drawing on our discussions on the previous group of amendments regarding the criminal law, I hope that my noble friend can clarify the issue of intent, which is mens rea and is different from motive in this matter. The purpose of the amendment is to ensure that there will be resources and expertise from the technology sector to provide these reporting mechanisms for the offences. One can imagine how many people would report cyberflashing if they only had to click on an app, or if their phone were enabled to retain such an image, since some of these images disappear after a short while. You should be able to sit on the bus and report it. The tech company would then store and collate that information, potentially in a manner that would make patterns clear. For instance, because this happens so much, as we have just heard, if six people on the number 27 bus report multiple times a week that they have received the same image, that would prompt the police to get the CCTV from the bus company to identify the individual, if the tech company’s data did not provide that specificity. Or is someone hanging around the A&E department every Friday night, cyberflashing as they sit there? This is not part of the amendment, but such an app or mechanism could also include a reminder to change the security settings on your phone so that you cannot be AirDropped.

I hope that His Majesty’s Government will look at the purpose of this amendment. It is laudable that we are making cyberflashing an offence, but this amendment is about the enforcement of that offence and will support that. Only with such an easy mechanism to report it can what will be a crime be effectively policed.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Parliament Live - Hansard - -

My Lords, I, too, wish the noble Baroness, Lady Featherstone, a very speedy recovery. Her presence here today is missed, though the amendments were very ably moved by the noble Baroness, Lady Burt. Having worked in government with the noble Baroness, Lady Featherstone, I can imagine how frustrated she is at not being able to speak today on amendments bearing her name.

As my noble friend said, this follows our debate on the wider issues around violence against women and girls in the online world. I do not want to repeat anything that was said there, but I am grateful to him for the discussions that we have had since. I support the Government in their introduction of Amendment 135A and the addition of controlling or coercive behaviour to the priority offences list. I will also speak to the cyberflashing amendments and Amendment 271, introduced by my noble friend Lady Berridge.

I suspect that many of us speaking in this debate today have had briefings from the wonderful organisation Refuge, which has seen a growing number of cases of technology-facilitated domestic abuse in recent years. As a result of this, Refuge pioneered a specialist technology-facilitated domestic abuse team, which uses expertise to support survivors and to identify emerging trends of online domestic abuse.

I draw noble Lords’ attention to a publication released since we debated this last week: the National Police Chiefs’ Council’s violence against women and girls strategic threat risk assessment for 2023, in which a whole page is devoted to tech and online-enabled violence against women and girls. In its conclusions, it says that one of the key threats is tech-enabled VAWG. The fact that we are having to debate these specific offences, but also the whole issue of gendered abuse online, shows how huge an issue this is for women and girls.

I will start with Amendment 271. I entirely agree with my noble friend about the need for specific user reporting and making that as easy as possible. That would support the debate we had last week about the code of practice, which would generally require platforms and search engines to think from the start how they will enable those who have been abused to report that abuse as easily as possible, so that the online platforms and search engines can then gather that data to build up a picture and share it with the regulator and law enforcement as appropriate. So, while I suspect from what the Minister has said that he will not accept this amendment, the points that my noble friend made are absolutely necessary in this debate.

I move on to the cyberflashing amendment. It has been very ably covered already, so I do not want to say too much. It is clear that women and girls experience harms regardless of the motives of the perpetrator. I also point out that, as we have heard, motivations are very difficult to prove, meaning that prosecutions are often extremely unlikely.

I was very proud to introduce the amendments to what became the Domestic Abuse Act 2021. It was one of my first contributions in this House. I remember that, in the face of a lockdown, most of us were working virtually. But we agreed, and the Government introduced, amendments on intimate image abuse and revenge porn. Even as I proposed those amendments and they were accepted, it was clear that they were not quite right and did not go far enough. As we have heard, for the intimate image abuse proposals, the Law Commission is proposing a consent-based image abuse offence. Can my noble friend be even clearer—I am sorry that I was not able to attend the briefing—about the distinction between consent-based intimate image abuse offences and motive-based cyberflashing offences, and why the Government decided to make it?

I also gently point out to him that I know that this is complicated, but we are still waiting for drafting of the intimate image abuse offences. We are potentially running out of time. Perhaps we will see them at the next stage of the Bill—unless he reveals them like a rabbit out of a hat this afternoon, which I suspect is not the case. These are important offences and it will be important for us to see the detail so that we can scrutinise them properly.

Finally, in welcoming the Government’s amendment on coercive control, I say that it is generally poorly understood by technology companies. Overall, the use of the online world to perpetrate abuse on women and girls, particularly in the domestic abuse context, is certainly being understood more quickly, but we are all playing catch-up in how this happens while the perpetrators are running ahead of us. More can be done to recognise the ways that the online world can be used to abuse and intimidate victims, as the Government have recognised with this amendment and as the noble Baroness, Lady Gohir, said. It is very necessary in debating the Bill. I look forward to hearing the Minister’s remarks at the end of this debate.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I will certainly do so. It requires flicking through a number of amendments and cross-referencing them with provisions in the Bill. I will certainly do that in slower time and respond.

We think that the Law Commission, which looked at all these issues, including, I think, the questions put by the noble Lord, has done that well. We were satisfied with it. I thought its briefing with Professor Penney Lewis was useful in exploring those issues. We are confident that the offence as drafted is the appropriate one.

My noble friend Lady Morgan and others asked why both the Law Commission and the Government are taking a different approach in relation to intimate image abuse and to cyberflashing. We are taking action to criminalise both, but the Law Commission recommended different approaches in how to criminalise that behaviour to take into account the different actions of the perpetrator in each scenario. Sharing an intimate image of a person without their consent is ipso facto wrongful, as it is a violation of their bodily privacy and sexual autonomy. Sending a genital image is not ipso facto wrongful, as it does not always constitute a sexual intrusion, so greater additional culpability is required for that offence. To give an example, sending a photograph of a naked protestor, even without the consent of the recipient, is not always harmful. Although levels of harm resulting from behaviours may be the same and cause the same levels of stress, the criminal law must consider whether the perpetrator’s behaviour was sufficiently culpable for an offence to have been committed. That is why we think the intent approach is best for cyberflashing but have taken a different approach in relation to intimate image abuse.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Hansard - -

I thank my noble friend for that explanation, which is very helpful, and there is a lot in his reply so far that we will have to bottom out. Is he able to shed any light at all on when we might see the drafting of the intimate image abuse wording, because that would be helpful in resolving some of the issues we have been debating?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I cannot give a precise date. The Committee knows the dates for this Committee are a moveable feast, but we have been having fruitful discussions on some of the issues we have already discussed—we had one yesterday with my noble friend. I appreciate the point she is making about wanting to see the drafting in good time before Report so that we can have a well thought through debate on it. I will certainly reiterate that to the usual channels and to others.

Amendment 271 additionally seeks to require companies in scope to provide systems which enable users to report incidents of cyberflashing to platforms. Clauses 16 and 26 already require companies to set up systems and processes which allow users easily to report illegal content, and this will include cyberflashing. This amendment therefore duplicates the existing requirement set out in the Bill. Amendment 271 also requires in-scope companies to report cyberflashing content to the Crown Prosecution Service. The Bill does not place requirements on in-scope companies to report discovery of illegal content online, other than in the instances of child exploitation and abuse, reflecting the seriousness of that crime and the less subjective nature of the content that is being reported in those scenarios.

The Bill, which has been developed in consultation with our partners in law enforcement, aims to prevent and reduce the proliferation of illegal content and activity in the first place and the resulting harm this causes to so many. While the Bill does not place any specific responsibilities on policing, our policing partners are considering how best to respond to the growing threat of online offences, as my noble friend Lady Morgan noted, in relation to the publication last week of the Strategic Threat and Risk Assessment on Violence Against Women and Girls. Policing partners will be working closely with Ofcom to explore the operational impact of the Bill and make sure it is protecting women and girls in the way we all want it to.

I hope that helps noble Lords on the issues set out in these amendments. I am grateful for the support for the government amendment in my name and hope that noble Lords will be content not to move theirs at this juncture.

Online Safety Bill

Baroness Finlay of Llandaff Portrait Baroness Finlay of Llandaff (CB)
- Hansard - - - Excerpts

I am particularly grateful to the noble Lords who co-signed Amendments 96, 240 and 296 in this group. Amendment 225 is also important and warrants careful consideration, as it explicitly includes eating disorders. These amendments have strong support from Samaritans, which has helped me in drafting them, and from the Mental Health Foundation and the BMA. I declare that I am an elected member of the BMA ethics committee.

We have heard much in Committee about the need to protect children online even more effectively than the Bill currently does. On Tuesday the noble Baroness, Lady Morgan of Cotes, made a powerful speech acknowledging that vulnerability does not stop at the age of 18 and that the Bill currently creates a cliff edge whereby there is protection from harmful content for those under 18 but not for those over 18. The empowerment tools will be futile for those seriously contemplating suicide and self-harm. No one should underestimate the power of suicide contagion and the addictive nature of the content that is currently pushed out to people, goading them into such actions and drawing them into repeated viewings.

Amendment 96 seeks to redress that. It incorporates a stand-alone provision, creating a duty for providers of user-to-user services to manage harmful content about suicide or self-harm. This provision would operate as a specific category, relevant to all regulated services and applicable to both children and adults. Amendment 296 defines harmful suicide or self-harm content. It is important that we define that to avoid organisations such as Samaritans, which provide suicide prevention support, being inadvertently caught up in clumsy, simplistic search engine categorisation.

Suicide and self-harm content affects people of all ages. Adults in distress search the internet, and children easily bypass age-verification measures and parental controls even when they have been switched on. The Samaritans Lived Experience Panel reported that 82% of people who died by suicide, having visited websites that encouraged suicide and/or methods of self-harm, were over the age of 25.

Samaritans considers that the types of suicide and self-harm content that are legal but unequivocally harmful include, but are not limited to, information, depictions, instructions and advice on methods of self-harm and suicide; content that portrays self-harm and suicide as positive or desirable; and graphic descriptions or depictions of self-harm and suicide. As the Bill stands, platforms will not even need to consider the risk that such content could pose to adults. This will leave all that dangerous online content widely available and undermines the Bill’s intention from the outset.

Last month, other parliamentarians and I met Melanie, whose relative Jo died by suicide in 2020. He was just 23. He had accessed suicide-promoting content online, and his family are speaking out to ensure that the Bill works to avoid future tragedies. A University of Bristol study reported that those with severe suicidal thoughts actively use the internet to research effective methods and often find clear suggestions. Swansea University reported that three quarters of its research participants had harmed themselves more severely after viewing self-harm content online.

Amendment 240 complements the other amendments in this group, although it would not rely on them to be effective. It would establish a specific unit in Ofcom to monitor the prevalence of suicide, self-harm and harmful content online. I should declare that this is in line with the Private Member’s Bill I have introduced. In practice, that means that Ofcom would need to assess the efficacy of the legislation in practice. It would require Ofcom to investigate the content and the algorithms that push such content out to individuals at an alarming rate.

Researchers at the Center for Countering Digital Hate set up new accounts in the USA, UK, Canada and Australia at the minimum age TikTok allows, which is 13. These accounts paused briefly on videos about body image and mental health, and “liked” them. Within 2.6 minutes, TikTok recommended suicide content, and it sent content on eating disorders within eight minutes.

Ofcom’s responsibility for ongoing review and data collection, reported to Parliament, would take a future-facing approach covering new technologies. New communications and internet technologies are being developed at pace in ways we cannot imagine. The term

“in a way equivalent … to”

in Amendment 240 is specifically designed to include the metaverse, where interactions are instantaneous, virtual and able to incite, encourage or provoke serious harm to others.

We increasingly live our lives online. Social media is expanding, while user-to-user sites are now shopping platforms for over 70% of UK consumers. However, the online world is also being used to sell suicide kits or lethal substances, as recently covered in the press. It is important that someone holds the responsibility for reporting on dangers in the online world. A systematic review found that harmful suicide content, including methods and encouragement, is massed on sites with low levels of moderation and easy search functions for images. Some 78% of people with lived experience of suicidality and self-harm surveyed by Samaritans agree that new laws are needed to make online spaces safer.

I urge noble Lords to support my amendments, which aim to ensure that self-harm, suicide and seriously harmful content is addressed across all platforms in all categories as well as search engines, regardless of their functionality or reach, and for all persons, regardless of age. Polling by Samaritans has shown high support for this: four out of five agree that harmful suicide and self-harm content can damage adults as well as children, while three-quarters agree that tech companies should by law prevent such content being shown to users of all ages.

If the Government are not minded to adopt these amendments, can the Minister tell us specifically how the Bill will take a comprehensive approach to placing duties on all platforms to reduce dangerous content promoting suicide and self-harm? Can the Government confirm that smaller sites, such as forums that encourage suicide, will need to remove priority illegal content, whatever the level of detail in their risk assessment? Lastly—I will give the Minister a moment to note my questions—do the Government recognise that we need an amendment on Report to create a new offence of assisting or encouraging suicide and serious self-harm? I beg to move.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Parliament Live - Hansard - -

My Lords, I particularly support Amendment 96, to which I have added my name; it is a privilege to do so. I also support Amendment 296 and I cannot quite work out why I have not added my name to it, because I wholeheartedly agree with it, but I declare my support now.

I want to talk again about an issue that the noble Baroness, Lady Finlay, set out so well and that we also touched on last week, about the regulation of suicide and self-harm content. We have all heard of the tragic case of Molly Russell, but a name that is often forgotten in this discussion is Frankie Thomas. Frankie was a vulnerable teenager with childhood trauma, functioning autism and impulsivity. After reading a story about self-harm on the app Wattpad, according to the coroner’s inquest, she went home and undertook

“a similar act, resulting in her death”.

I do not need to repeat the many tragic examples that have already been shared in this House, but I want to reiterate the point already made by the BMA in its very helpful briefing on these amendments: viewing self-harm and suicide content online can severely harm the user offline. As I said last week when we were debating the user empowerment tools, this type of content literally has life or death repercussions. It is therefore essential that the Bill takes this sort of content more seriously and creates specific duties for services to adhere to.

We will, at some point this evening—I hope—come on to debate the next group of amendments. The question for Ministers to answer on this group, the next one and others that we will be debating is why, where we know that content is harmful to individuals and to broader society, the Government do not want to take the step of setting out how that content should be properly regulated. I think it all comes from their desire to draw a distinction between content that is illegal and content that is not illegal but is undoubtedly, in the eyes of pretty well every citizen, deeply harmful. As we have already heard from the noble Baroness, and as we heard last week, adults do not become immune to suicide and self-harm content the minute they turn 18. In fact, I would argue that no adult is immune to the negative effects of viewing this type of content online.

This amendment, therefore, is very important, as it would create a duty for providers of regulated user-to-user services and search engines to manage harmful suicide or self-harm content applicable to both children and adults, recognising this cliff edge otherwise in the Bill, which we have already talked about. I strongly urge noble Lords, particularly the Minister, to agree that protecting users from this content is one of the most important things that the Bill can do. People outside this House are looking to us to do this, so I urge the Government to support this amendment today.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Parliament Live - Hansard - - - Excerpts

My Lords, I am pleased that we have an opportunity, in this group of amendments, to talk about suicide and self-harm content, given the importance of it. It is important to set out what we expect to happen with this legislation. I rise particularly to support Amendment 225, to which my noble friend Lady Parminter added her name. I am doing this more because the way in which this kind of content is shared is incredibly complex, rather than simply because of the question of whether it is legal or illegal.

--- Later in debate ---
Moved by
97: Clause 36, page 36, line 42, at end insert “including a code of practice describing measures for the purpose of compliance with the relevant duties so far as relating to violence against women and girls.”
Member’s explanatory statement
This amendment would impose an express obligation on OFCOM to issue a code of practice on violence against women and girls rather than leaving it to OFCOM’s discretion. This would ensure that Part 3 providers recognise the many manifestations of online violence, including illegal content, that disproportionately affect women and girls.
Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Parliament Live - Hansard - -

My Lords, it is a great pleasure to move Amendment 97 and speak to Amendment 304, both standing in my name and supported by the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Gloucester and the noble Lord, Lord Knight of Weymouth. I am very grateful for their support. I look forward to hearing the arguments by the noble Lord, Lord Stevenson, for Amendment 104 as well, which run in a similar vein.

These amendments are also supported by the Domestic Abuse Commissioner, the Revenge Porn Helpline, BT, EE and more than 100,000 UK citizens who have signed End Violence Against Women’s petition urging the Government to better protect women and girls in the Bill.

I am also very grateful to the noble Baroness, Lady Foster of Aghadrumsee—I know I pronounced that incorrectly—the very distinguished former Northern Ireland politician. She cannot be here to speak today in favour of the amendment but asked me to put on record her support for it.

I also offer my gratitude to the End Violence Against Women Coalition, Glitch, Refuge, Carnegie UK, NSPCC, 5Rights, Professor Clare McGlynn and Professor Lorna Woods. Between them all, they created the draft violence against women and girls code of practice many months ago, proving that a VAWG code of practice is not only necessary but absolutely deliverable.

Much has already been said on this, both here and outside the Chamber. In the time available, I will focus my case for these amendments on two very specific points. The first is why VAWG, violence against women and girls, should have a specific code of practice legislated for it, rather than other content we might debate. The second is what having a code of practice means in relation to the management of that content.

Ofcom has already published masses of research showing that abuse online is gendered. The Government’s own fact sheet, sent to us before these debates, said that women and girls experience disproportionate levels of abuse online. They experience a vast array of abuse online because of their gender, including cyberflashing, harassment, rape threats and stalking. As we have already heard and will continue to hear in these debates, some of those offences and abuse reach a criminal threshold and some do not. That is at the heart of this debate.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Parliament Live - Hansard - - - Excerpts

My Lords, protecting women and girls is a priority for His Majesty’s Government, at home, on our streets and online. This Bill will provide vital protections for women and girls, ensuring that companies take action to improve their safety online and protect their freedom of expression so that they can continue to play their part online, as well as offline, in our society.

On Amendments 97 and 304, tabled by my noble friend Lady Morgan of Cotes, I want to be unequivocal: all service providers must understand the systemic risks facing women and girls through their illegal content and child safety risk assessments. They must then put in place measures that manage and mitigate these risks. Ofcom’s codes of practice will set out how companies can comply with their duties in the Bill.

I assure noble Lords that the codes will cover protections against violence against women and girls. In accordance with the safety duties, the codes will set out how companies should tackle illegal content and activity confronting women and girls online. This includes the several crimes that we have listed as priority offences, which we know are predominantly perpetrated against women and girls. The codes will also cover how companies should tackle harmful online behaviour and content towards girls.

Companies will be required to implement systems and processes designed to prevent people encountering priority illegal content and minimise the length of time for which any such content is present. In addition, Ofcom will be required to carry out broad consultation when drafting codes of practice to harness expert opinions on how companies can address the most serious online risks, including those facing women and girls. Many of the examples that noble Lords gave in their speeches are indeed reprehensible. The noble Baroness, Lady Kidron, talked about rape threats and threats of violence. These, of course, are examples of priority illegal content and companies will have to remove and prevent them.

My noble friend Lady Morgan suggested that the Bill misses out the specific course of conduct that offences in this area can have. Clause 9 contains provisions to ensure that services

“mitigate and manage the risk of the service being used for the commission or facilitation of”

an offence. This would capture patterns of behaviour. In addition, Schedule 7 contains several course of conduct offences, including controlling and coercive behaviour, and harassment. The codes will set out how companies must tackle these offences where this content contributes to a course of conduct that might lead to these offences.

To ensure that women’s and girls’ voices are heard in all this, the Bill will, as the right reverend Prelate noted, make it a statutory requirement for Ofcom to consult the Victims’ Commissioner and the Domestic Abuse Commissioner about the formation of the codes of practice. As outlined, the existing illegal content, child safety and child sexual abuse and exploitation codes will already cover protections for women and girls. Creating a separate code dealing specifically with violence against women and girls would mean transposing or duplicating measures from these in a separate code.

In its recent communication to your Lordships, Ofcom stated that it will be consulting quickly on the draft illegal content and child sexual abuse and exploitation codes, and has been clear that it has already started the preparatory work for these. If Ofcom were required to create a separate code on violence against women and girls this preparatory work would need to be revised, with the inevitable consequence of slowing down the implementation of these vital protections.

An additional stand-alone code would also be duplicative and could cause problems with interpretation and uncertainty for Ofcom and providers. Linked to this, the simpler the approach to the codes, the higher the rates of compliance are likely to be. The more codes there are covering specific single duties, the more complicated it will be for providers, which will have to refer to multiple different codes, and the harder for businesses to put in place the right protections for users. Noble Lords have said repeatedly that this is a complex Bill, and this is an area where I suggest we should not make it more complex still.

As the Bill is currently drafted, Ofcom is able to draft codes in a way that addresses a range of interrelated risks affecting different groups of users, such as people affected in more than one way; a number of noble Lords dealt with that in their contributions. For example, combining the measures that companies can take to tackle illegal content targeting women and girls with the measures they can take to tackle racist abuse online could ensure a more comprehensive and effective approach that recognises the point, which a number of noble Lords made, that people with more than one protected characteristic under the Equality Act may be at compound risk of harm. If the Bill stipulated that Ofcom separate the offences that disproportionately affect women and girls from other offences in Schedule 7, this comprehensive approach to tackling violence against women and girls online could be lost.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Parliament Live - Hansard - -

Could my noble friend the Minister confirm something? I am getting rather confused by what he is saying. Is it the case that there will be just one mega code of practice to deal with every single problem, or will there be lots of different codes of practice to deal with the problems? I am sure the tech platforms will have sufficient people to be able to deal with them. My understanding is that Ofcom said that, while the Bill might not mandate a code of practice on violence against women and girls, it would in due course be happy to look at it. Is that right, or is my noble friend the Minister saying that Ofcom will never produce a code of practice on violence against women and girls?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Parliament Live - Hansard - - - Excerpts

It is up to Ofcom to decide how to set the codes out. What I am saying is that the codes deal with specific categories of threat or problem—illegal content, child safety content, child sexual abuse and exploitation—rather than with specific audiences who are affected by these sorts of problems. There is a circularity here in some of the criticism that we are not reflecting the fact that there are compound harms to people affected in more than one way and then saying that we should have a separate code dealing with one particular group of people because of one particular characteristic. We are trying to deal with categories of harm that we know disproportionately affect women and girls but which of course could affect others, as the noble Baroness rightly noted. Amendment 304—

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

There are no codes designed for Jewish people, Muslim people or people of colour, even though we know that they are disproportionately affected by some of these harms as well. The approach taken is to tackle the problems, which we know disproportionately affect all of those groups of people and many more, by focusing on the harms rather than the recipients of the harm.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Hansard - -

Can I check something with my noble friend? This is where the illogicality is. The Government have mandated in the Strategic Policing Requirement that violence against women and girls is a national threat. I do not disagree with him that other groups of people will absolutely suffer abuse and online violence, but the Government themselves have said that violence against women and girls is a national threat. I understand that my noble friend has the speaking notes, the brief and everything else, so I am not sure how far we will get on this tonight, but, given the Home Office stance on it, I think that to say that this is not a specific threat would be a mistake.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

With respect, I do not think that that is a perfect comparison. The Strategic Policing Requirement is an operational policing document intended for chief constables and police and crime commissioners in the important work that they do, to make sure they have due regard for national threats as identified by the Home Secretary. It is not something designed for commercial technology companies. The approach we are taking in the Bill is to address harms that can affect all people and which we know disproportionately affect women and girls, and harms that we know disproportionately affect other groups of people as well.

We have made changes to the Bill: the consultation with the Victims’ Commissioner and the Domestic Abuse Commissioner, and the introduction of specific offences to deal with cyberflashing and other sorts of particular harms, which we know disproportionately affect women and girls. We are taking an approach throughout the work of the Bill to reflect those harms and to deal with them. Because of that, respectfully, I do not think we need a specific code of practice for any particular group of people, however large and however disproportionately they are affected. I will say a bit more about our approach. I have said throughout, including at Second Reading, and my right honourable friend the Secretary of State has been very clear in another place as well, that the voices of women and girls have been heard very strongly and have influenced the approach that we have taken in the Bill. I am very happy to keep talking to noble Lords about it, but I do not think that the code my noble friend sets out is the right way to go about solving this issue.

Amendment 304 seeks to adopt the Istanbul convention definition of violence against women and girls. The Government are already compliant with the Convention on Preventing and Combating Violence Against Women and Domestic Violence, which was ratified last year. However, we are unable to include the convention’s definition of violence against women and girls in the Bill, as it extends to legal content and activity that is not in scope of the Bill as drafted. Using that definition would therefore cause legal uncertainty for companies. It would not be appropriate for the Government to require companies to remove legal content accessed by adults who choose to access it. Instead, as noble Lords know, the Government have brought in new duties to improve services’ transparency and accountability.

Amendment 104 in the name of the noble Lord, Lord Stevenson, seeks to require user-to-user services to provide a higher standard of protection for women, girls and vulnerable adults than for other adults. The Bill already places duties on service providers and Ofcom to prioritise responding to content and activity that presents the highest risk of harm to users. This includes users who are particularly affected by online abuse, such as women, girls and vulnerable adults. In overseeing the framework, Ofcom must ensure that there are adequate protections for those who are most vulnerable to harm online. In doing so, Ofcom will be guided by its existing duties under the Communications Act, which requires it to have regard when performing its duties to the

“vulnerability of children and of others whose circumstances appear to OFCOM to put them in need of special protection”.

The Bill also amends Ofcom’s general duties under the Communications Act to require that Ofcom, when carrying out its functions, considers the risks that all members of the public face online, and ensures that they are adequately protected from harm. This will form part of Ofcom’s principal duty and will apply to the way that Ofcom performs all its functions, including when producing codes of practice.

In addition, providers’ illegal content and child safety risk assessment duties, as well as Ofcom’s sectoral risk assessment duties, require them to understand the risk of harm to users on their services. In doing so, they must consider the user base. This will ensure that services identify any specific risks facing women, girls or other vulnerable groups of people.

As I have mentioned, the Bill will require companies to prioritise responding to online activity that poses the greatest risk of harm, including where this is linked to vulnerability. Vulnerability is very broad. The threshold at which somebody may arguably become vulnerable is subjective, context-dependent and may be temporary. The majority of UK adult users could be defined as vulnerable in particular circumstances. In practice, this would be very challenging for Ofcom to interpret if it were added to the safety objectives in this way. The existing approach allows greater flexibility so that companies and Ofcom can focus on the greatest threats to different groups of people at any given time. This allows the Bill to adapt to and keep pace with changing risk patterns that may affect different groups of people.

--- Later in debate ---
I hope that I have given some reassurance that the Bill covers the sort of violent content about which noble Lords are rightly concerned, no matter against whom it is directed. The Government recognise that many of these offences and much of the violence does disproportionately affect women and girls in the way that has been correctly pointed out. We have reflected this in the way in which the Bill and its regulatory framework are to operate. I am happy to keep discussing this matter with my noble friend. She is right that it is important, but I hope that, at this juncture, she will be content to withdraw her amendment.
Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Hansard - -

My Lords, I thank my noble friend for his response, which I will come on to in a moment. This has been a fascinating debate. Yet again, it has gone to the heart of some of the issues with this Bill. I thank all noble Lords who have spoken, even though I did not quite agree with everything they said. It is good that this Committee shows just how seriously it takes the issue of violence against women and girls. I particularly thank all those who are watching from outside. This issue is important to so many.

There is no time to run through all the brilliant contributions that have been made. I thank the right reverend Prelate the Bishop of Gloucester for her support. She made the point that, these days, for most people, there is no online/offline distinction. To answer one of the points made, we sometimes see violence or abuse that starts online and then translates into the offline world. Teachers in particular are saying that this is the sort of misogyny they are seeing in classrooms.

As the noble Baroness, Lady Merron, said, the onus should not be on women and girls to remove themselves from online spaces. I also thank the noble Baronesses, Lady Kidron and Lady Gohir, for their support. The noble Baroness, Lady Kidron, talked about the toxic levels of online violence. Parliament needs to say that this is not okay—which means that we will carry on with this debate.

I thank the noble Baroness, Lady Healy, for her contribution. She illustrated so well why a code of practice is needed. We can obviously discuss this, but I do not think the Minister is quite right about the user reporting element. For example, we have heard various women speaking out who have had multiple rape threats. At the moment, the platforms require each one to be reported individually. They do not put them together and then work out the scale of threat against a particular user. I am afraid that this sort of threat would not breach the illegal content threshold and therefore would not be caught by the Bill, despite what the Minister has been saying.

I agree with my noble friend Lady Stowell. I would love to see basic standards—I think she called it “civility”—and a better society between men and women. One of the things that attracts me most to the code of practice is that it seeks cultural and societal changes—not just whack-a-mole with individual offences but changing the whole online culture to build a healthier and better society.

I will certainly take up the Minister’s offer of a meeting. His response was disappointing. There was no logic to it at all. He said that the voice of women and girls is heard throughout the Bill. How can this be the case when the very phrase “women and girls” is not mentioned in 262 pages? Some 100,000 people outside this Chamber disagree with his position on the need for a code of practice. I say to both Ofcom and the tech platforms that a code has been drafted. Please do not do the “Not drafted here; we’re not going to adopt it”. It is there, the work has been done and it can easily be taken on.

I would be delighted to discuss the definition in Amendment 304 with my noble friend. I will of course withdraw my amendment tonight, but we will certainly return to this on Report.

Amendment 97 withdrawn.