All 5 Baroness Bull contributions to the Online Safety Act 2023


- Wed 1st Feb 2023 — Online Safety Bill
- Tue 9th May 2023 — Online Safety Bill, Lords Chamber (Committee stage: Part 1)
- Tue 9th May 2023 — Online Safety Bill, Lords Chamber (Committee stage: Part 2)
- Thu 25th May 2023 — Online Safety Bill, Lords Chamber (Committee stage: Part 2)
- Mon 17th Jul 2023 — Online Safety Bill

Online Safety Bill

Baroness Bull Excerpts
Baroness Bull (CB)

My Lords, no one who has heard Molly Russell’s story can be in any doubt about the need to better protect young people online, and I join others in paying tribute to her family for their tireless campaign.

As we have heard, vulnerability online does not evaporate on turning 18. Some adults will be at risk because mental illness, disability, autism, learning disabilities or even age leaves them unable to protect themselves from harm. Others will be vulnerable only at certain times, or in relation to specific issues. The “legal but harmful” provisions were not perfect, but stripping out adult safety duties—when, as the Minister himself said, three-quarters of adults are fearful of going online—is a backward step.

With category 1 services no longer required to assess risks to adults, it is hard to agree when the Minister says this will be

“a regulatory regime which has safety at its heart”.

Without risk assessments, how will platforms work out what they need to include in their terms and conditions? How will users make informed choices? How will the effectiveness of user empowerment tools be measured? Without the real-time information that risk assessments provide, how will the regulator stay on top of new risks, and advise the Secretary of State accordingly?

Instead, the Bill sets out duties for category 1 services to write and enforce their own terms and conditions—they will be “author, judge and jury”, to quote my noble friend Lady Kidron—and to provide tools that empower adult users to increase control over types of content listed at Clause 12. Harms arise and spread quickly online, yet this list is static, and it has significant gaps already. Harmful or false health content is missing, as are harms relating to body image, despite evidence linking body shaming to eating disorders, self-harm and suicide ideation. Smaller sites that target specific vulnerabilities, including suicide forums, would fall outside scope of these duties.

Describing this list as “content over which users may wish to increase control” is euphemism at its best. This is not content some might consider in poor taste, or a bit off-colour. This is content encouraging or promoting suicide, self-harm and eating disorders. It is content that is abusive or incites hate on the basis of race, ethnicity, religion, disability, sex, gender, sexual orientation and misogyny, which evidence connects directly to violence against women and girls.

And yet tools to hide this content will be off by default, meaning that people at the point of crisis, those seeking advice on self-harm or starvation, will need to find and activate those settings when they may well be in an affected mental state that leaves them unable to self-protect. The complexities of addiction and eating disorders disempower choice, undermining the very basis on which Clause 12 is built.

We heard it said today that all adults, given the tools, are capable of protecting themselves from online abuse and harm. This is just not true. Of course, many adults are fortunate to be able to do so, but as my noble and expert friends Lady Hollins and Lady Finlay explained, there are many adults who, for reasons of vulnerability or capacity, cannot do so. Requiring the tools to be on by default would protect adults at risk and cause no hardship whatever to those who are not: a rational adult will be as capable of finding the off button as the one that turns them on.

Last week, Ministers defended the current approach on the basis that failing to give all users equal access to all material constitutes a chilling effect on freedom of expression. It is surely more chilling that this Bill introduces a regime in which content promoting suicide, self-harm, or racist and misogynistic abuse is deemed acceptable, and is openly available, harming some but influencing many, as long as the platform in question gives users an option to turn it off. This cannot be right, and I very much hope Ministers will go back and reconsider.

When the Government committed to making the UK the safest place in the world to be online, I find it hard to believe that this is the environment that they had in mind.

Online Safety Bill

Baroness Bull Excerpts
Baroness Buscombe (Con)

I agree. The small list of individual items is the danger.

Baroness Bull (CB)

My Lords, I support the noble Baroness, Lady Buscombe, on the built-in obsolescence of any list. It would very soon be out of date.

I support the amendments tabled by the noble Lord, Lord Clement-Jones, and by the noble Baroness, Lady Morgan of Cotes. They effectively seek a similar aim. Like the noble Baroness, Lady Fraser, I tend towards those tabled by the noble Lord, Lord Clement-Jones, because they seem clearer and more inclusive, but I understand that they are trying for the same thing. I also register the support for this aim of my noble friend Lady Campbell of Surbiton, who cannot be here but whom I suspect is listening in. She was very keen that her support for this aim was recorded.

The issue of “on by default” inevitably came up at Second Reading. Then and in subsequent discussions, the Minister reiterated that a “default on” approach to user empowerment tools would negatively impact people’s use of these services. Speaking at your Lordships’ Communications and Digital Committee, on which I sat at the time, Minister Scully went further, saying that the strongest option, of having the settings off in the first instance,

“would be an automatic shield against people’s ability to explore what they want to explore on the internet”.

According to the Government’s own list, this was arguing for the ability to explore content that abuses, targets or incites hatred against people with protected characteristics, including race and disability. I struggle to understand why protecting this right takes precedence over ensuring that groups of people with protected characteristics are, well, protected. That is our responsibility. It is precedence, because switching controls one way is not exactly the same as switching them the other way. It is easy to think so, but the noble Baroness, Lady Parminter, explained very clearly that it is not the same. It is undoubtedly easier for someone in good health and without mental or physical disabilities to switch controls off than it is for those with disabilities or vulnerabilities to switch them on. That is self-evident.

It cannot be right that those most at risk of being targeted online, including some disabled people—not all, as we have heard—and those with other protected characteristics, will have the onus on them to switch on the tools to prevent them seeing and experiencing harm. There is a real risk that those who are meant to benefit from user empowerment tools, those groups at higher risk of online harm, including people with a learning disability, will not be able to access the tools because the duties allow category 1 services to design their own user empowerment tools. This means that we are likely to see as many versions of user empowerment tools as there are category 1 services to which this duty applies.

Given what we know about the nature of addiction and self-harm, which has already been very eloquently explained, it surely cannot be the intention of the Bill that those people who are in crisis and vulnerable to eating disorders or self-harm, for example, will be required to seek and activate a set of tools to turn off the very material that feeds their addiction or encourages their appetite for self-harm.

The approach in the Bill does little to prevent people spiralling down this rabbit hole towards ever more harmful content. Indeed, instead it requires people to know that they are approaching a crisis point, and to have sufficient levels of resilience and rationality to locate the switch and turn on the tools that will protect them. That is not how the irrational or distressed mind works.

So, all the evidence that we have about the existence of harm which arises from mental states, which has been so eloquently set out in introducing the amendments— I refer again to my noble friend Lady Parminter, because that is such powerful evidence—tips the balance in favour, I believe, of setting the tools to be on by default. I very much hope the Minister will listen and heed the arguments we have heard set out by noble Lords across the Committee, and come back with some of his own amendments on Report.

Baroness Fox of Buckley (Non-Afl)

Before the noble Baroness sits down, I wanted to ask for clarification, because I am genuinely confused. When it comes to political rights for adults in terms of their agency, they are rights which we assume are able to be implemented by everyone. But we recognise that in the adult community—this is offline now; I mean in terms of how we understand political rights—there may well be people who lack capacity or are vulnerable, and we take that into account. But we do not generally organise political rights and access to, for example, voting or free speech around the most vulnerable in society. That is not because we are insensitive or inhumane, or do not understand. The moving testimonies we have heard about people with eating disorders and so on are absolutely spot-on accurate. But are we suggesting that the world online should be organised around vulnerable adults, rather than adults and their political rights?

Baroness Bull (CB)

I do not have all the answers, but I do think we heard a very powerful point from the right reverend Prelate. In doing the same for everybody, we do not ensure equality. We need to have varying approaches, in order that everybody has equality of access. As the Bill stands, it says nothing about vulnerable adults. It simply assumes that all adults have full capacity, and I think what these amendments seek to do is find a way to recognise that simply thinking about children, and then that everybody aged 18 is absolutely able to take care of themselves and, if I may say, “suck it up”, is not the world we live in. We can surely do better than that.

Online Safety Bill

Baroness Bull Excerpts
My second question is similar. We see now around the world—it is not available in the UK—that Meta has a verified subscription, for which you can pay around $15 per month. It is being piloted in the US as we speak. Again, I ask whether that satisfies the duty in terms of it being affordable to the average UK user. I am concerned that most UK social media users will not be able to afford £180 per social media account for verification. If that ends up being Meta’s UK offering, many users would not be given a proper, meaningful chance to be verified. What powers are there in the Bill for Ofcom to send Meta back and offer something else? So my questions really are about what “verified” means in terms of the Bill.
Baroness Bull (CB)

My Lords, I rise to speak to Amendment 141 in the names of the noble Lords, Lord Stevenson and Lord Clement-Jones. Once again, I register the support of my noble friend Lady Campbell of Surbiton, who feels very strongly about this issue.

Of course, there is value in transparency online, but anonymity can be vital for certain groups of people, such as those suffering domestic abuse, those seeking help or advice on matters they wish to remain confidential, or those who face significant levels of hatred or prejudice because of who they are, how they live or what they believe in. Striking the right balance is essential, but it is equally important that everyone who wishes to verify their identity and access the additional protections that this affords can do so easily and effectively, and that this opportunity is open to all.

Clause 57 requires providers of category 1 services to offer users the option to verify their identity, but it is up to providers to decide what form of verification to offer. Under subsection (2) it can be “of any kind”, and it need not require any documentation. Under subsection (3), the terms of service must include a “clear and accessible” explanation of how the process works and what form of verification is available. However, this phrase in itself is open to interpretation: clear and accessible for one group may be unclear and inaccessible to another. Charities including Mencap are concerned that groups, such as people with a learning disability, could be locked out of using these tools.

It is also relevant that people with a learning disability are less likely to own forms of photographic ID such as passports or driving licences. Should a platform require this type of ID, large numbers of people with a learning disability would be denied access. In addition, providing an email or phone number and verifying this through an authentication process could be extremely challenging for those people who do not have the support in place to help them navigate this process. This further disadvantages groups of people who already suffer some of the most extensive restrictions in living their everyday lives.

Clause 58 places a duty on Ofcom to provide guidance to help providers comply with their duty, but this guidance is optional. Amendment 141 aims to strengthen Clause 58 by requiring Ofcom to set baseline principles and standards for the guidance. It would ensure, for example, that the guidance considers accessibility for disabled as well as vulnerable adults and aligns with relevant guidance on related matters such as age verification; it would ensure that verification processes are effective; and it would ensure that the interests of disabled users are covered in Ofcom’s pre-guidance consultation.

The online world can be a lifeline for disabled and vulnerable adults, providing access to support, advice and communities of interest, and this is particularly important as services in the real world are diminishing. We therefore need to ensure that user-verification processes do not act as a further barrier to inclusion for people with protected characteristics, especially those with learning disabilities.

Baroness Fox of Buckley (Non-Afl)

My Lords, the speech of the noble Baroness, Lady Buscombe, raised so many of the challenges that people face online, and I am sure that the masses who are watching parliamentlive as we speak, even if they are not in here, will recognise what she was talking about. Certainly, some of the animal rights activists can be a scourge, but I would not want to confine this to them, because I think trashing reputations online and false allegations have become the activists’ chosen weapon these days. One way that I describe cancel culture, as distinct from no-platforming, is that it takes the form of some terrible things being said about people online, a lot of trolling, things going viral and using the online world to lobby employers to get people sacked, and so on. It is a familiar story, and it can be incredibly unpleasant. The noble Baroness and those she described have my sympathy, but I disagree with her remedy.

An interesting thing is that a lot of those activities are not carried out by those who are anonymous. It is striking that a huge number of people with large accounts, well-known public figures with hundreds of thousands of followers—sometimes with more than a million—are prepared to do exactly what I described in plain sight, often to me. I have thought long and hard about this, because I really wanted to use this opportunity to read out a list and name and shame them, but I have decided that, when they go low, I will try to go at least a little higher. But subtweeting and twitchhunts are an issue, and one reason why we think we need an online harms Bill. As I said, I know that sometimes it can feel that if people are anonymous, they will say things that they would not say to your face or if you knew who they were, but I think it is more the distance of being online: even when you know who they are, they will say it to you or about you online, and then when you see them at the drinks reception, they scuttle away.

My main objection, however, to the amendment of the noble Baroness, Lady Buscombe, and the whole question of anonymity in general is that it treats anonymity as though it is inherently unsafe. There is a worry, more broadly on verification, about creating two tiers of users: those who are willing to be verified and those who are not, and those who are not somehow having a cloud of suspicion over them. There is a danger that undermining online anonymity in the UK could set a terrible precedent, likely to be emulated by authoritarian Governments in other jurisdictions, and that is something we must bear in mind.

On evidence, I was interested in Big Brother Watch’s report on some analysis by the New Statesman, which showed that there is little evidence to suggest that anonymity itself makes online discourse more febrile. It did an assessment involving tweets sent to parliamentarians since January 2021, and said there was

“little discernible difference in the nature or tone of the tweets that MPs received from anonymous or non-anonymous accounts. While 32 per cent of tweets from anonymous accounts were classed as angry according to the metric used by the New Statesman, so too were 30 per cent of tweets from accounts with full names attached. Similarly, 5.6 per cent of tweets from anonymous accounts included swear words, only slightly higher than the figure of 5.3 per cent for named accounts.”

It went through various metrics, but it said, “slightly higher, not much of a difference”. That is to be borne in mind: the evidence is not there.

In this whole debate, I have wanted to emphasise freedom as at least equal to, if not of greater value than, the safetyism of this Bill, but in this instance, I will say that, as the noble Baroness, Lady Bull, said, for some people anonymity is an important safety mechanism. It is a tool in the armoury of those who want to fight the powerful. It can be anyone: for young people experimenting with their sexuality and not out, it gives them the freedom to explore that. It can be, as was mentioned, survivors of sexual violence or domestic abuse. It is certainly crucial to the work of journalists, civil liberties activists and whistleblowers in the UK and around the world. Many of the Iranian women’s accounts are anonymous: they are not using their correct names. The same is true of Hong Kong activists; I could go on.

Anyway, in our concerns about the Bill, compulsory identity verification means being forced to share personal data, so there is a privacy issue for everyone, not just the heroic civil liberties people. In a way, it is your own business why you are anonymous—that is the point I am trying to make.

There are so many toxic issues at the moment that a lot of people cannot just come out. I know I often mention the gender-critical issue, but it is true that in many professions, you cannot give your real name or you will not just be socially ostracised but potentially jeopardise your career. I wrote an article during the 2016-17 days called Meet the Secret Brexiteers. It was true that many teachers and professors I knew who voted to leave had to be anonymous online or they would not have survived the cull.

Finally, I do not think that online anonymity or pseudonymity is a barrier to tracking down and prosecuting those who commit the kind of criminal activity on the internet described, creating some of the issues we are facing. Police reports show that in 2017-18, 96% of attempts by public authorities to identify anonymous users of social media accounts, their email addresses and telephone numbers resulted in successful identification of the suspect in the investigation. In other words, the police already have a range of intrusive powers to track down individuals, should there be a criminal problem, and the Investigatory Powers Act 2016 allows the police to acquire communications data—for example, email addresses or the location of a device—from which alleged illegal anonymous activity is conducted and use it as evidence in court.

If it is not illegal but just unpleasant, I am afraid that is the world we live in. I would argue that what we require in febrile times such as these is not bans or setting the police on people but to set the example of civil discourse, have more speech and show that free speech is a way of conducting disagreement and argument without trashing reputations.

--- Later in debate ---
Lord Stevenson of Balmacara (Lab)

My Lords, I have slightly abused my position because, as the noble Baroness has just said, this is a rather oddly constructed group. My amendments, which carve great chunks out of the Bill—or would do if I get away with it—do not quite point in the same direction as the very good speech the noble Baroness made, representing of course the view of the committee that she chairs so brilliantly. She also picked out one or two points of her own, which we also want to debate. It therefore might be easier if I just explain what I was trying to do in my amendments; then I will sit down and let the debate go, and maybe come back to pick up one or two threads at the end.

In previous Bills—and I have seen a lot of them—people who stand up and move clause stand part debates usually have a deeper and more worrying purpose behind the proposition. Either they have not read the Bill and are just trying to wing it, or they have a plan that is so complex and deep that it would probably need another whole Bill to sort it out. This is neither of those approaches; it is done because I want to represent the views mainly of the Joint Committee. We had quite a lot of debate in that committee about this area, beginning with the question about why the Bill—or the White Paper or draft Bill, at that stage—used the term “democratic importance” when many people would have used the parallel term “public interest” to try to reflect the need to ensure that matters which are of public good take place as a result of publication, or discussion and debate, or on online platforms. I am very grateful that the noble Lord, Lord Black, is able to be with us today. I am sure he will recall those debates, and hopefully he will make a comment on some of the work—and other members of the committee are also present.

To be clear, the question of whether Clauses 13, 14, 15 and 18 should stand part of the Bill is meant to release space for a new clause in Amendment 48. It is basically designed to try to focus the actions that are going to be taken by the Bill, and subsequently by the regulator, to ensure that the social media companies that are affected by, or in scope of, the Bill use, as a focus, some of the issues mainly related to “not taking down” and providing an appeal mechanism for journalistic material, whether that is provided by recognised news publishers or some other form of words that we can use, or it is done by recognised journalists. “Contentious” is an overused word, but all these terms are difficult to square away and be happy with, and therefore we should have the debate and perhaps reflect on that later when we come back to it.

The committee spent quite a lot of time on this, and there are two things that exercised our minds when we were working on this area. First, if one uses “content of democratic importance”, although it is in many ways quite a clever use of words to reflect a sensibility that you want to have an open and well-founded debate about matters which affect the health of our democracy, it can be read as being quite limiting. It is very hard to express—I am arguing against myself here—in the words of a piece of legislation what it is we are trying to get down to, but, in the course of the committee’s work, we received evidence that the definition of content of democratic importance was wider, or more capable of being interpreted as wider, than the scope the Government seem to have indicated. So there is both a good side and a bad side to this. If we are talking about content which is, or appears to be, specifically intended to contribute to the democratic political debate of the United Kingdom, or a part or area of the United Kingdom, we have got to ask the Minister to put on the record that this is also inclusive of matters which perhaps initially do not appear necessarily to be part of it, but include public health, crime, justice, the environment, professional malpractice, the activities of large corporations and the hypocrisy of public figures when that occurs. I am not suggesting this is what we should be doing all the time, but these are things we often read about in our papers, and much the better off we are for it. However, if these things are not inclusive and not well rooted in the phrase “content of democratic importance”, it is up to the Government to come forward with a better way of expressing that, or perhaps in debate we can find it together.

I have some narrow questions. Are we agreed that what is currently in the Bill is intended specifically to contribute to democratic political debate, and is anything more needed to be said or done in order to make sure that happens? Secondly, the breadth of democratic political debate is obviously important; are there any issues here that are going to trip us up later when the Government come back and say, “Well, that wasn’t what we meant at all, and that doesn’t get covered, and therefore that stuff can be taken down, and that stuff there doesn’t have to be subject to repeal”? Are there contexts and subjects which we need to talk about? This is a long way into the question of content of democratic importance being similar or limited to matters that one recognises as relating to public interest. I think there is a case to be argued for the replacement of what is currently in the Bill with a way of trying to get closer to what we now recognise as being the standard form of debate and discussion when matters, which either the Government of the day or people individually do not like, get taken up and made the subject of legal discussion, because we do have discussions about whether or not it is in the public interest.

We probably do not know what that means. Therefore, a third part of my argument is that perhaps this is the point at which we try to define this, even though that might cause a lot of reaction from those currently in the press. In a sense, it is a question that needs to be resolved. Maybe this is or is not the right time to do that. Are the Government on the same page as the Joint Committee on this? Do they have an alternative and is this what they are trying to get across in the Bill?

Can we have a debate and discussion in relation to those things, making it clear that we want something in the Bill ensuring that vibrant political debate—the sort of things the noble Baroness was talking about on freedom of expression, but in a broader sense covering all the things that matter to the body politic, the people of this country—is not excluded by the Bill? That was the reason for putting down a raft of rather aggressive amendments. I hope it has been made clear that that was the case. I have other things that I would like to come back to, but I will probably do that towards the end of the debate. I hope that has been helpful.

Baroness Bull (CB)

My Lords, I will speak to the amendments in the name of the noble Baroness, Lady Stowell, to which I have added my name. As we heard, the amendments originally sat in a different group, on the treatment of legal content accessed by adults. Noble Lords will be aware from my previous comments that my primary focus for the Bill has been on the absence of adequate provisions for the protection of adults, particularly those who are most vulnerable. These concerns underpin the brief remarks I will make.

The fundamental challenge at the heart of the Bill is the need to balance protection with the right to freedom of expression. The challenge, of course, is how. The noble Baroness’s amendments seek to find that balance. They go beyond the requirements on transparency reporting in Clause 68 in several ways. Amendment 46 would provide a duty for category 1 services to maintain an up-to-date document for users of the service, ensuring that users understand the risks they face and how, for instance, user empowerment tools can be used to help mitigate these risks. It also provides a duty for category 1 services to update their risk assessments before making any “significant change” to the design or operation of their service. This would force category 1 services to consider the impact of changes on users’ safety and make users aware of changes before they happen, so that they can take any steps necessary to protect themselves and prepare for them. Amendment 47 provides additional transparency by providing a duty for category 1 services to release a public statement of the findings of the most recent risk assessment, which includes any impact on freedom of expression.

The grouping of these amendments is an indication, if any of us were in doubt, of the complexity of balancing the rights of one group against the rights of another. Regardless of the groupings, I hope that the Minister takes note of the breadth and depth of concerns, as well as the willingness across all sides of the Committee to work together on a solution to this important issue.

Viscount Colville of Culross (CB)

My Lords, I put my name to Amendment 51, which is also in the name of the noble Lords, Lord Stevenson and Lord McNally. I have done so because I think Clause 15 is too broad and too vague. I declare an interest, having been a journalist for my entire career. I am currently a series producer of a series of programmes on Ukraine.

This clause allows journalism on the internet to be defined simply as the dissemination of information, which surely covers all posts on the internet. Anyone can claim that they are a journalist if that is the definition. My concern is that it will make a nonsense of the Bill if all content is covered as journalism.

I support the aims behind the clause to protect journalism in line with Article 10. However, I am also aware of the second part of Article 10, which warns that freedom of speech must be balanced by duties and responsibilities in a democratic society. This amendment aims to hone the definition of journalism to that which is in the public interest. In doing so, I hope it will respond to the demands of the second part of Article 10.

It has never been more important to create this definition of journalism in the public interest. We are seeing legacy journalism of newspapers and linear television being supplanted by digital journalism. Both legacy and new journalism need to be protected. This can be a single citizen journalist, or an organisation like Bellingcat, which draws on millions of digital datapoints to create astonishing digital journalism to prove things such as that Russian separatist fighters shot down flight MH17 over Ukraine.

The Government’s view is that the definition of “in the public interest” is too vague to be useful to tech platforms when they are systematically filtering through possible journalistic content that needs to be protected. I do not agree. The term “public interest” is well known to the courts from the Defamation Act 2013. The law covers the motivation of a journalist, but does not go on to define the content of journalism to prove that it is in the public interest.

Online Safety Bill

Lord Griffiths of Burry Port (Lab)

My Lords, I will speak to Amendment 192A. There can be nothing more comfortable within the terms of parliamentary debate than to find oneself cossetted by the noble Baroness, Lady Morgan, on one side and my noble friend Lord Stevenson on the other. I make no apology for repeating the thrust of the argument of the noble Baroness, but I will narrow the focus to matters that she hinted at which we need to think about in a particular way.

We have already debated suicide, self-harm and eating disorder content hosted by category 1 providers. There is a need for the Bill to do more here, particularly through strengthening the user empowerment duties in Clause 12 so that the safest option is the default. We have covered that ground. This amendment seeks to address the availability of this content on smaller services that will fall outside category 1, as the noble Baroness has said. The cut-off conditions under which services will be determined to fall within category 1 are still to be determined. We await further progress on that. However, there are medium-sized and small providers whose activities we need to look at. It is worth repeating—and I am aware that I am repeating—that these include suicide and eating disorder forums, whose main business is the sharing and discussion of methods and encouragement to engage in these practices. In other words, they are set up precisely to do that.

We know that there are smaller platforms where users share detailed information about methods of suicide. One of these in particular has been highlighted by families and coroners as playing a role in the suicides of individuals in the UK. Regulation 28 reports—that is, official requests for action—have been issued to DCMS and DHSC by coroners to prevent future comparable deaths.

A recent systematic review, looking at the impact of suicide and self-harm-related videos and photographs, showed that potentially harmful content concentrated specifically on sites with low levels of moderation. Much of the material which promotes and glorifies this behaviour is unlikely to be criminalised through the Government’s proposed new offence of encouragement to serious self-harm. For example, we would not expect all material which provides explicit instructional information on how to take one’s life using novel and effective methods to be covered by it.

The content has real-world implications. There is clear evidence that when a particular suicide method becomes better known, the effect is not simply that suicidal people switch from one intended method to the novel one, but that suicides occur in people who would not otherwise have taken their own lives. There are, therefore, important public health reasons to minimise the discussion of dangerous and effective suicide methods.

The Bill’s pre-legislative scrutiny committee recommended that the legislation

“adopt a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”.

This amendment is in line with that recommendation, seeking to extend category 1 regulation to services that carry a high level of risk.

The previous Secretary of State appeared to accept this argument—but we have had a lot of Secretaries of State since—and announced a deferred power that would have allowed for the most dangerous forums to be regulated; but the removal of the “legal but harmful” provisions from the legislation means that this power is no longer applicable, as its function related to the “adult risk assessment” duty, which is no longer in the Bill.

This amendment would not shut down dangerous services, but it would make them accountable to Ofcom. It would require them to warn their users of what they were about to see, and it would require them to give users control over the type of content that they see. That is, the Government’s proposed triple shield would apply to them. We would expect that this increased regulatory burden on small platforms would make them more challenging to operate and less appealing to potential users, and would diminish their size and reach over time.

This amendment is entirely in line with the Government’s own approach to dangerous content. It simply seeks to extend the regulatory position that they themselves have arrived at to the very places where much of the most dangerous content resides. Amendment 192A is supported by the Mental Health Foundation, the Samaritans and others that we have been able to consult. It is similar to Amendment 192, which we also support, but this one specifies that the harmful material that Ofcom must take account of relates to self-harm, suicide and eating disorders. I would now be more than happy to give way—eventually, when he chooses to do it—to my noble friend Lord Stevenson, who is not expected at this moment to use the true and full extent of his abilities at being cunning.

Baroness Bull (CB)

My Lords, I rise to offer support for all the amendments in this group, but I will speak principally to Amendment 192A, to which I have added my name and which the noble Lord, Lord Griffiths, has just explained so clearly. It is unfortunate that the noble Baroness, Lady Parminter, cannot be in her place today. She always adds value in any debate, but on this issue in particular I know she would have made a very compelling case for this amendment. I will speak principally about eating disorders, because the issues of self-harm have already been covered and the hour is already late.

The Bill as it stands presumes a direct relationship between the size of a platform and its potential to cause harm. This is simply not the case: a systematic review which we heard mentioned confirmed what all users of the internet already know—that potentially harmful content is often and easily found on smaller, niche sites that will fall outside the scope of category 1. These sites are absolutely not hard to find—they come up on the first page of a Google search—and some hide in plain sight, masquerading, particularly in the case of eating disorder forums, as sources of support, solace or factual information when in fact they encourage and assist people towards dangerous practices. Without this amendment, those sites will continue spreading their harm and eating disorders will continue to have the highest mortality rate of all mental illnesses in the UK.

Online Safety Bill

Lord Parkinson of Whitley Bay (Con)

My Lords, the amendments in this group relate to provisions for media literacy in the Bill and Ofcom’s existing duty on media literacy under Section 11 of the Communications Act 2003. I am grateful to noble Lords from across your Lordships’ House for the views they have shared on this matter, which have been invaluable in helping us draft the amendments.

Media literacy remains a key priority in our work to tackle online harms; it is essential not only to keep people safe online but for them to understand how to make informed decisions which enhance their experience of the internet. Extensive work is currently being undertaken in this area. Under Ofcom’s existing duty, the regulator has initiated pilot work to promote media literacy. It is also developing best practice principles for platform-based media literacy measures and has published guidance on how to evaluate media literacy programmes.

While we believe that the Communications Act provides Ofcom with sufficient powers to undertake an ambitious programme of media literacy activity, we have listened to the concerns raised by noble Lords and understand the desire to ensure that Ofcom is given media literacy objectives which are fit for the digital age. We have therefore tabled the following amendments seeking to update Ofcom’s statutory duty to promote media literacy, in so far as it relates to regulated services.

Amendment 274B provides new objectives for Ofcom to meet in discharging its duty. The first objective requires Ofcom to take steps to increase the public’s awareness and understanding of how they can keep themselves and others safe when using regulated services, including building the public’s understanding of the nature and impact of harmful content online, such as disinformation and misinformation. To meet that objective, Ofcom will need to carry out, commission or encourage the delivery of activities and initiatives which enhance users’ media literacy in these ways.

It is important to note that, when fulfilling this new objective, Ofcom will need to increase the public’s awareness of the ways in which they can protect groups that disproportionately face harm online, such as women and girls. The updated duty will also compel Ofcom to encourage the development and use of technologies and systems that support users of regulated services to protect themselves and others. Ofcom will be required to publish a statement recommending ways in which others, including platforms, can take action to support their users’ media literacy.

Amendment 274C places a new requirement on Ofcom to publish a strategy setting out how it will fulfil its media literacy functions under Section 11, including the new objectives. Ofcom will be required to update this strategy every three years and report on progress made against it annually to provide assurance that it is fulfilling its duty appropriately. These reports will be supported by the post-implementation review of the Bill, which covers Ofcom’s media literacy duty in so far as it relates to regulated services. This will provide a reasonable point at which to establish the impact of Ofcom’s work, having given it time to take effect.

I am confident that, through this updated duty, Ofcom will be empowered to ensure that internet users become more engaged with media literacy and, as a result, are safer online. I hope that these amendments will find support from across your Lordships’ House, and I beg to move.

Baroness Bull (CB)

My Lords, I welcome this proposed new clause on media literacy and support the amendments in the names of the noble Lords, Lord Clement-Jones and Lord Knight of Weymouth. I will briefly press the Minister on two points. First, proposed new subsection (1C) sets out how Ofcom must perform its duty under proposed new subsection (1A), but it does not explicitly require Ofcom to work in partnership with existing bodies already engaged in and expert in provision of these kinds of activities. The potential for Ofcom to commission is explicit, but this implies quite a top-down relationship, not a collaboration that builds on best practice, enables scale-up where appropriate and generally avoids reinventing wheels. It seems like a wasted opportunity to fast-track delivery of effective programmes through partnership.

My second concern is that there is no explicit requirement to consider the distinct needs of specific user communities. In particular, I share the concerns of disability campaigners and charities that media literacy activities and initiatives need to take into account the needs of people with learning disabilities, autism and mental capacity issues, both in how activities are shaped and in how they are communicated. This is a group of people who have a great need to go online and engage, but we also know that they are at greater risk online. Thinking about how media literacy can be promoted, particularly among learning disability communities, is really important.

The Minister might respond by saying that Ofcom is already covered by the public sector equality duty and so is already obliged to consider the needs of people with protected characteristics when designing and implementing policies. But the unfortunate truth is that the concerns of the learning disability community are an afterthought in legislation even compared with other disabilities, which are themselves already an afterthought. The Petitions Committee in the other place, in its report on online abuse and the experience of disabled people, noted that there are many disabled people around the country with the skills and experience to advise government and its bodies but that there is a general unwillingness to engage directly with them. They are often described as hard to reach, which is somewhat ironic, because most of these people use multiple services and so are in fact very easy to reach: they are on many databases and in contact with government bodies all the time.

The Minister may also point out that Ofcom’s duties in the Communications Act require it to maintain an advisory committee on elderly and disabled persons that includes

“persons who are familiar with the needs of persons with disabilities”.

But referring to an advisory committee is not the same as consulting people with disabilities, both physical and mental, and it is especially important to consult directly with people who may have difficulty understanding what is being proposed. Talking to people directly, rather than through an advisory committee, is very much the goal.

Unlike the draft Bill, which had media literacy as a stand-alone clause, the intention in this iteration is to deal with the issue by amending the Communications Act. It may be that in the web of interactions between those two pieces of legislation, my concerns can be set to rest. But I would find it very helpful if the Minister could confirm today that the intention is that media literacy programmes will be developed in partnership with—and build on best practice of—those organisations already delivering in this space and that the organisations Ofcom collaborates with will be fully inclusive of all communities, including those with disabilities and learning disabilities. Only in this way can we be confident that media literacy programmes will meet their needs effectively, both in content and in how they are communicated.

Finally, can the Minister confirm whether Ofcom considers people with lived experience of disability as subject matter experts on disability for the purpose of fulfilling its consultation duties? I asked this question during one of the helpful briefing sessions during the Bill’s progress earlier this year, but I did not get an adequate answer. Can the Minister clarify that for the House today?