Report (2nd Day)
15:56
Relevant documents: 28th and 38th Reports from the Delegated Powers Committee, 15th Report from the Constitution Committee. Scottish and Welsh Legislative Consent granted.
Clause 10: Children’s risk assessment duties
Debate on Amendment 34 resumed.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

We began this group on the previous day on Report, and I concluded my remarks, so it is now for other noble Lords to contribute on the amendments that I spoke to on Thursday.

Lord Bethell (Con)

My Lords, I rise emphatically to welcome the government amendments in this group. They are a thoughtful and fulsome answer to the serious concerns expressed from the four corners of the Chamber by a great many noble Lords at Second Reading and in Committee about the treatment of age verification for pornography and online harms. For this, I express my profound thanks to my noble friend the Minister, the Secretary of State, the Bill team, the Ofcom officials and all those who have worked so hard to refine this important Bill. This is a moment when the legislative team has clearly listened and done everything it possibly can to close the gap. It is very much the House of Lords at its best.

It is worth mentioning the exceptionally broad alliance of noble Lords who have worked so hard on this issue, particularly my compadres, my noble friend Lady Harding, the noble Baroness, Lady Kidron, and the right reverend Prelate the Bishop of Oxford, who all signed many of the draft amendments. There are the Front-Benchers, including the noble Lords, Lord Stevenson, Lord Knight, Lord Clement-Jones and Lord Allan of Hallam, and the noble Baroness, Lady Merron. There are the Back-Benchers behind me, including my noble friends Lady Jenkin and Lord Farmer, the noble Lords, Lord Morrow, Lord Browne and Lord Dodds, and the noble Baroness, Lady Foster. Of those in front of me, there are the noble Baronesses, Lady Benjamin and Lady Ritchie, and there are many others, too numerous for me to mention, from all across the House.

I very much welcome the sense of pragmatism and proportionality at the heart of the Online Safety Bill. I welcome the central use of risk assessment as a vital tool for policy implementation and the recognition that some harms are worse than others, that some children need more protection than others, that we are legislating for future technologies that we do not know much about and that we must engage industry to achieve effective implementation. As a veteran of the Communications Act 2003, I strongly support the need for enabling legislation that has agility and a broad amount of support to stand the test of time.

16:00
However, there is also a time when it is essential that we are absolute about things and that we say that there are some kinds of content and some functions that children should never encounter anywhere on the internet. If we want to be taken seriously as a Parliament, and if we want to bring about meaningful behavioural change, we need to tether the regulation of the internet to at least some certainties, and that is why this package of government amendments is so very welcome. We need to make it clear to anyone doing business on the internet that, sometimes, there are no loopholes, no mitigations, no legal cop-outs, no consultations and no sliding scales. Sometimes Parliament makes choices and decides that some things are just plain wrong and beyond the pale.
It was the moral relativism and misguided sense of proportionality and consultative handling, when it came to the age verification measures for damaging and violent pornography, that so alarmed so many people about the draft Bill. It is very much the clarity of the newly introduced government amendments that makes them so powerful. They make it crystal clear that no child should ever see any pornography anywhere on the internet. That is a massive relief to those who have campaigned so hard and for so long for age verification for pornography and priority harms. On this, I am particularly thankful to the noble Baroness, Lady Benjamin, who I know is seething with frustration that she is not speaking, and to my noble friend Lord Farmer. By introducing clear, concrete and definitive measures, this government package provides a tether that anchors the Bill to some certainty.
The government amendments are an essential step to ending the corrosive sense of exceptionalism that has hung around the regulation of online spaces for too long. We will no longer consult with industry about what it might or might not be expected to do to protect children, as was first intended. Instead, we are defining a high bar and applying it to all pornographic content and priority harms, wherever they are on the internet, the metaverse or any future technology. We are legislating for all online businesses, wherever they are, and whether they are big, small, mobile, meta or whatever.
Thank goodness that our Government have recognised that we have reached an inflection point in the history of the online world, where access to internet content and functions is in the pockets and bedrooms of our children. We should no longer victim-blame our children by calling for more education; we cannot scapegoat parents by placing implausible expectations on how families can police or manage their children’s time on devices. Instead, with these amendments, the Government have recognised that we need internet companies to take responsibility for what is on their platforms; there will be no more dodging or obfuscation. If you have horrible, violent porn content on your service, you need a system to keep children away—full stop; no haggling.
This is a profound challenge to Twitter, Instagram, Snapchat, TikTok, WhatsApp, Reddit, Facebook and all the services that the Children’s Commissioner rightly identified as gateways to pornography: businesses that have, for too long, happily recruited kids to their algorithms with porn and have taken advertisers’ money for clicks from kids watching porn, regardless of the consequences to society. This is also a challenge to Pornhub and all the other professional pornographers that have benefited from the constructive ambiguity of the last 30 years and will now be called to account.
This is a huge victory. My noble friend Lord Grade thoughtfully and movingly told us that he would judge Ofcom’s mission to be a success if platforms finally took responsibility for what was on their services, and I take that very seriously. The government amendments in this group provide a powerful illustration of that principle and a tool for bringing it about. Effective enforcement by Ofcom is essential for giving tech bosses, who are too often happy to pay the fines and move on, a certain clarity of mind. I welcome the government amendments on senior management liability, which introduce the threat of prison to those who egregiously breach Ofcom’s standards. However, as the House knows, several enforcement measures are yet to reach us, and I flag in advance to my noble friend the Minister that these are considered critical to the success of the new regime by several noble Lords.
The government amendments are a massive step towards applying the common-sense principle that the rules about what is illegal and inappropriate for children in the real world should be applied equally to the online world, with equal vigour and equal scope. For that reason, I very much welcome the announcement of a porn review to investigate gaps in UK regulation that allow exploitation or abuse to occur online, in clear breach of long-standing criminal and civil laws, and to identify barriers to enforcing criminal law.
This is a knotty issue that involves several cross-departmental dependencies, including the allocation of resources by the Home Office, the writing of proper guidelines by the Ministry of Justice, the prioritisation of prosecutions by the CPS, and the building of a relevant skills base in our police forces. I therefore ask the Minister for guidance on the timetable, the terms of reference and the appointment of a chair for this review. I would also ask that a wide range of voices be heard and prioritised in this review.
The Government deserve considerable praise for their bold steps: not just to protect children from the harms of pornography, and to put Britain at the forefront of the global response to online safety, but also to nurture a benign environment for our critically important tech sector. These government amendments will create legislative certainty: an essential foundation for innovation. They will help rehabilitate the reputation of a tech sector reeling from the excesses of bad actors and the misplaced moral relativism of this young, exciting and vibrant industry. That will have benign consequences for investment and recruitment.
With a final word of optimism, I ask my noble friend the Minister what work will be done to bring alignment with other jurisdictions and to promote Britain as a well-regulated destination for investment, much as we do for life sciences.
Baroness Ritchie of Downpatrick (Lab)

My Lords, I reiterate what the noble Lord, Lord Bethell, has said and thank him for our various discussions between Committee and Report, particularly on this set of amendments to do with age verification. I also welcome the Government’s responsiveness to the concerns raised in Committee. I welcome these amendments, which are a step forward.

In Committee, I was arguing that there should be a level playing field for regulating any online platform with pornographic content, whether it falls under Part 3 or Part 5 of the Bill. I welcome the Government’s significant changes to Clauses 11 and 72 to ensure that robust age verification or estimation must be used and that standards are consistent across the Bill.

I have a few minor concerns that I wish to highlight. I am thoughtful about whether enough is required of search services in preventing young people from accessing pornography in Clause 25. I recognise that the Government believe they have satisfied the need. I fear that they may have done enough only in the short term: there is a real concern that this clause is not sufficiently future-proofed. Of course, only time will tell. Maybe the Minister could advise us further in that particular regard.

In Committee, I also argued that the duties in respect of pornography in Parts 3 and 5 must come into effect at the same time. I welcome the government commitment to placing a timeframe for the codes of practice and guidance on the face of the Bill through amendments including Amendment 230. I hope that the Minister will reassure us today that it is the Government’s intention that the duties in Clauses 11 and 72 will come into effect at the same time. Subsection (3) of the new clause proposed in Amendment 271 specifically states that the duties could come into effect at different times, which leaves a loophole for pornography to be regulated differently, even if only for a short time, between Part 3 and Part 5 services. This would be extremely regrettable.

I would also like to reiterate what I said last Thursday, in case the Minister missed my contribution when he intervened on me. I say once again that I commend the Minister for the announcement of the review of the regulation, legislation and enforcement of pornography offences, which I think was this time last week. I once again ask the Minister: will he set out a timetable for publishing the terms of reference and details of how this review will take place? If he cannot set out that timetable today, will he write to your Lordships setting out the timetable before the Recess, and ensure a copy is placed in the Library?

Finally, all of us across the House have benefited from the expertise of expert organisations as we have considered this Bill. I repeat my request to the Minister that he consider establishing an external reference group to support the review, consisting of those NGOs with particular and dedicated expertise. Such groups would have much to add to the process: they bring learning and advice, and could be of real assistance to the Government in that regard.

Once again, I thank the Minister for listening and responding. I look forward to seeing the protections for children set out in these amendments implemented. I shall watch implementation very closely, and I trust and hope that the regulator will take robust action once the codes of practice and guidance are published. Children above all will benefit from a safer internet.

Baroness Kidron (CB)

My Lords, I welcome the government amendments in this group, which set out the important role that age assurance will play in the online safety regime. I particularly welcome Amendment 210, which states that companies must employ systems that are “highly effective” at correctly determining whether a particular user is a child to prevent access to pornography, and Amendment 124, which sets out in a code of practice the principles which must be followed when implementing age assurance—principles that ensure alignment of standards and protections with the ICO’s age appropriate design code and include, among other things, that age assurance systems should be easy to use, proportionate to the risk and easy to understand, including to those with protected characteristics, as well as aiming to be interoperable. The code is a first step from current practice, in which age verification is opaque, used to further profile children and related adults, and highly ineffective, to a world in which children are offered age-appropriate services by design and default.

I pay tribute again to the noble Lord, Lord Bethell, and the noble Baroness, Lady Benjamin, and I associate myself with the broad set of thanks that the noble Lord, Lord Bethell, gave in his opening speech. I also thank colleagues across your Lordships’ House and the other place for supporting this cause with such clarity of purpose. On this matter, I believe that the UK is world-beating, and it will be a testament to all those involved to see the UK’s age verification and estimation laws built on a foundation of transparency and trust so that those impacted feel confident in using them—and we ensure their role in delivering the online world that children and young people deserve.

I have a number of specific questions about government Amendment 38 and Amendment 39. I would be grateful if the Minister were able to answer them from the Dispatch Box and in doing so give a clear sign of the Government’s intent. I will also speak briefly to Amendments 125 and 217 in my name and those of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford, as well as Amendment 184 in the names of the noble Baroness, Lady Fox, and the noble Lord, Lord Moylan. All three amendments address privacy.

Government Amendment 38, to which I have added my name, offers exemptions in new subsections (3A) and (3B) that mean that a regulated company need not use age verification or estimation to prevent access to primary priority content if it already prevents such access by means of its terms of service. First, I ask the Minister to confirm that these exemptions apply only if a service effectively upholds its terms of service on a routine basis, and that failure to do so would trigger enforcement action and/or an instruction from Ofcom to apply age assurance.

16:15
Secondly, will the Minister further explain how these exemptions impact on the child safety duties for priority content, which is not prohibited but must be age appropriate, or how they might account for aspects of the service that create harm but are not associated directly with the content? In trying to work out the logic, it appeared to me that either a company would have to identify that the end user was a child by means of age assurance, though perhaps not to the highest bar, or it would have to design a service that was not harmful to children even if that harm was not primary priority content. It would be good to hear what the intention is, to make sure there is not—inadvertently, I am sure—a loophole by which companies can fail in their duties by ignoring children on their service because they do not allow primary priority content.
Amendments 49 and 93 would ensure that
“a provider can only conclude that children cannot access a service if age verification or age estimation is used on the service with the result that children are not normally able to access it”.
Somewhat related to my previous question, can the noble Lord confirm whether a service—for example, a financial service provider that offers adult-facing products, or a gambling site that requires adults to identify themselves—will be taken as having age-verified its users, so that further age verification is no longer necessary?
I welcome the 18-month deadline for Ofcom to produce guidance on child access assessments, but can the Minister confirm that the timeframe is a backstop and that the Government’s ambition is to get age assurance in place much more quickly? Also, when he responds, will he confirm that Schedule 4 will be published as part of the child safety code, which is why it is not mentioned in Amendment 271? While I enthusiastically welcome the code of practice, it is simply a fact that many adults and children have concerns about privacy and security.
Amendments 125 and 217 address a gap in the Bill by making it clear
“that data collected for age assurance must be stored securely, deleted as soon as possible and not used for other purposes”.
Similarly, Amendment 184, in the name of the noble Lord, Lord Moylan, addresses the privacy issue in a detailed way that may be better suited to Ofcom’s fully fleshed out code of conduct but none the less speaks to the same gap. I do not expect the Minister to accept these amendments as written, and I understand that there is an overarching requirement for privacy in the Bill, but in the public discourse about the online world, safety is always put in binary opposition to privacy. If the Government were to acknowledge in the Bill the seriousness of the need for privacy and security of information relating to age verification and estimation, it would send a clear message that they have understood the validity of the privacy concerns and be an enormous contribution to ending the unhelpful binary. I hope that on this matter the Minister will agree to take these amendments away and include wording to the same effect.
Finally, last week, at the invitation of the right reverend Prelate the Bishop of Gloucester, the Minister and I attended an event at which we were addressed by children about the pressures they felt from social media. I thank all the young people present for the powerful and eloquent way in which they expressed the need for politicians and religious, civic and business leaders to do more to detoxify the digital world. If they are listening, as they said they would, I want to assure them that all of us in this Chamber hear their concerns. Importantly, when I asked Oliver, aged 12, and Arthur, aged 13, what one thing we could and should do to make their online world better, they said, “Make age checking meaningful”. Today, we are doing just that.
Lord Allan of Hallam (LD)

My Lords, I shall follow on directly from some of the comments of the noble Baroness, Lady Kidron, around privacy. I shall not repeat the arguments around children and pornography but touch on something else, which is the impact of these amendments on the vast majority of internet users, the 85%-plus who are 18 or older. Of course, when we introduce age assurance measures, they will affect everyone: we should not kid ourselves that it is only about children, because everyone will have to pass through these gateways.

I shall speak particularly to Amendments 184 and 217 on privacy. I am sure that most adults will support extra safety measures for children, but they also want to be able to access a wide range of online services with the least possible friction and the lowest risk to their own personal data. We can explore how this might work in practice by looking at something that we might all do in this Chamber. Looking round, I believe that we are all at least 18 years old, and we might find ourselves idly passing the time creating an account on a new user-to-user or search service that has been recommended. We should consider this group of amendments in terms of how that might play out. In future, the services will have to check that we are in the United Kingdom—there is a range of ways in which they can do that. Having confirmed that, they will need to understand whether we are 18-plus or a child user so that they can tailor their service appropriately.

I hope we all agree that the services should not be asking us for passports or driving licences, for example, as that would be entirely contrary to the thrust of privacy regulations and would be a huge gateway to fraud and other problems. The most efficient way would be for them to ask us for some sort of digital certificate—a certificate that we have on our devices where we have proven to a trusted third party that we are 18-plus. The certificate does not need to contain any personal data but simply confirms that we are of age. That is very similar to the way in which secure websites work: they send a digital certificate to your browser, you verify that certificate with a trusted third party—a certificate authority—and then you can open an encrypted connection. Here we are reversing the flow: the service will ask the user for a certificate and then verify that before granting access. A user may in future have a setting on their device where they confirm either that they are happy for their 18-plus certificate to be given to anybody or that they would like to be asked every time; there will be a new set of privacy controls.
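
To make the shape of that exchange concrete, what follows is a minimal illustrative sketch of the flow described above, with hypothetical names throughout: a trusted issuer signs a bare “18-plus” claim that is held on the user’s device and contains no personal data, and a service verifies that claim before granting access. It is a sketch of the general pattern only, not of any scheme proposed in the Bill or by Ofcom; a real deployment would use asymmetric signatures, so that services hold only the issuer’s public key, rather than the shared-secret HMAC used here to keep the example self-contained.

```python
# Hypothetical sketch of a device-held "18-plus" certificate, as described
# above. A trusted issuer signs a claim containing no personal data; a
# service later verifies that claim before granting access, reversing the
# usual TLS flow (the user presents a certificate, the service verifies it).
# A real scheme would use asymmetric signatures (e.g. Ed25519); the
# shared-secret HMAC below is purely to keep the sketch runnable as-is.

import hashlib
import hmac
import json

ISSUER_KEY = b"demo-issuer-signing-key"  # stands in for the issuer's key


def issue_age_certificate() -> dict:
    """The trusted third party attests '18-plus' after its own identity
    check; the certificate itself carries no personal data."""
    claim = json.dumps({"claim": "age_over_18", "issuer": "ExampleIssuer"})
    signature = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return {"claim": claim, "signature": signature}


def service_accepts(certificate: dict) -> bool:
    """A user-to-user or search service checks the signature against the
    trusted issuer before treating the holder as an adult user."""
    expected = hmac.new(
        ISSUER_KEY, certificate["claim"].encode(), hashlib.sha256
    ).hexdigest()
    return hmac.compare_digest(expected, certificate["signature"]) and \
        json.loads(certificate["claim"])["claim"] == "age_over_18"


certificate = issue_age_certificate()  # stored once on the user's device
print(service_accepts(certificate))    # True: access granted, no identity shared
```

The point of the design survives even in this toy form: the service learns only that the holder is over 18, never who they are.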

Building the infrastructure for this is non-trivial. Many things could go wrong, but at least the kind of model I am describing has some hope of achieving widespread adoption. It is very good for adult users, as they can continue to have a frictionless experience as long as they are happy for their device to send a certificate to new services. It is good for the market of internet services if new services can bring users on easily. It is good for privacy, by avoiding lots of services each collecting personal data, as most people access a multiplicity of services. Perhaps most importantly in terms of the Bill’s objectives, it is good for children if services can separate out the vast majority of their users who are 18-plus and then focus their real efforts on tailoring the services for the minority of users who will be children, for whom the Bill will introduce a whole set of new obligations.

We should not underestimate the scale of the challenge in practice; it will work only if major internet companies are willing to play the game and get into the market of offering 18-plus certificates. Companies such as Google, Meta, Amazon, Apple and Microsoft—the ones we normally love to hate—will potentially provide the solution, as well as not-for-profits. There will be foundations for those who object to the big internet companies, but it is those big internet companies which will have the reach; they each have millions of users in the United Kingdom. This is not to fly the flag for those companies; it is simply a question of efficiency. I suspect that everyone in the Chamber uses a combination of services from those big providers. We already share with them the personal data necessary for age assurance, and there would be no additional sharing of data. If they were willing to provide a certificate, they could do so at the kind of scale necessary for the 50 million or so adult internet users in the United Kingdom to be able to get one easily and then pass it to services when they choose to access them.

There may be some discomfort with big tech playing this role, but I cannot see the kind of aggressive targets that we are setting in the amendments working unless we take advantage of those existing platforms and use them to make this work. Amendment 230 tells us that we have about 18 months, which is very soon in terms of trying to build something. We should be clear that if we are to deliver this package it will depend on persuading some of those big names in tech to create age certification schemes for UK users.

For this to have widespread adoption and a competitive market, we need it to be free of direct financial costs to individual users and to services choosing to age-verify, as we have asked them to do. We need to think very carefully about that, as it raises a whole series of competition questions that I am sure Ofcom and the Competition and Markets Authority will have to address, not least because we will be asking companies to provide age certification free of charge that will be used by their existing and future competitors to meet their compliance requirements.

There may be some listening who think that we can rely on small age-assurance start-ups. Some of them have a really important role to play and we should be proud of our homegrown industry, but we should be realistic that they will reach scale only if they work with and through the large service providers. Many of them are already seeking those kinds of relationship.

As a test case, we might think of an application such as Signal, a messaging app that prides itself on being privacy-first. It does not want to collect any additional information from its users, which is perfectly reasonable, given where it is coming from. It will be really interesting to see how comfortable such a service will be with working with certification schemes, under which it can prove that users are over 18 by taking advantage of the data held by other services which collect significant amounts of data and have a very good idea of how old we are.

I have not focused on under-18s but, once this system is in place, application providers will be thinking very carefully about the pros and cons of allowing under-18s on at all. I know that the noble Baroness, Lady Kidron, is also concerned about this. There will be services that will think very carefully, if they find that the vast majority of their users are 18-plus, about the extent to which they want to put time and effort into tailoring them for users under 18. We do not intend that outcome from the Bill, but we need realistically to consider it.

Baroness Kidron (CB)

Just to be clear, I say that the purpose of my question to the Minister was to get at the fact that, for low-risk situations, there can be age assurance that is a lot less effective or intrusive, for that very reason.

Lord Allan of Hallam (LD)

I agree; that is very helpful. I think Amendments 74, 93 and 99 also talk about the exclusion, as the noble Baroness raised, of services from the child safety duties if they can show that they are only 18-plus. It will be quite material and critical to know at what level they can demonstrate that.

I have avoided talking about pornography services directly, but there are interesting questions around what will happen if this model develops, as it likely will. If big tech is now starting to provide age certification for the kinds of mainstream services we may all want to access, they may be much less comfortable providing that same certification to pornography providers, for reputational reasons. A mainstream provider would not want to enter that market. Ofcom will need to take a view on this. We have talked about interoperability in the framework we have created, but it is a big question for Ofcom whether it wants to steer all age certification providers also to provide 18-plus certification for pornography providers or, effectively, to allow two markets to develop—one for mainstream certification and one for certification for pornography.

I have taken a few minutes because this is a very high-risk area for the Bill. There are material risks in willing into existence a model that depends on technical infrastructure that has not yet been built. The noble Lord, Lord Bethell, referred to prior experience; one of the reasons why we have not delivered age assurance before is that the infrastructure was not there. We now want it built, so must recognise that it is quite a high-risk endeavour. That does not mean it is not worth attempting, but we must recognise the risks and work on them.

If the implementation is poor, it will frustrate adult users, which may bring the Bill into disrepute. We need to recognise that as a genuine risk. There are people out there already saying that the Bill means that every internet service in the world will ask you for your passport. If that is not the case, we need to stress that we do not expect that to happen. There are also potentially significant impacts on the market for online services available to both adults and children in the UK, depending on the design of this system.

The purpose of thinking about some of these risks today is not to create a doom-laden scenario and say that it will not work. It is entirely the opposite—to say that, if we are to move ahead into a world in which children are protected from harmful content, for which very good reasons have been articulated and a huge amount of work has gone ahead, and in which services can tailor and gear access to the age of the child, we have to be able to take the 18-plus out of that, put it into a separate box and do so in a really easy, straightforward manner. If not, the 18-plus will end up dragging down what we want to do for the underage.

I hope that explanation helps in the context of these amendments. We will need to test them against it as implementation happens over the next few months.

16:30
Lord Farmer (Con)

My Lords, I thank the Minister for engaging with the amendment in my name and that of the noble Baroness, Lady Benjamin, in Committee, to ensure parity between the regulation of online and offline pornography. We did not table it for Report because of the welcome news of the Government’s review. At this point, I would like to give my backing to all that my noble friend Lord Bethell said and would like to thank him for his great encouragement and enthusiasm on our long journey, as well as the noble Baroness, Lady Kidron. I would particularly like to mention the noble Baroness, Lady Benjamin, who, as my noble friend Lord Bethell mentioned, must be very frustrated today at not being able to stand up and benefit us with her passion on this subject, which has kept a lot of us going.

I have some questions and comments about the review, but first I want to stand back and state why this review is so necessary. Our society must ask how pornography was able to proliferate so freely, despite all the warnings of the danger and consequences of this happening when the internet was in its infancy. Human appetites, the profit motive and the ideology of cyberlibertarianism flourished freely in a zeitgeist where notions of right and wrong had become deeply unfashionable. Pre-internet, pornography was mainly on top shelves, in poky and rather sordid sex shops, or in specialist cinemas. There was recognition that exposure to intimate sex acts should never be accidental but always the result of very deliberate decisions made by adults—hence the travesty of leaving children exposed to the danger of stumbling across graphic, violent and frequently misogynistic pornography by not bringing Part 3 of the Digital Economy Act 2017 into force.

I have talked previously in this House about sociology professor Christie Davies’ demoralisation of society thesis: what happens when religiously reinforced moralism, with its totemic notion of free will, is ditched along with God. Notions of right and wrong become subjective, individually determined, and a kind of blindness sets in; how else can we explain why legislators ignored the all-too-predictable effects of unrestrained access to pornography on societal well-being, including but not limited to harms to children? For this Bill to be an inflection point in history, this review, birthed out of it, must unashamedly call out the immorality of what has gone before. How should we define morality? Well, society simply does not work if it is governed by self-gratification and expressive individualism. Relationships—the soil of society—including intimate sexual relationships, are only healthy if they are self-giving, rather than self-gratifying. These values did not emerge from the Enlightenment but from the much deeper seam of our Judeo-Christian foundations. Pornography is antithetical to these values.

I turn to the review’s terms of reference. Can the Minister confirm that the lack of parity between online and offline regulation will be included in the legal gaps it will address? Can he also confirm that the review will address gaps in evidence? As I said in Committee, a deep seam of academic research already exists on the harmful effects of the ubiquity of pornography. The associations with greater mental ill health, especially among teenagers, are completely unsurprising; developing brains are being saturated with dark depictions of child sexual abuse, incest, trafficking, torture, rape, violence and coercion. As I mentioned earlier, research shows that adults whose sexual arousal is utterly dependent on pornography can be catastrophically impaired in their ability to form relationships with flesh-and-blood human beings, let alone engage in intimate physical sex.

Will the review also plug gaps in areas that remain under-researched and controversial and where vested interests abound? On that point, whoever chairs this review will have to be ready, willing and able to take on powerful, ideologically motivated and profit-driven lobbies.

Inter alia, we need to establish through research the extent to which some young women are driven to change their gender because of hyper-sexualised, porn-depicted female stereotypes. Anecdotally, some individuals have described their complete inability to relate to their natal sex. It can be dangerous and distasteful to be a woman in a world of pornified relationships which expects them to embrace strangulation, degradation and sexual violence. One girl who transitioned described finding such porn as a child: “I am ashamed that I was fascinated by it and would seek it out. Despite this interest in watching it, I hated the idea of myself actually being in the position of the women. For a while, I even thought I was asexual. Sex is still scary to me, complicated”.

Finally, the Government’s announcement mentioned several government departments but did not make it clear that the review will also draw in the work of the DfE and the DHSC—the departments responsible for children’s and adult mental health—for reasons I have already touched on. Can the Minister confirm that the remit will include whatever areas of government responsibility are needed, so that the review is genuinely broad enough to look across society at how to protect not just children but adults?

Baroness Fox of Buckley (Non-Afl)

My Lords, I rise to speak to Amendment 184 in my name—

Lord Harlech (Con)

My Lords, the guidance in the Companion states that Peers who were not present for the opening of this debate last week should not speak in the debate today, so I will have to ask the noble Baroness to reserve her remarks on this occasion.

Lord Clement-Jones (LD)

My Lords, that neatly brings me to the beginning of my own speech. I have expressed to the Chief Whip and the Minister my great regret that my noble friend Lady Benjamin is not able to take part in today’s debate because of the rather arbitrary way the group was started at the very end of proceedings on Thursday. The Minister is very much aware of that; it is a very sad thing.

I pay huge tribute to my noble friend, as the noble Lords, Lord Bethell and Lord Farmer, have. She is sitting behind me, yet she cannot make her contribution after a decade of campaigning so passionately on these issues. That includes pushing for age verification for pornographic content. We stood shoulder to shoulder on Part 3 of the Digital Economy Act, and she has carried that passion through into the debates on this Bill.

My noble friend believes that the Minister’s amendments in particular are a huge step forward. She describes this as a landmark moment from her point of view. She wants me to thank Barnardo’s, CARE and CEASE for their support and for bringing evidence and research to us on pornography. She would like to thank the Secretary of State and the Minister in particular for taking us to this point.

My noble friend also welcomes the review that was announced last week but, like the noble Lords, Lord Bethell and Lord Farmer, she has some questions that have to be asked. This review is a good opportunity to examine the gaps in regulation, but it is proposed that the review will take a year. Is that the proposal, and is it a firm year? What happens thereafter? Is there a commitment by the Government to legislate on this, if they are still the Government in a year’s time? What are their intentions, and what is the road map to legislation? For instance, the gambling review started four years ago and we have not seen real change yet, so I think it is important to have some assurance in that respect.

Who will be involved in the review? Will the third sector and charity organisations working in this space be involved? The noble Lord, Lord Farmer, asked about scientific and medical research, which are all important aspects. I know that my noble friend would want to pay her own tribute to the noble Lords, Lord Farmer and Lord Bethell, to others involved in this exercise—“exercise” should be what it is called as it certainly feels like exercise—and in particular to the noble Baroness, Lady Kidron. I hope that the Minister will give my noble friend those assurances, despite the fact that she is not able to take part in this debate today.

From my point of view, I welcome the Government’s decision to strengthen the Bill’s age-verification requirements for online pornography, especially in respect of the principles for age assurance. But—and there always is a “but”—we absolutely need that age assurance to be privacy protecting. Amendment 125 is crucial and I am disappointed that it has not been included so far.

My noble friend Lord Allan referred to one of the major objections. We had a huge argument and debate about the efficacy of age verification when we discussed Part 3. There were great fears that age verification was going to be privacy invading and there was not a great deal of certainty about the kind of technology that was available for this kind of privacy-protecting age verification. I personally prefer and wanted to see third-party age verification; at the time, I thought it far better and safer to have third parties, such as Yoti, being responsible for our certification rather than the big tech companies, for all kinds of reasons and not just competitive ones. If we do not have some privacy-protecting language, we will be back in that situation of suspicion if we are not very careful.

Like my noble friend, I welcome the announcement of a review on the issue. There is a huge gap currently, and I give credit to the Secretary of State for understanding that that gap between the treatment of online pornography and offline pornography is very large indeed, as the BBFC can say from its experience. There is a wealth of evidence showing the link between violent pornography and real-life violence against women and girls. That is one of the reasons that I am so pleased that this review is taking place.

I mentioned the BBFC and have mentioned it before. It was going to be the regulator under Part 3 of the Digital Economy Act. I very much hope that the Government will consult the BBFC, as it has a great deal of experience in offline certification, so I hope it will be heavily involved in a review of this kind.

I listened to my noble friend very intently, and I think he made many points that resonate about the practical way in which we will need to age-verify, to make it simple for the public who are 18 and over. I much prefer the idea of third-party age verification to putting myself in the hands of big tech. I hope that Ofcom and the Government will do everything they can to make sure that those kinds of services are readily available and are not just controlled by the big tech companies in an anti-competitive way.

16:45
I think we have done a great service here in this group of amendments. The noble Lord, Lord Bethell, and other noble Lords have achieved a level playing field between Part 3 and Part 5 and, in doing so, have introduced a much more robust and safer form of age verification and age assurance which, nevertheless, as the noble Baroness, Lady Kidron, pointed out, is proportionate in the circumstances. So I pay tribute to all those involved, including the Minister for his flexibility and the Secretary of State likewise, but we must have that privacy-protecting aspect to it.
Lord Stevenson of Balmacara (Lab)

My Lords, this has been a good debate, perhaps unfairly curtailed in terms of the range of voices we have heard, but I am sure the points we wanted to have on the table are there and we can use them in summarising the debate we have had so far.

I welcome the Government’s amendments in this group. They have gone a long way to resolving a number of the difficulties that were left after the Digital Economy Act. As the noble Lord, Lord Clement-Jones, has said, we now have Part 3 and Part 5 hooked together in a consistent and effective way and definitions of “age verification” and “age estimation”. The noble Lord, Lord Grade, is sadly not in his place today—I normally judge the quality of the debate by the angle at which he resides in that top corner there. He is not here to judge it, but I am sure he would be upright and very excited by what we have been hearing so far. His point about the need for companies to be clearly responsible for what they serve up through their services is really important in what we are saying here today.

However, despite the welcome links across to the ICO age-appropriate design code, with the concerns we have been expressing on privacy there are still a number of questions which I think the Minister will want to deal with, either today or in writing. Several noble Lords have raised the question of what “proportionate” means in this area. I have mentioned it in other speeches in other groups. We all want the overall system to be proportionate in the way in which it allocates the powers, duties and responsibilities on the companies providing us with the services they do. But there is an exception for the question of whether children should have access to material which they should not get because of legal constraints, and I hope that “proportionate” is not being used in any sense to evade that.

I say that particularly because the concern has been raised in other debates—and I would be grateful if the Minister could make sure when he comes to respond that this issue is addressed—that smaller companies with less robust track records in terms of their income and expenditures might be able to plead that some of the responsibilities outlined in this section of the Bill do not apply to them because otherwise it would bear on their ability to continue. That would be a complete travesty of where we are trying to get to here, which is an absolute bar on children having access to material that is illegal or in the lists now in the Bill in terms of priority content.

The second worry that people have raised is: will the system that is set up here actually work in practice, particularly if it does not apply to all companies? That relates perhaps to the other half of the coin that I have just mentioned.

The third point, raised by a number of Peers, is: where does all this sit in relation to the review of pornography which was announced recently? A number of questions have been asked about issues which the Minister may be unable to respond to, but I suspect he may also want to write to us on the wider issue of timing and the terms of reference once they are settled.

I think we need to know this as we reach the end of the progress on this Bill, because you cannot expect a system being set up with the powers that are being given to Ofcom to work happily and well if Ofcom knows it is being reviewed at the same time. I hope that some consideration will be given to how we get the system up and running, even if the timescale is now tighter than it was, if at the same time a review rightly positioned to try to look at the wider range of pornography is going to impact on its work.

I want to end on the question raised by a large number of noble Lords: how does all this work sit with privacy? Where information and data are being shared on the basis of assuring access to services, there will be a worry if privacy is not ensured. The amendments tabled by the noble Baroness, Lady Kidron, are very salient to this. I look forward to the Minister’s response to them.

Lord Parkinson of Whitley Bay (Con)

My Lords, I am sorry that the noble Baroness, Lady Benjamin, was unable to be here for the start of the debate on Thursday and therefore that we have not had the benefit of hearing from her today. I am very glad that she was here to hear the richly deserved plaudits from across the House for her years of campaigning on this issue.

I am very glad to have had the opportunity to discuss matters directly with her including, when it was first announced, the review that we have launched. I am pleased that she gave it a conditional thumbs up. Many of her points have been picked up by other noble Lords today. I did not expect anything more than a conditional thumbs up from her, given her commitment to getting this absolutely right. I am glad that she is here to hear some of the answers that I am able to set out, but I know that our discussions would have continued even if she had been able to speak today and that her campaigns on this important issue will not cease; she has been tireless in them. I am very grateful to her, my noble friends Lord Bethell and Lady Harding, the noble Baroness, Lady Kidron, and many others who have been working hard on this.

Let me pick up on their questions and those of the noble Baroness, Lady Ritchie of Downpatrick, and others on the review we announced last week. It will focus on the current regulatory landscape and how to achieve better alignment of online and offline regulation of commercial pornography. It will also look at the effectiveness of the criminal law and the response of the criminal justice system relating to pornography. This would focus primarily on the approach taken by law enforcement agencies and the Crown Prosecution Service, including considering whether changes to the criminal law would address the challenges identified.

The review will be informed by significant expert input from government departments across Whitehall, the Crown Prosecution Service and law enforcement agencies, as well as through consultation with the industry and with civil society organisations and regulators including, as the noble Baroness, Lady Ritchie, rightly says, some of the many NGOs that do important work in this area. It will be a cross-government effort. It will include but not be limited to input from the Ministry of Justice, the Home Office, the Department for Science, Innovation and Technology and my own Department for Culture, Media and Sport. I assure my noble friend Lord Farmer that other government departments will of course be invited to give their thoughts. It is not an exhaustive list.

I detected the enthusiasm for further details from noble Lords across the House. I am very happy to write as soon as I have more details on the review, to keep noble Lords fully informed. I can be clear that we expect the review to be complete within 12 months. The Government are committed to undertaking it in a timely fashion so that any additional safeguards for protecting UK users of online services can be put in place as swiftly as possible.

My noble friend Lord Bethell asked about international alignment and protecting Britain for investment. We continue to lead global discussions and engagement with our international partners to develop common approaches to online safety while delivering on our ambition to make the UK the safest place in the world to be online.

The noble Baroness, Lady Kidron, asked about the new requirements. They apply only to Part 3 providers, which allow pornography or other types of primary priority content on their service. Providers that prohibit this content under their terms of service for all users will not be required to use age verification or age estimation. In practice, we expect services that prohibit this content to use other measures to meet their duties, such as effective content moderation and user reporting. This would protect children from this content instead of requiring measures that would restrict children from seeing content that is not allowed on the service in the first place.

These providers can still use age verification and age estimation to comply with the existing duty to prevent children encountering primary priority content. Ofcom can still recommend age-verification and age-estimation measures in codes of practice for these providers where proportionate. On the noble Baroness’s second amendment, relating to Schedule 4, Ofcom may refer to the age-assurance principles set out in Schedule 4 in its children’s codes of practice.

On the 18-month timetable, I can confirm that 18 months is a backstop and not a target. Our aim is to have the regime in force as quickly as possible while making sure that services understand their new duties. Ofcom has set out in its implementation road map that it intends to publish draft guidance under Part 5 this autumn and draft children’s codes next spring.

The noble Baroness, Lady Ritchie, also asked about implementation timetables. I can confirm that Part 3 and Part 5 duties will be implemented at the same time. Ofcom will publish draft guidance shortly after Royal Assent for Part 5 duties and codes for the illegal content duties in Part 3. Draft codes for Part 3 children’s duties will follow in spring next year. Some Part 3 duties relating to category 1 services will be implemented later, after the categorisation thresholds have been set in secondary legislation.

The noble Lord, Lord Allan of Hallam, asked about interoperability. We have been careful to ensure that the Bill is technology neutral and to allow for innovation across the age-assurance market. We have also included a principle on interoperability in the new list of age-assurance principles in Schedule 4 and the Part 5 guidance.

At the beginning of the debate, on the previous day on Report, I outlined the government amendments in this group. There are some others, which noble Lords have spoken to. Amendments 125 and 217, from the noble Baroness, Lady Kidron, seek to add additional principles on user privacy to the new lists of age-assurance principles for both Part 3 and 5, which are brought in by Amendments 124 and 216. There are already strong safeguards for user privacy in the Bill. Part 3 and 5 providers will need to have regard to the importance of protecting users’ privacy when putting in place measures such as age verification or estimation. Ofcom will be required to set out, in codes of practice for Part 3 providers and in guidance for Part 5 providers, how they can meet these duties relating to privacy. Furthermore, companies that use age-verification or age-estimation solutions will need to comply with the UK’s robust data protection laws or face enforcement action.

Adding the proposed new principles would, we fear, introduce confusion about the nature of the privacy duties set out in the Bill. Courts are likely to assume that the additions are intended to mean something different from the provisions already in the Bill relating to privacy. The new amendments before your Lordships imply that privacy rights are unqualified and that data can never be used for more than one purpose, which is not the case. That would introduce confusion about the nature of—

Lord Clement-Jones (LD)

My Lords, I apologise to the Minister. Can he write giving chapter and verse for that particular passage by reference to the contents of the Bill?

Lord Parkinson of Whitley Bay (Con)

I am very happy to do that. That would probably be better than me trying to do so at length from the Dispatch Box.

Government Amendment 124 also reinforces the importance of protecting children’s privacy, including data protection, by ensuring that Ofcom will need to have regard to standards set out under Section 123 of the Data Protection Act 2018 in the age-appropriate design code. I hope that explains why we cannot accept Amendments 125 or 217.

The noble Baroness, Lady Fox, has Amendment 184 in this group and was unable to speak to it, but I am very happy to respond to it and the way she set it out on the Marshalled List. It seeks to place a new duty on Ofcom to evaluate whether internet service providers, internet-connected devices or individual websites should undertake user-identification and age-assurance checks. This duty would mean that such an evaluation would be needed before Ofcom produces guidance for regulated services to meet their duties under Clauses 16 and 72.

Following this evaluation, Ofcom would need to produce guidance on age-verification and age-assurance systems, which consider cybersecurity and a range of privacy considerations, to be laid before and approved by Parliament. The obligation for Ofcom to evaluate age assurance, included in the noble Baroness’s amendment, is already dealt with by Amendment 271, which the Government have tabled to place a new duty on Ofcom to publish a report on the effectiveness of age-assurance solutions. That will specifically include consideration of cost to business, and privacy, including the processing of personal data.

17:00
Lord Allan of Hallam (LD)

I just realised that I forgot to thank the Government for Amendment 271, which reflected something I raised in Committee. I will reflect back to the Minister that, as is reinforced by his response now, it goes precisely where I wanted it to. That is to make sure—I have raised this many times—that we are not implementing another cookie banner, but are implementing something and then going back to ask, “Did it work as we intended? Were the costs proportionate to what we achieved?” I want to put on the record that I appreciate Amendment 271.

Lord Parkinson of Whitley Bay (Con)

I appreciate the noble Lord’s interjection and, indeed, his engagement on this issue, which has informed the amendments that we have tabled.

In relation to the amendment of the noble Baroness, Lady Fox, as I set out, there are already robust safeguards for user privacy in the Bill. I have already mentioned Amendment 124, which puts age-assurance principles in the Bill. These require Ofcom to have regard, when producing its codes of practice on the use of age assurance, to the principle of protecting the privacy of users, including data protection. We think that the noble Baroness’s amendment is also unnecessary. I hope that she and the noble Baroness, Lady Kidron, will be willing to not move their amendments and to support the government amendments in the group.

Amendment 34 agreed.
Amendment 35
Moved by
35: Clause 10, page 9, line 37, at end insert—
“(iv) features, functionalities or behaviours (including those enabled or created by the design or operation of the service) that are harmful to children”
Member’s explanatory statement
This amendment ensures that in carrying out risk assessments, user to user services must consider the potential for the design and operation of services to create harm separately and additionally to harm relating to the dissemination of or encountering harmful content.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I rise to speak to all the amendments in this group. It is a cause of great regret that, despite many private meetings with officials, government lawyers and Ministers, we have not yet come to an agreement that would explicitly include in the Bill harm that does not derive from content. I will be listening very carefully to the Minister, if he should change his mind during the debate.

The amendments in this group fall into three categories. First, there is a series of amendments in my name and those of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford: Amendments 35, 36, 37A and 85. I hope the Government will accept them as consequential although, in meetings last week, they would not accept that harm to children can arise from the functionality and design of services and not just from the content. Each of these amendments simply makes it clear that harm can arise absent content: nothing more, nothing less. If the Minister agrees that harm may derive from the design of products and services, can he please explain, when he responds, why these amendments are not acceptable? Simply put, it is imperative that the features, functionalities or behaviours that are harmful to children, including those enabled or created by the design or operation of the service, are in scope of the Bill. This would make it utterly clear that a regulated company has a duty to design its service in a manner that does not harm children.

The Government have primary priority harmful content, priority content and non-designated harmful content, the latter being a category yet to be defined, but not the harm that emerges from how the regulated company designs its service. For example, there are the many hundreds of small reward loops that make up a doomscroll or make a game addictive; commercial decisions, such as the one Pokémon famously made for a time, to end every game in a McDonald’s car park; or, more sinister still, the content-neutral friend recommendations that introduce a child to other children like them, while pushing children into siloed groups. For example, they deliberately push 13-year-old boys towards Andrew Tate—not for any content reason, but simply on the basis that 13-year-old boys are like each other and one of them has already been on that site.

The impact of a content-neutral friend recommendation has rocked our schools as female teachers and girls struggle with the attitudes and actions of young boys, and has torn through families, who no longer recognise their sons and brothers. To push hundreds of thousands of children towards Andrew Tate for no reason other than to benefit commercially from the network effect is a travesty for children and it undermines parents.

The focus on content is old-fashioned and looks backwards. The Bill is drafted as if it has particular situations and companies in mind but does not think about how fast the business moves. When we started the Bill, none of us thought about the impact of TikTok; last week, we saw a new service, Threads, go from zero to 70 million users in a single day. It is an act of stunning hubris to be so certain of the form of harm. To be unprepared to admit that some harm is simply design means that, despite repeated denials, this is just a content Bill. The promise of systems and processes being at the heart of the Bill has been broken.

The second set of amendments in this group is in the name of my noble friend Lord Russell. Amendments 46 and 90 further reveal the attitude of the Government, in that they are protecting the companies rather than putting them four-square in the middle of their regime. The Government specifically exempt the manner of dissemination from the safety duties. My noble friend Lord Russell’s amendment would leave that out and ensure that the manner of dissemination, which is fundamental to the harm that children experience, is included. Similarly, Amendment 240 would take out “presented by content” so that harm that results from design decisions is included in the Bill.

The third set comprises government Amendments 281C and 281D, and Amendment 281F, in my name. For the avoidance of doubt, I am totally supportive of government Amendments 281C to 281E, which acknowledge cumulative harms; for example, those that Molly Russell experienced as she was sent more and more undermining and harmful content. In so far as they are a response to my entreaties, and those of other noble Lords, that we ensure that cumulative harmful content is the focus of our concerns, I am grateful to the Government for tabling them. However, I note that the Government have conceded only the role of cumulative harm for content. Amendments 281D and 281E once again treat content as the only harm to children.

The noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford have added their names to Amendment 281F, and I believe I am right in saying that, were there not a limit of four names, a great many more Peers would have added theirs also. For the benefit of the House, I will quote directly from the amendment:

“When in relation to children, references to harm include the potential impact of the design and operation of a regulated service separately and additionally from harms arising from content, including the following considerations … the potential cumulative impact of exposure to harm or a combination of harms … the potential for harm to result from features, functionalities or behaviours enabled or created by the design and operation of services … the potential for some features and functionalities within a service to be higher risk than other aspects of the service … that a service may, when used in conjunction with other services, facilitate harm to a child on a different service … the potential for design strategies that exploit a child’s developmental vulnerabilities to create harm, including validation metrics and compulsive reward loops … the potential for real time services, features and functionalities such as geolocation, livestream broadcasts or events, augmented and virtual environments to put children at immediate risk … the potential for content neutral systems that curate or generate environments, content feeds or contacts to create harm to children … that new and emerging harms may arise from artificial intelligence, machine generated and immersive environments”.


Before I continue, I ask noble Lords to consider which of those things they would not like for their children, grandchildren or, indeed, other people’s children. I have accepted that the Government will not add the schedule of harms as I first laid it: the four Cs of content, conduct, contact and commercial harms. I have also accepted that the same schedule, written in the less comfortable language of primary priority, priority and non-designated harms, has also been rejected. However, the list that I just set out, and the amendment to the duties that reflect those risks, would finally put the design of the system at the heart of the Bill. I am afraid that, in spite of all our conversations, I cannot accept the Government’s argument that all harm comes from content.

Even if we are wrong today—which we are most definitely not—in a world of AI, immersive tech and augmented reality, is it not dangerous and, indeed, foolish to exclude harm that might come from a source other than content? I imagine that the Minister will make the argument that the features are covered in the risk assessment duties and that, unlike content, features may be good or bad, so they cannot be characterised as harmful. To that I say: if the risk assessment is the only process that matters, why do the Government feel it necessary to define the child safety duties and the interpretation of harm? The truth is that they have meaning. In setting out the duty of a company to a child, why would the Government not put the company’s design decisions right at the centre of that duty?

As for the second part of the argument, a geolocation feature may of course be great for a map service but less great if it shows the real-time location of a child to a predator, and livestreaming from a school concert is very different from livestreaming from your bedroom. Just as the noble Lord, Lord Allan, explained on the first day on Report, there are things that are red lines and things that are amber; in other words, they have to be age-appropriate. This amendment does not seek—nor would it mean—that individual features or functionalities would be prevented, banned or stopped. It would mean that a company had a duty to make sure that its features and functionalities were age-appropriate and did not harm children—full stop. There would be no reducing this to content.

Finally, I want to repeat what I have said before. Sitting in the court at Molly Russell’s inquest, I watched the Meta representative contest content that included blood cascading down the legs of a young woman, messages that said, “You are worthless”, and snippets of film of people jumping off buildings. She said that none of those things met the bar of harmful content according to Meta’s terms and conditions.

Like others, I believe that the Online Safety Bill could usher in a new duty of care towards children, but it is a category error not to see harm in the round. Views on content can always differ but the outcome on a child is definitive. It is harm, not harmful content, that the Bill should measure. If the Minister does not have the power to accede, I will, with great regret, be testing the opinion of the House. I beg to move.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - - - Excerpts

My Lords, as so often in the course of the Bill, I associate myself wholeheartedly with the comments that the noble Baroness, Lady Kidron, just made. I, too, thank my noble friend the Minister and the Secretary of State for listening to our debates in Committee on the need to be explicit about the impact of cumulative harmful content. So I support Amendments 281C, 281D and 281E, and I thank them for tabling them.

17:15
In the previous group, we debated the hugely important topic of content harm, most particularly pornography, and the need to ensure an absolutely firm bar that prevents our children seeing such content. As my friend the noble Baroness, Lady Kidron, said, content is not the only harm on the internet—quite the opposite. Internet algorithms on social media platforms do not care what the content is. The computer does not read the content; the algorithm simply drives addiction. So functionality is the root of an awful lot of the harms that our children are experiencing today, completely unregulated—whether it is driving addiction, creating dangerous friendship groups, connecting to underage minors people who should never be able to reach them, or tracking individuals in real time.
My teenage daughter is currently in America on a school tour, and I have been stalking and tracking her to see where she is. But, each time I do, a shiver runs down my spine as I think how easy it would be for a predator to do the same thing. It is impossible not to recognise that non-content harm is a real and present danger to our children. As the noble Baroness, Lady Kidron, said, this is not to say that these functionalities are not brilliant. It makes me, as her mum, feel good that I can track her. As the noble Lord, Lord Allan, said last week, we need to remember that this is about priority harm and not primary priority harm. It is not black and white that it is always bad; it is a functionality, and we should require companies to assess the risk it poses to young people. That is why it is so important that we recognise this as a part of the Bill.
I know that my noble friend the Minister will want to say, “This is all included in the Bill anyway. Why have you all got your knickers in a twist about this? We’re all on track, and we’re going to do it. Ofcom has done all of the pre-work. It’s there”. My worry is that this is a complex and technical Bill. We have all got ourselves tangled up in the structure of it, and, if it is not in the Bill that non-content harms are real harms, the risk of it not being clear in the future is very great. I do not understand the argument—presented to us many times over the last few weeks—that, by putting it in the Bill, we make it worse, not better. I am no lawyer, but it seems strange to me that we are now specifying every other element of harm clearly in the Bill, but, together, we have not been able to find a wording that puts this in.
I am willing to accept that the amendments that I put my name to and that the noble Baroness, Lady Kidron, introduced so powerfully might not be the best way to do this. We might well have unintentionally fallen on to a landmine in this complex Bill. But I cannot accept that it is not necessary to put it in the Bill, so I urge my noble friend the Minister to accept the principle behind these amendments. If he cannot accept them today, I ask him to firmly commit to bring back government amendments that put non-content harms in the Bill. Otherwise, I will need to follow the noble Baroness, Lady Kidron, through the Lobbies.
Lord Bishop of Oxford Portrait The Lord Bishop of Oxford
- View Speech - Hansard - - - Excerpts

My Lords, as often, it is a pleasure to follow the noble Baronesses, Lady Harding and Lady Kidron, and to support this group of amendments, especially those to which I put my name. I thank the Minister and the Secretary of State for the many amendments they are introducing, including in the last group, on which I was not able to speak for similar reasons to other noble Lords. I especially note Amendment 1, which makes safety by design the object of the Bill and implicitly encompasses the amendments that we are speaking to this afternoon, each of which is consistent with that object of safety by design running through the Bill.

As others have said, this is an immensely complex Bill, and anything which introduces clarity for the technology companies and the users is to be welcomed. I particularly welcome the list in Amendment 281F, which the noble Baroness, Lady Kidron, has already read aloud and which spells out very clearly the harm which results from functionality as well as content. It is imperative to have that in the Bill.

In Committee, I referred to the inequality of harms between the user of a service and the forces arrayed against them. You may like to imagine a child of eight, 12 or 15 using one of the many apps we are discussing this afternoon. Now imagine the five As as forces arrayed against them; they are all about functionality, not content. We must consider: the genius of the advertising industry, which is designed on a commercial basis for sales and profit; the fact that processes, applications and smartphones mean that there is 24/7 access to those who use these services and that there is no escape from them; the creation of addictions by various means of rewarding particular features, which have little to do with content and everything to do with design and function; the creative use of algorithms, which will often be invisible and undetectable to adult users and certainly invisible to children; and the generation of more harms through artificial intelligence and deep fakes—all harms resulting from functionality. Advertising, access, addiction, algorithms and artificial intelligence are multiplying harms in a range of ways, which we have heard discussed so movingly today.

The sheer quantity of harm means the socialisation and normalisation of online environments which are themselves toxic and which would be completely unacceptable offline. I very much hope, alongside others, that the Government will give way on these amendments and build the naming of functionality and harm into the Bill.

Lord Russell of Liverpool Portrait Lord Russell of Liverpool (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I will speak, in part, to two amendments with my name on them, to which my noble friend Lady Kidron referred: Amendments 46 and 90, on the importance of dissemination and not just content.

A more effective way of making the same point is to personalise it, by trying to give your Lordships an understanding of the experience taking place, day in, day out, for many young people. I address this not only to the Minister and the Bill team but, quite deliberately, to the Office of the Parliamentary Counsel. I know full well that the Bill has been many years in gestation and that, because the online world, technology and now AI are moving so fast, it is almost impossible for the Bill and its architecture to keep pace with them. But that is not a good reason for not listening to and accepting the force of the argument which my noble friend Lady Kidron and many others have put forward.

Last week, on the first day on Report, when we were speaking to a group of amendments, I spoke to your Lordships about a particular set of functionalities called dark patterns: a variety of features built into the design of these platforms to drive ever more volume and usage.

The individual whose journey I will be describing is called Milly. Milly is online and she accepts an automatic suggestion that appears in a search bar. Let us say it is about weight loss. She starts to watch videos that she would not otherwise have found. The videos she is watching are on something called infinite scroll, so one just follows another that follows another, potentially ad infinitum. To start off, she is seeing video after video of people sharing tips about dieting and showing how happy they are after losing weight. As she scrolls and interacts, the women she sees mysteriously seem to get thinner and thinner. The platform’s content dispersal strategy—if indeed it has one, because not all do—that tempers the power of the algorithm has not yet kicked in. The Bill does not address this because, individually, not a single one of the videos Milly has been watching violates the definition of primary priority content. What the platform is doing is coding an algorithm to meet a child’s desire to view increasingly thin women.

The videos that Milly sees are captioned with a variety of hashtags such as #thinspo, #thighgap and #extremeweightloss. If she clicks on those, she will find more extreme videos and will start to click on the accounts that have posted the content. Suddenly, she is exposed to the lives of people who are presenting disordered eating not just as normal but as aspirational. Developmentally, Milly is at an age where she does not have the critical thinking skills to evaluate what she is seeing. She has entered a world that she is too young to understand and would never have found were it not for the design of the platform. Throughout her journey thus far, she has yet to see a single video that meets the threshold of primary priority harm content. This world is the result of cumulative design harms.

She follows some of the accounts, which prompts the platform to recommend similar accounts. Many of the accounts recommended to her are even more extreme. They are managed by people who have active eating disorders but see what is known as their pro-ana status—that is, pro-anorexia—as a lifestyle choice rather than a mental health issue. These accounts are very savvy about the platform’s community guidelines, so the videos and the language they use are coded specifically to avoid detection.

Every aspect of the way Milly is interacting with the platform has now been polluted. It is not just the videos she sees. It is the autocomplete suggestions she gets on searches. It is the algorithmically determined account recommendations. It is the design strategies that make it impossible for her to stop scrolling. It is the notifications she receives encouraging her back to the platform to watch yet another weight-loss video or follow yet another account. It is the filters and effects she is offered before she posts. It is the number of likes her videos get. It goes on and on, and the Bill as it stands will fail Milly. This is why I am talking directly to the Minister and the Office of the Parliamentary Counsel, because they need to sort this out.
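
What the noble Lord describes is, at bottom, an optimiser that maximises predicted engagement and never consults a per-item harm threshold. The following toy Python sketch is purely illustrative: the titles, numbers and the simple watch-time model are all invented for this example, and it is not the code of any real platform. It shows how a feed can escalate even though no single video, taken on its own, would violate a content rule.

# Illustrative sketch only: a toy engagement-maximising recommender.
# All titles, numbers and the watch-time model are hypothetical.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    extremity: float  # 0.0 = innocuous, 1.0 = most extreme (toy scale)

def predicted_watch_time(video: Video, tolerance: float) -> float:
    # Toy assumption: users engage most with content slightly beyond
    # what they have already normalised, so optimising rewards escalation.
    return 1.0 - abs(video.extremity - (tolerance + 0.25))

def next_video(catalogue: list[Video], tolerance: float) -> Video:
    # Pick whatever maximises predicted engagement; note that no
    # per-item harm threshold is ever consulted.
    return max(catalogue, key=lambda v: predicted_watch_time(v, tolerance))

catalogue = [
    Video("diet tips", 0.1),
    Video("weight-loss challenge", 0.3),
    Video("#thinspo", 0.6),
    Video("extreme fasting", 0.9),
]

tolerance = 0.0
for _ in range(4):
    v = next_video(catalogue, tolerance)
    print(v.title)                           # the feed drifts more extreme
    tolerance = max(tolerance, v.extremity)  # viewing normalises the level

In this toy model, each individual recommendation looks defensible, yet within three steps the feed settles on the most extreme item in the catalogue: the cumulative, design-driven harm that these amendments seek to bring within the safety duties.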

Earlier on this afternoon, before we began this debate, I was talking to an associate professor in digital humanities at UCL, Dr Kaitlyn Regehr. We were talking about incels—involuntary celibates—and the strange world they live in, and she made a comment. This is a quote that I wrote down word for word because it struck me. She said:

“One off-day seeds the algorithm. The algorithm will focus on that and amplify that one off-day”—


that one moment when we click on something and it suddenly takes us into a world, and in a direction, that we had no idea existed and over which, more importantly, because of the way these services are designed, we feel we have no control. We really must do something about this.
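
Dr Regehr’s observation can be illustrated in the same way. In this hypothetical sketch (the topic names, starting weights and multiplicative update rule are all invented for illustration), a handful of clicks on one off-day is enough to push a previously negligible topic to the top of a ranked feed, which then invites further clicks in a feedback loop.

# Illustrative sketch only: one "off day" seeding a recommendation feed.
# Topics, weights and the update rule are hypothetical.
interests = {"football": 1.0, "music": 1.0, "incel content": 0.01}

def record_click(topic: str, boost: float = 5.0) -> None:
    # Each click multiplies the topic's weight, so a few clicks
    # compound far faster than long-standing interests.
    interests[topic] *= boost

def ranked_feed() -> list[str]:
    # Rank topics purely by accumulated engagement weight;
    # nothing here asks whether a topic is harmful.
    return sorted(interests, key=interests.get, reverse=True)

record_click("incel content")  # one unhappy moment
record_click("incel content")  # the feed shows more, inviting more clicks
record_click("incel content")
print(ranked_feed())           # ['incel content', 'football', 'music']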

Baroness Benjamin Portrait Baroness Benjamin (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I rise to support the amendments in the names of the intrepid noble Baroness, Lady Kidron, the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford. They go hand in hand with the amendments debated in the previous group. Sadly, I was unable to take part in that debate because of a technical ruling, but I thank the Minister for his kind words and thank other noble Lords for what they have said. My heart was broken, though, because that group included age verification, for which I have campaigned for the past 12 years, and I had wanted to thank the Government for finally accepting that children need to be protected from harmful online content—pornography being one example; it is the gateway to many other harms.

17:30
As we have heard around the House, the Government need to make it clear that harm to children can arise from the functionality and design of online services too—not only from content. These amendments would show the tech industry that there is no place to hide when it comes to fulfilling its obligations to protect children, especially as AI emerges. The consequences of that could open a whole new Pandora’s box of harms which, to our horror, have already started to spread. These amendments are a golden opportunity to protect children from them.
Many of us from across the House have been fighting for years for this day, and it has been good to see that the Government have finally listened—I say, “Hallelujah”. But why they should stop the Bill being absolutely clear about harm escapes me. If they are saying that it is covered in the Bill, what is the objection to making it explicit? These amendments would send a loud, long message to the industry that it is responsible for the design of its products. Surely the Government should be on the side of children, who have suffered for far too long from being exposed to harmful content, not on the side of the multinational tech companies.
As the children’s charity Barnardo’s said—and I declare an interest as vice-president—children do not have a voice. I feel that we have a responsibility to protect them, and we must expect the Government to take children into consideration and show that they have a holistic view of protecting them from harm. I hope that the Government will continue to listen to common sense, embrace these amendments and support them.
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- View Speech - Hansard - - - Excerpts

My Lords, it is a great pleasure to follow the veteran campaigner on this issue, the noble Baroness, Lady Benjamin. I, too, rise briefly to support Amendments 35 to 37A, 85 and 240 in the name of my noble friend Lady Kidron.

In Committee, I put my name to amendments that aimed to introduce risk assessments of emerging harms, to future-proof the Bill. Sadly, they were thought unnecessary by the Government. Now the Minister has another chance to make sure that Ofcom will be able to assess and respond to potential harms from one of the fastest-changing sectors in the world, in order to protect our children. I praise the Minister for having come so far but, if this Bill is to stand the test of time, we will have to be prepared for the ever-changing mechanisms that deliver harmful content to children. Noble Lords have already told the House about fast-changing algorithms and the potential of AI to create harms. Many tech companies do not even understand how their own algorithms work; a risk assessment of their functions would ensure that they found out soon enough.

In the Communications and Digital Select Committee inquiry into regulating the internet, we recommended that, because the changes in digital delivery and technology were happening so fast, a specific body needed to be set up to horizon-scan. These amendments would build consideration of such technological change into this Bill’s regulatory mechanism to safeguard our children in future. I hope that noble Lords will support them.

Lord Bethell Portrait Lord Bethell (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I also support the amendments from the noble Baroness, Lady Kidron. It is relatively easy to stand here and make the case for age verification for porn: it is such a black and white subject—the pornography is disgusting—so of course children should be protected from it. Making the case about the design of the attention economy is more subtle and complex, but it is incredibly important, because it is the attention economy that is driving our children to extreme behaviours.

I know this from my own personal life; I enjoy incredibly lovely online content about wild-water swimming, and I have been taken down a death spiral towards ice swimming and have become a compulsive swimmer in extreme temperatures, partly because of the addiction generated by online algorithms. That is a lovely and heart-warming anecdote, which I offer to give noble Lords a sense of the impact of algorithms on my own imagination, but my children are prone to much more dangerous experiences. Their brains are so much more plastic and malleable; they are, like other children, open to all sorts of addiction, depression, sleeplessness and danger from predators. That is the economy that we are looking at.

I point noble Lords to the intervention from the Surgeon General in America, Admiral Vivek Murthy—an incredibly impressive individual whom I came across during the pandemic. His 25-page report on the impact of social media on the young of America makes incredibly eye-opening reading. Some 95% of American children have come across social media, and one-third of them use it almost constantly, he says. He attributes depression, anxiety, compulsive behaviours and sleeplessness to the impact of social media, as well as what he calls a severe impact on the neurological development of a generation. He calls for a complete bar on all social media for the under-13s and says that his own children will not get anywhere near a mobile phone until they are 16. That is the state of the attention economy that the noble Baroness, Lady Kidron, talks about, and that is the state of the design of our online applications. It is not the content itself but the way in which it is presented to our children; it traps their imagination in the kind of destructive content that can lead them into all kinds of harms.

Admiral Murthy calls on legislators to act today—and that call was followed on the same day by a commitment from the White House to look into this and bring forward legislation to address the kind of design features that the noble Baroness, Lady Kidron, is looking at. We should listen to the Surgeon General in America and step up to the challenge that he has given to American legislators. I am enormously grateful to my noble friend the Minister for the incredible amount of work that he has already done to try to bridge the gap on this matter, but there is a way to go. Like my noble friend Lady Harding, I hope very much indeed that he will be able to tell us that he has found a way across the gap, or else I shall be supporting the noble Baroness, Lady Kidron, in her amendment.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- View Speech - Hansard - - - Excerpts

I rise briefly to speak to this group of amendments, and I want to pick up where my noble friend Lord Bethell has just finished. The Government have listened hugely on this Bill and, by and large, the Bill, and the way in which Ministers have engaged, is a model of how the public want to see their Parliament acting: collaboratively and collegiately, listening to each other and with a clear sense of purpose—almost all of us want to see the Bill on the statute book as soon as possible. So I urge my noble friend the Minister to listen once again. I know that there have been many conversations, and I think that many of us will be listening with great care to what he is about to say.

There are two other points that I want to mention. The first is that safety by design was always going to be a critical feature of the Bill; I have been reminding myself of the discussions that I had as Culture Secretary. Surely, in general, we want to prevent our young people encountering harms in the first place, rather than always having to think about the moderation of harmful content once it has been posted.

Secondly, I would be interested to hear what the Minister has to say about why the Government find it so difficult to accept these amendments. Has there been some pushback from those who are going to be regulated? That would suggest that, while they can cope with the regulation of content, there is still secrecy surrounding the algorithms, functionalities and behaviours. I speak as the parent of a teenager who, if he could, would sit there quite happily looking at YouTube. In fact, he may well be doing that now—he certainly will not be watching his mother speaking in this House—sitting there looking at YouTube and the content that is served up automatically, time after time.

I wonder whether this is, as other noble Lords have said, an opportunity. If we are to do the Bill properly and to regulate the platforms—and we have decided we need to do that—we should do the job properly and not limit ourselves to content. I shall listen very carefully to what my noble friend says but, with regret, if there is a Division, I will have to support the indomitable noble Baroness, Lady Kidron, as I think she was called.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I very strongly support the noble Baroness, Lady Kidron, in her Amendments 35, 36 and 281F, and in spirit I very much support what the noble Lord, Lord Russell, said in respect of his amendments. We have heard some very powerful speeches: from the noble Baroness, Lady Kidron, herself, from the noble Baronesses, Lady Harding and Lady Morgan, from the right reverend Prelate the Bishop of Oxford, from my noble friend Lady Benjamin and from the noble Lords, Lord Russell and Lord Bethell. There is little that I can add to the colour and passion that they brought to the debate today.

As the noble Baroness, Lady Kidron, said at the outset, it is not just about content; it is about functionalities, features and behaviours. It is all about platform design. I think the Government had pretty fair warning throughout the progress of the Bill that we would be keen to probe this. If the Minister looks back at the Joint Committee report, he will see that there was a whole chapter titled “Societal harm and the role of platform design”. I do not think we could have been clearer about what we wanted from this legislation. One paragraph says:

“We heard throughout our inquiry that there are design features specific to online services that create and exacerbate risks of harm. Those risks are always present, regardless of the content involved, but only materialise when the content concerned is harmful”.


It goes on to give various examples and says:

“Tackling these design risks is more effective than just trying to take down individual pieces of content (though that is necessary in the worst cases). Online services should be identifying these design risks and putting in place systems and process to mitigate them before people are harmed”.


That is the kind of test that the committee put. It is still valid today. As the noble Baroness said, platforms are benefiting from the network effect, and the Threads platform is an absolutely clear example of how that is possible.

The noble Lord, Lord Russell, gave us a very chilling example of the way that infinite scrolling worked on Milly. A noble Lord on the Opposition Bench—a former Home Secretary whose name I momentarily forget—talked about the lack of empathy of AI in these circumstances. The algorithms can be quite relentless in pushing this content; they lack human qualities. It may sound over the top to say so, but that is exactly what we are trying to legislate for. As the noble Lord, Lord Russell, says, just because we cannot always anticipate what the future holds, there is no reason why we should not try. We are trying to future-proof ourselves as far as possible, and it is not just the future but the present that we are trying to guard against through these amendments. We know that AI and the metaverse are coming down the track, but there are present harms that we are trying to legislate for as well. The noble Baroness, Lady Kidron, was absolutely right to keep reminding us about Molly Russell. It is this kind of algorithmic amplification that is so dangerous to our young people.

The Minister has a chance, still, to accede to these amendments. He has heard the opinion all around the House. It is rather difficult to understand what the Government’s motives are. The noble Baroness, Lady Morgan, put her finger on it: why is it so difficult to accede to them? We have congratulated the Government, the Minister and the Secretary of State throughout these groups over the last day and a bit; they have been extremely consensual and have worked very hard at trying to get agreement on a huge range of issues. Most noble Lords have never seen so many government amendments in their lives. So far, so good; why ruin it?

17:45
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

There is always a simple question. We are in a bit of a mess—again. When I said at Second Reading that I thought we should try to work together, as was picked up by the noble Baroness in her powerful speech, to get the best Bill possible out of what we had before us, I really did not know what I was saying. Emotion caught me; I ripped up a brilliant speech which will never see the light of day and decided to wing it. I ended up by saying that I thought we should do the unthinkable in this House—the unthinkable in politics, possibly—and try to work together to get the Bill to come right. As the noble Lord, Lord Clement-Jones, pointed out, I do not think I have ever seen, in my time in this House, so many government amendments setting out what we used to call concessions. I am not going to call them concessions—they are improvements to the Bill. We should pay tribute to the Minister, who has guided his extensive team, who are listening anxiously as we speak, through the good work they have been doing for some time, even while being questioned quite seriously about where it is taking us.

The noble Lord, Lord Clement-Jones, is quite right to pick up what the pre-legislative scrutiny committee said about this aspect of the work we are doing today and what is in the Bill. We have not really nailed the two big problems that social media companies pose. The first is the amplification effect, where a single tweet—or thread, let us call it now—can go spinning around the world and gather support, comment, criticism, complaint, anger and all sorts of things that we probably do not really understand in the short period of time it takes to be read and reacted to. That amplification is not something we see in the real world; we do not really understand it, and I am not quite sure we have got to the bottom of where we should be going on it at this stage.

The second most important point—the point we are stuck on at the moment; this rock, as it were, in the ocean—is the commercial pressure which, of course, drives the way in which companies operate. They are in it for the money, not the social purpose. They did not create public spaces for people to discuss the world because they think it is a good thing. There is no public service in this—this is a commercial decision to get as much money as possible from as many people as possible and, boy, are they successful.

But commercial pressures can cause harms; they create harms in ways that we have discussed, and the Bill reflects many of those. The narrow way in which the Bill describes content—which is meant to include many of the things we have been talking about today, the four Cs that have helpfully been brought into the debate in recent months—does not really deal with the commercial pressures under which people are placed by the way in which they engage with social media. We do not think the Bill is as clear as it could be; nor does it achieve as much as it should in trying to deal with that issue.

That is in part to do with the structure. It is almost beyond doubt that the sensibility of what we are trying to achieve here is in the Bill, but it is there at such a level of opacity that it lacks the clarity of the messages we have heard today from those who have spoken about individuals—Milly and that sort of story—and the impact on people. Even the noble Lord, Lord Bethell, whose swimming exploits we must admire, is an unwitting victim of the drive of commercial pressures that sees him in his underwear at inappropriate moments so that others may profit from it. I think it is great, but I wonder why.

I want to set the Minister a task: to convince us, now that we are at the bar, that when he says that this matter is still in play, he realises what that must imply, and to give us a guarantee that we will gain from the additional time he seeks to get this to settle. There is a case, which I hope he will agree to, for having in the Bill an overarching statement about the need to separate out the harms that arise from content and the harms that arise from the systems we have been discussing today, where content is absent. I suggest that it might well be worth going back to Clause 1, the overarching objectives clause, to see whether it might be strengthened to cover this impact, so that the first thing one reads in the Bill is a sense that we embrace, understand and will act to address the question of harm arising absent content. There is also a case for putting into Clauses 10, 11, 25 and 82 the wording in Amendments 35, 36, 37A and 240, in the name of the noble Baroness, Lady Kidron, and for using those as a way of making sure that every aspect of the journey through which social media companies must go to fulfil the duties set out in the Bill by Ofcom reflects both the content that is encountered and the design choices made by those companies—both material content harms and the harms that arise from design choices. Clauses 208 and 209 also have to provide better consideration of how one describes harms, so that they are not always apparently linked to content.

That is a very high hurdle, particularly because my favourite topic of how this House works will be engaged. We have, technically, already passed Clause 1; an amendment was debated and approved, and now appears in versions of the Bill. We are about to finish with Clauses 10 and 11 today, so we are effectively saying to the Minister that he must accept that there are deficiencies in the amendments that have already been passed or would be, if we were to pass Amendments 35, 36, 37A, 85 and 240 in the name of the noble Baroness, Lady Kidron, and others. It is not impossible, and I understand that it would be perfectly reasonable, for the Government to bring back a series of amendments on Third Reading reflecting on the way in which the previous provisions do not fulfil the aspirations expressed all around the House, and therefore there is a need to change them. Given the series of conversations throughout this debate—my phone is red hot with the exchanges taking place, and we do not have a clear signal as to where that will end up—it is entirely up to the Minister to convince the House whether these discussions are worth it.

To vote on this when we are so close seems ridiculous, because I am sure that if there is time, we can make this work. But time is not always available, and it will be up to the Minister to convince us that we should not vote and up to the noble Baroness to decide whether she wishes to test the opinion of the House. We have a three-line Whip on, and we will support her. I do not think that it is necessary to vote, however—we can make this work. I appeal to the Minister to get over the bar and tell us how we are to do it.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I am very grateful for the discussion we have had today and the parallel discussions that have accompanied it, as well as the many conversations we have had, not just over the months we have been debating the Bill but over the past few days.

I will turn in a moment to the amendments which have been the focus of the debate, but let me first say a bit about the amendments in this group that stand in my name. As noble Lords have kindly noted, we have brought forward a number of changes, informed by the discussions we have had in Committee and directly with noble Lords who have taken an interest in the Bill for a long time.

Government Amendments 281C, 281D, 281E and 281G relate to the Bill’s interpretation of “harm”, which is set out in Clause 209. We touched on that briefly in our debate on Thursday. The amendments respond to concerns which I have discussed with many across your Lordships’ House that the Bill does not clearly acknowledge that harm and risk can be cumulative. The amendments change the Bill to make that point explicit. Government Amendment 281D makes it clear that harm may be compounded in instances where content is repeatedly encountered by an individual user. That includes, but is not limited to, instances where content is repeatedly encountered as a result of algorithms or functionalities on a service. Government Amendment 281E addresses instances in which the combination of multiple functionalities on a service cumulatively drives up the risk of harm.

Those amendments go hand in hand with other changes that the Government have made on Report to strengthen protections for children. Government Amendment 1, for instance, which we discussed at the beginning of Report, makes it clear that services must be safe by design and that providers must tackle harms which arise from the design and operation of their service. Government Amendments 171 and 172 set out on the face of the Bill the categories of “primary priority” and “priority” content which is harmful to children to allow the protections for children to be implemented as swiftly as possible following Royal Assent. As these amendments demonstrate, the Government have indeed listened to concerns which have been raised from all corners of your Lordships’ House and made significant changes to strengthen the Bill’s protections for children. I agree that it has been a model of the way in which your Lordships’ House operates, and the Bill has benefited from it.

Let me turn to the amendments in the name of the noble Baroness, Lady Kidron. I am very grateful for her many hours of discussion on these specific points, as well as her years of campaigning which led to them. We have come a long way and made a lot of progress on this issue since the discussion at the start of Committee. The nature of online risk versus harm is one which we have gone over extensively. I certainly accept the points that the noble Baroness makes; I know how heartfelt they are and how they are informed by her experience sitting in courtrooms and in coroners’ inquests and talking to people who have had to be there because of the harms they or their families have encountered online. The Government are firmly of the view that it is indisputable that a platform’s functionalities, features or wider design are often the single biggest factor in determining whether a child will suffer harm. The Bill makes it clear that functions, features and design play a key role in the risk of harm occurring to a child online; I draw noble Lords’ attention to Clause 11(5), which makes it clear that the child safety duties apply across all areas of a service, including the way it is designed, operated and used, as well as content present on the service. That makes a distinction between the design, operation and use, and the content.

In addition, the Bill’s online safety objectives include that regulated services should be designed and operated so as to protect from harm people in the United Kingdom who are users of the service, including with regard to algorithms used by the service, functionalities of the services and other features relating to the operation of the service. There is no reference to content in this section, again underlining that the Bill draws a distinction.

This ensures that the role of functionalities is properly accounted for in the obligations on providers and the regulator, but I accept that noble Lords want this to be set out more clearly. Our primary aim must be to ensure that the regulatory framework can operate as intended, so that it can protect children in the way that they deserve and which we all want to see. Therefore, we cannot accept solutions that, however well meaning, may inadvertently weaken the Bill’s framework or allow providers to exploit legal uncertainty to evade their duties. We have come back to that point repeatedly in our discussions.

18:00
I will address the problems with the amendments as drafted; as the noble Baroness knows, if she presses them to a vote, we will not be able to accept them, although we are very happy to continue to discuss the concerns lying behind them. I am happy to reassure noble Lords that the Bill recognises and addresses that services can be risky by design and that features and functionalities can exacerbate the risk of harm to users, including children.
First, I have mentioned the new introductory clause that your Lordships have put into the Bill, which establishes safety by design as a key objective of it. As such, features and functionalities are captured in the existing children’s risk assessment and safety duties. I am grateful to the noble Lord, Lord Stevenson, for his suggestion that, if there is interest from the noble Baroness, Lady Kidron, we could use the time between now and Third Reading, in addition to our continuing discussions, to look at that again and try to make it clearer. However, its inclusion in the Bill has already been of benefit.
Secondly, providers must comprehensively consider and assess the risk presented by the design and operation of their service, including the risk of their design choices, which, as many noble Lords have highlighted, are often motivated by commercial aims rather than safety. These assessments also require providers to assess the risk that a service’s features and functionalities pose. Once this mandatory risk assessment is completed, they are required to mitigate and manage the risks to children that they have identified. For example, if a service has a direct messaging function, it will need to consider how this increases the risk of users encountering harms such as bullying and to follow steps in codes of practice, or take equivalent measures, to mitigate this.
It is not right to say that functionalities are excluded from the child safety duties. Clause 11(5) clearly sets out that safety duties apply across all areas of a service, including the way it is designed, operated and used, and not only to content that is present on the service.
The noble Lord, Lord Russell, spoke to his Amendments 46 and 90. They seek to remove the provisions in Clauses 11(15) and 25(13), which limit corresponding duties elsewhere in the Bill to cases where the risk of harm is presented by the nature of the content rather than the fact of its dissemination. Clause 209 is clear that harm from content may arise from the fact or manner of its dissemination. As I have mentioned, the Government’s amendments to Clause 209 make it clear that this includes instances where algorithms bombard a user with content, such as in the scenario the noble Lord set out. As such, user-to-user and search service providers must take action to address this as part of their child safety duties.
The duties in Clauses 11(2) and 25(2) apply to content that is harmful due to the manner of its dissemination, requiring providers to design and operate their services so as to mitigate the risks of harm identified in their risk assessments. This includes risks such as an algorithm pushing content at high volume to a user. If Clauses 11(15) and 25(13) were removed, Clause 11(3) and (6) and Clause 25(3) would require children to be protected from inherently harmless content on the grounds that harm could be caused if that content were encountered repeatedly over time. I am sure that is not what the noble Lord, Lord Russell, has in mind with his amendments, but that is why, if he pushes them to a vote, we will not be able to accept them.
We have talked about this at great length. If we can use the time between now and Third Reading fruitfully to address the points I have raised on these amendments—the noble Baroness, Lady Kidron, has heard them repeatedly; I make them for the benefit of the rest of your Lordships’ House, because we have had much discussion—I am very willing to look at that and bring forward points to address this at Third Reading. However, I have set out our concerns about the approach taken in the amendments she has tabled. I am very grateful to her for her time and for discussing this. Procedurally, if she presses them to a vote now, the matter will have been dealt with on Report and we will not be able to look at this again at Third Reading. I hope she may yet have found comfort in what I have said and be willing to continue those discussions, but if she wishes to press her amendments to a Division now, the Government will not be able to accept them and I would recommend that noble Lords vote against.
Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I thank everybody who has spoken for these amendments. I also thank the Minister for our many discussions and apologise to the House for the number of texts that I sent while we were trying to get stand-alone harms into the Bill—unfortunately, we could not; we were told that it was a red line.

It is with some regret that I ask the House to walk through the Lobbies. Before I do so, I acknowledge that the Government have met me on very many issues, for which I am deeply grateful. There have been no concessions on this Bill, only improvements to it. From my perspective, there is no desire to push anybody anywhere, only to protect children and give citizens the correct relationship with the digital world.

I ask those who were not here when I said this before: please think about your children and grandchildren and other people’s children and grandchildren before you vote against these amendments. They are not only heartfelt, as the Minister said, but have been drafted with reference to many experts and people in the business, who, in their best practice, meet some of these things already. We do not want the Bill, by concentrating on content, to be a drag on what we are pushing forward. We want it to be aspirational and to push the industry into another culture and another place. At a personal level, I am very sorry to the Minister, for whom I have a great deal of respect, but I would like to test the opinion of the House.

18:08

Division 1

Ayes: 240

Noes: 168

18:20
Clause 11: Safety duties protecting children
Amendment 36
Moved by
36: Clause 11, page 10, line 38, at end insert—
“(c) mitigate the impact of harm to children in different age groups presented by features, functionalities or behaviours enabled or created by the design or operation of the service.”
Member’s explanatory statement
This amendment ensures that User to user services’ duty to protect children from harm includes the ways in which the design and operation of services may create harm separately and additionally to harm relating to the dissemination of or encountering harmful content.
Amendment 36 agreed.
Amendment 37
Moved by
37: Clause 11, page 10, line 42, leave out “(for example, by using age verification)”
Member’s explanatory statement
This amendment is consequential on the next amendment of Clause 11 in my name.
Amendment 37 agreed.
Amendment 37A
Moved by
37A: Clause 11, page 10, line 46, at end insert—
“(c) protect children in age groups judged to be at risk of harm from features, functionalities or behaviours enabled or created by the design or operation of the service”
Member’s explanatory statement
This amendment ensures that user to user services’ duty to protect children from harm includes the ways in which the design and operation of services may create harm separately and additionally to harm relating to the dissemination or encountering harmful content.
Amendment 37A agreed.
Amendment 38
Moved by
38: Clause 11, page 10, line 46, at end insert—
“(3A) The duty set out in subsection (3)(a) requires a provider to use age verification or age estimation (or both) to prevent children of any age from encountering primary priority content that is harmful to children which the provider identifies on the service.
(3B) That requirement applies to a provider in relation to a particular kind of primary priority content that is harmful to children in every case except where—
(a) a term of service indicates (in whatever words) that the presence of that kind of primary priority content that is harmful to children is prohibited on the service, and
(b) that policy applies in relation to all users of the service.
(3C) If a provider is required by subsection (3A) to use age verification or age estimation for the purpose of compliance with the duty set out in subsection (3)(a), the age verification or age estimation must be of such a kind, and used in such a way, that it is highly effective at correctly determining whether or not a particular user is a child.”
Member’s explanatory statement
This amendment requires providers of user-to-user services to use age verification or age estimation to prevent children from encountering identified primary priority content that is harmful to children, unless the terms of service indicate that that kind of content is prohibited; and where that requirement applies, new subsection (3C) provides that the age verification or age estimation must be highly effective.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I beg to move.

Amendment 39 (to Amendment 38)

Moved by
39: Clause 11, at end insert—
“(3D) If the duty in subsection (3)(a) relates to pornographic content, the duty applies regardless of the size and capacity of a service.”Member’s explanatory statement
This amendment does not allow a service to determine age verification or age estimation is not needed because of their size and capacity.
Lord Bethell (Con)

My Lords, I commend the Minister for the great strides forward which have been made since Committee. There remains one concern, which has necessitated a further amendment in my name in this group. In Committee, I and others probed whether pornographic content would be caught by the Bill. It is the opening words of Clause 11(3) which give rise to this concern: while the amendments helpfully put forward by the Government—which I wholeheartedly support—bolster the age-verification requirements, those amendments are still subject to qualification.

The Government’s amendments leave the beginning of Clause 11(3) unchanged. User-to-user services now have a duty to use age verification or age estimation, or both, to prevent children of any age from encountering primary priority content that is harmful to children. This duty is qualified by the words

“using proportionate systems and processes”.

It is that word “proportionate” that gives rise to concern, and which Amendment 39 seeks to address for pornographic content.

In a document produced for the Government in January 2021, the British Board of Film Classification said that there were literally millions of pornographic websites. This study did not include social media websites, some of which also host pornographic content—a point made by the Children’s Commissioner in her powerful recent report.

When announcing the new age-verification and age-estimation amendments on 30 June, the government press release said that

“pornography companies, social media platforms and other services”

will

“be explicitly required to use age verification or estimation measures to prevent children accessing pornography”.

My question to the Minister is this: will all websites and social media be covered by the Bill? With millions of sites on the internet, it is not unreasonable to think that some sites will argue that despite hosting pornographic content, they are not of a size or a capacity that necessitates them investing in age verification or estimation technology.

A further concern relates to large, particularly social media, providers. A proportionality clause may leave it open to them to claim that while they host pornographic content, the amount of pornography or the number of children accessing the platform simply does not warrant age verification as it is statistically a small part of what they provide. I think most people expect that the Bill will ensure that all pornographic content, wherever it is found, is subject to age verification or estimation. In fact, I congratulated my noble friend the Minister on that point earlier this afternoon.

In Committee, many noble Lords across the House argued that Parts 3 and 5 should be subject to the same duties. I am pleased to say that this is the last anomaly regarding pornographic content in the Bill. The Government have gone a very long way to ensure that the duties across Parts 3 and 5 are identical, which is very welcome. However, websites which fall under the scope of Part 5 do not have any exceptions. There is no proportionality test: they must have age verification or estimation to meet that duty. All I am seeking to do with Amendment 39 is to ensure parity of regulation across the Bill.

Baroness Ritchie of Downpatrick (Lab)

My Lords, the final issue I raised in Committee is dealt with in this group on so-called proportionality. I tabled amendments in Committee to ensure that under Part 3 no website or social media service with pornographic content could argue that it should be exempt from implementing age verification under Clause 11 because to do so would be disproportionate based on its size and capacity. I am pleased today to be a co-signatory to Amendment 39 tabled by the noble Lord, Lord Bethell, to do just that.

The noble Lord, Lord Russell, and the noble Baroness, Lady Kidron, have also tabled amendments which raise similar points. I am disappointed that despite all the amendments tabled by the Minister, the issue of proportionality has not been addressed; maybe he will give us some good news on that this evening. It feels like the job is not quite finished and leaves an unnecessary and unhelpful loophole.

I will not repeat all the arguments I made in Committee in depth but will briefly recap that we all know that in the offline world, we expect consistent regulation regardless of size when it comes to protecting children. We do not allow a small corner shop to act differently from a large supermarket on the sale of alcohol or cigarettes. In a similar online scenario, we do not expect small or large gambling websites to regulate children’s access to gambling in a different way.

We know that the impact of pornographic content on children is the same whether it is accessed on a large pornographic website or a small social media platform. We know from the experience of France and Germany that pornographic websites will do all they can to evade age verification. As the noble Lord, Lord Stevenson, said on the eighth day of Committee, whether pornography

“comes through a Part 3 or Part 5 service, or accidentally through a blog or some other piece of information, it has to be stopped. We do not want our children to receive it. That must be at the heart of what we are about, and not just something we think about as we go along”.—[Official Report, 23/5/23; col. 821.]

By not shutting off the proportionality argument, the Government are allowing different-sized online services to act differently on pornography and all the other primary priority content, as I raised in Committee. At that stage, the noble Baroness, Lady Kidron, said,

“we do not need to take a proportionate approach to pornography”.—[Official Report, 2/5/23; col. 1481.]

Amendment 39 would ensure that pornographic content is treated as a separate case with no loopholes for implementing age verification based on size and capacity. I urge the Minister to reflect on how best we can close this potential loophole, and I look forward to his concluding remarks.

Lord Russell of Liverpool (CB)

My Lords, I will briefly address Amendments 43 and 87 in my name. I thank the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Knight, for adding their names to these amendments. They are complementary to the others in this group, on which the noble Lord, Lord Bethell, and the noble Baroness, Lady Ritchie, have spoken.

In Committee the Minister argued that it would be unfair to place the same child safety duties across all platforms. He said:

“This provision recognises that what it is proportionate to require of providers at either end of that scale will be different”.—[Official Report, 2/5/23; col. 1443.]


Think back to the previous group of amendments we debated. We talked about functionality and the way in which algorithms drive these systems. They drive you in all directions—to a large platform with every bell and whistle you might anticipate because it complies with the legislation, but also, willy-nilly, without any conscious thought because that is how it is designed, to a much smaller site. If we do not amend the legislation as it stands, they will take you to smaller sites that do not require the same level of safety duties, particularly towards children. I think we all fail to understand the logic behind that argument.

18:30
Child safety duties are based on the risk identified in the child risk assessments, which all services must carry out. If the risk is found to be low, the duties on that service will not be too onerous. If the risk is high, the duties should be onerous. Is the Minister seriously saying that, if the platform is small but the risk to children is high, because the platform is small it does not need the same level of safety duties as a large platform? That completely goes against the spirit and direction of the Bill.
Smaller is not safer. We know that, even with the very smallest platforms, real harm can transfer into the real world. I mentioned that I had been talking earlier today with an associate professor at UCL who has recently been looking at the world of incels—involuntary celibates. These forums often have very small memberships and numbers of people visiting, but their ability to draw people into a very unpleasant world of anti-immigrant hate speech, stirring up communities against refugees and migrants and, in the case of incels, potentially against women, is a real-world problem.
I simply ask the Minister to reflect and look carefully at this and, frankly, the illogicality of the Government’s current approach to see whether we can yet again improve the Bill—as he has on so many occasions.
Lord Allan of Hallam (LD)

My Lords, I follow the noble Lord, Lord Russell, particularly in talking about Amendments 43, 87 and 242, which raise some interesting and quite profound questions on what we are expecting from the market of internet services once the Online Safety Bill is in place.

It is worth taking a moment to remind ourselves of what we do and do not want from the Bill. We want services that are causing harm and are unwilling to take reasonable steps to address it to leave the UK market. That is clear. As a result of this legislation, it is likely that some services will leave the UK market, because we have asked them to do reasonable things and they have said no; they are not willing to comply with the law and therefore they need to be out. There is a whole series of measures in the Bill that will lead to that.

Equally, we want services that are willing to take reasonable steps to stay in the UK market, do the risk assessments, work at improvements and have the risks under control. They may not all be resolved on day one—otherwise, we would not need the legislation—but they should be on a path to address the risks that have been identified. We want those people to be in the market, for two reasons.

The first is that we want choice for people; we do not take pleasure in shutting people who are providing services out of the market. Also, from a child safety point of view, there is a genuine concern that, if you limit choice too far, you will end up creating more of a demand for completely unregulated services that sit outside the UK and will fill the gap. There is a balance in making sure that there is a range of attractive services, so that teenagers in particular feel that their needs are being met. We want those services to be regulated and committed to improvement.

Something that is in between will be a hard decision for Ofcom—something that is not great today, but not so bad that we want it out tomorrow. Ofcom will have to exercise considerable judgment in how it deals with those services. This is my interpretation of where proportionality and capacity come in. If you are running a very large internet service, something such as PhotoDNA, which is the technology that allows you to scan photos and detect child abuse images, is relatively straightforward to implement. All the major providers do it, but there are costs to that for smaller services. There are some real capacity challenges around implementing those kinds of technology. It is getting better over time and we would like them to do it, but you would expect Ofcom to engage in a conversation as a smaller service—smaller not in terms of its users but in its engineers and capacity—may need a little longer to implement such a technology.

A larger service could do proactive investigations. If it has a large team, once it has identified that something is problematic, it can investigate proactively. Again, a smaller service may not have the bodies on the ground to do that, but you would hope it would develop that capacity. It is important to recognise something about capacity if we are to encourage those that are halfway between to come to the light side rather than slip off to the dark side.

I am interested in the Minister’s interpretation of these words and the instruction to Ofcom. We will be dependent on Ofcom, which will sit on the other side of a real or virtual table with the people who run these companies, as Ofcom can insist that they come in and talk to it. It will have to make these judgments, but we do not want it to be conned or to be a walkover for an organisation that has the capacity or could do things that are helpful, but is simply refusing to do them or somehow trying to pull the wool over Ofcom’s eyes.

Equally, we do not want Ofcom to demand the impossible of a service that genuinely struggles to meet a demand and that has things broadly under control. That is the balance and the difficult judgment. I think we are probably aiming for the same thing, and I hope the Minister is able to clarify these instructions and the way the Government expect Ofcom to interpret them. We are looking for that point at which Ofcom is seriously demanding but does not get overbearing and unnecessarily drive out of the market people who are making best efforts to do their risk assessments and then work hard to resolve those risks.

Baroness Harding of Winscombe (Con)

My Lords, as somebody who is only five feet and two inches, I have felt that size does not matter for pretty much all my life and have long wanted to say that in a speech. This group of amendments is really about how size does not matter; risk does. I will briefly build on the speech just given by the noble Lord, Lord Allan, very eloquently as usual, to describe why risk matters more than size.

First, there are laws for which size does matter—small companies do not need to comply with certain systems and processes—but not those concerned with safety. I have in my mind’s eye the small village fête, where we expect a risk assessment if we are to let children ride on rides. That was not the case 100 years ago, but is today because we recognise those dangers. One of the reasons why we stumbled into thinking that size should matter in this Bill is that we are not being honest about the scale of the risk for our children. If the risk is large enough, we should not be worrying about size; we should be worrying about that risk. That is the first reason why we have to focus on risk and not size.

The second reason follows from what I have just said—the principles of the physical world should apply to the online world. That is one of the core tenets of this Bill. That means that if you recognise the real and present risks of the digital world, you have to say that it does not matter if only a small number of people are affected. If it is a small business, it still has an obligation not to put people in harm’s way.

Thirdly, small becomes big very quickly—unfortunately, that has not been true for me, but it is true in the digital world as Threads has just shown us. Fourthly, we also know that in the digital world re-engineering something once it has got very big is really difficult. There is also a practical reason why you want engineers to think about the risks before they launch services rather than after the event.

We keep being told, rightly, that this is a Bill about systems and processes. It is a Bill where we want not just the outcomes that the noble Lord, Lord Allan, has referred to in terms of services in the UK genuinely being safer; we are trying to effect a culture change. I would argue one of the most important culture changes is that any bright, young tech entrepreneur has to start by thinking about the risks and therefore the safety procedures they need to put in place as they build their tech business from the ground up and not once they have reached some artificial size threshold.

Baroness Kidron (CB)

My Lords, I have to admit that it was incompetence rather than lack of will that meant I did not add my name to Amendment 39 in the name of the noble Lord, Lord Bethell, and I would very much like the Government to accept his argument.

In the meantime, I wonder whether the Minister would be prepared to make it utterly clear that proportionality does not mean a little bit of porn to a large group of children or a lot of porn to a small group of children; rather, it means that high-risk situations require effective measures, while measures in low-risk situations should be proportionate to that lower risk. On that theme, I say to the noble Lord, Lord Allan, whose points I broadly agree with, that while we would all wish to see companies brought into the fold rather than being out of the fold, it rather depends on their risk.

This brings me neatly to Amendments 43 and 87 from the noble Lord, Lord Russell, to which I managed to add my name. They make a very similar point to Amendment 39 but across safety duties. Amendment 242 in my name, to which the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford have added their names, makes the same point—yet again—in relation to Ofcom’s powers.

All these things are pointing in the same direction as Amendment 245 in the name of the noble Baroness, Lady Morgan, which I keep on trumpeting from these Benches and which offers an elegant solution. I urge the Minister to consider Amendment 245 before day four of Report, because if the Government were to accept it, it would focus company resources, focus Ofcom resources and, as we discussed on the first day of Report, allow companies which do not fit the risk profile of the regime (those unable to comply with duties that do not fit their model yet left vulnerable to enforcement) to be treated in an appropriate way.

Collectively, the ambition is to make sure that we are treating things in proportion to the risk and that proportionate does not start meaning something else.

Lord Clement-Jones (LD)

My Lords, I agree with the noble Baroness, Lady Kidron, that all these amendments are very much heading in the same direction, and from these Benches I am extremely sympathetic to all of them. It may well be that this is very strongly linked to the categorisation debate, as the noble Baroness, Lady Kidron, said.

The amendment from the noble Lord, Lord Bethell, matters even more when we are talking about pornography in the sense that child safety duties are based on risks. I cannot for the life of me see why we should try to contradict that by adding in capacity and size and so on.

My noble friend made a characteristically thoughtful speech about the need for Ofcom to regulate in the right way and make decisions about risk, the capacity challenges of new entrants and so on. I was very taken by what the noble Baroness, Lady Harding, had to say. This is akin to health and safety and, quite frankly, it is a cultural issue for developers. What, after all, is safety by design if it is not advance risk assessment of the kinds of algorithm you are developing for your platform? It is a really important factor.

18:45
I hope that we adopt that in AI regulation more broadly than simply online safety of this kind for social media platforms. We need a change of culture so that this is not just a question of developing without thinking about the ethical aspects of it. It is really important that we start with this kind of debate talking about assessing risk upfront. That should be the key test and not the size or capacity of a particular platform.
I support these amendments. I hope the Minister can give us some indication that we are all heading in the same direction as he is or that he is heading in the same direction as us. That would be enormously helpful.
Lord Knight of Weymouth (Lab)

My Lords, as we have heard, this is a small group of amendments concerned with preventing size and lack of capacity being used as a reasonable excuse for allowing children to be unsafe. Part of the problem is the complexity of the Bill and the way it has been put together.

For example, Clause 11, around user-to-user services, is the pertinent clause and it is headed “Safety duties protecting children”. Clause 11(2) is preceded in italics with the wording “All services” so anyone reading it would think that what follows applies to all user-to-user services regardless of size. Clause 11(3) imposes a duty on providers

“to operate a service using proportionate systems and processes”

to protect children from harm. That implies that there will be judgment around what different providers can be expected to do to protect children; for example, by not having to use a particular unaffordable technical solution on age assurance if they can show the right outcome by doing things differently. That starts to fudge things a little.

The noble Lord, Lord Bethell, who introduced this debate so well with Amendment 39, supported by my noble friend Lady Ritchie, wants to be really sure that the size of the provider can never be used to argue that preventing all children from accessing porn is disproportionate and that a few children slipping through the net might just be okay.

The clarity of Clause 11 unravels even further at the end of the clause, where in subsection (12)(b) it reads that

“the size and capacity of the provider of a service”

is relevant

“in determining what is proportionate”.

At that point the clause falls apart quite thoroughly: no one reading it can be clear about what is supposed to happen.

Amendment 43 seeks to take that paragraph out, as we have heard from the noble Lord, Lord Russell, and would do the same for search in Amendment 87. I have added my name to these amendments because I fear that the ambiguity in the wording of this clause will give small and niche platforms an easy get out from ensuring that children are safe by design.

I use the phrase “by design” deliberately. We need to make a choice with this Bill even at this late stage. Is the starting point in the Bill children’s safety by design? Or is the starting point one where we do not want to overly disrupt the way providers operate their business first—which is to an extent how the speech from the noble Lord, Lord Allan, may have been heard—and then overlay children’s safety on top of that?

Yesterday, I was reading about how children access inappropriate and pornographic content, not just on Twitter, Instagram, Snapchat, TikTok and Pinterest but on Spotify and “Grand Theft Auto”—the latter being a game with an age advisory of “over 17” but which is routinely played by teenage children. Wherever we tolerate children being online, there are dangers which must be tackled. Listening to the noble Baroness, Lady Harding, took me to where a big chunk of my day job in education goes—children’s safeguarding. I regularly have to take training in safeguarding because of the governance responsibilities that I have. Individual childminders looking after one or two children have an assessment and an inspection around their safeguarding. In the real world, we do not tolerate a lack of safety for children in this context. We should not tolerate it in the online world either.

The speech from the noble Lord, Lord Russell, reminded me of the breadcrumbing from big platforms into niche platforms that is part of that incel insight that he referenced. Content that is harmful to children can also be what some children are looking for, which keeps them engaged. Small, emergent services aggressively seeking growth could set algorithms accordingly. They must not be allowed to believe that engaging, harmful content is okay until they reach the size at which they can afford the age-assurance technology that we might envisage in the Bill. I hope that the Minister shares our concerns and can help us with this problem.

Lord Parkinson of Whitley Bay (Con)

My Lords, short debates can be helpful and useful. I am grateful to noble Lords who have spoken on this group.

I will start with Amendment 39, tabled by my noble friend Lord Bethell. Under the new duty at Clause 11(3)(a), providers which allow pornography or other forms of primary priority content under their terms of service will need to use highly effective age verification or age estimation to prevent children encountering it where they identify such content on their service, regardless of their size or capacity. While the size and capacity of providers is included as part of a consideration of proportionality, this does not mean that smaller providers or those with less capacity can evade the strengthened new duty to protect children from online pornography. In response to the questions raised by the noble Baronesses, Lady Ritchie of Downpatrick and Lady Kidron, and others, no matter how much pornographic content is on a service, where providers do not prohibit this content they would still need to meet the strengthened duty to use age verification or age estimation.

Proportionality remains relevant for the purposes of providers in scope of the new duty at Clause 11(3)(a) only in terms of the age-verification or age-estimation measures that they choose to use. A smaller provider with less capacity may choose to go for a less costly but still highly effective measure. For instance, a smaller provider with less capacity might seek a third-party solution, whereas a larger provider with greater capacity might develop their own solution. Any measures that providers use will need to meet the new high bar of being “highly effective”. If a provider does not comply with the new duties and fails to use measures which are highly effective at correctly determining whether or not a particular user is a child, Ofcom can take tough enforcement action.

The other amendments in this group seek to remove references to the size and capacity of providers in provisions relating to proportionality. The principle of proportionate, risk-based regulation is fundamental to the Bill’s regulatory framework, and we consider that the Bill as drafted already strikes the correct balance. The Bill ultimately will regulate a large number of services, ranging from some of the biggest companies in the world to smaller, voluntary organisations, as we discussed in our earlier debate on exemptions for public interest services.

The provisions regarding size and capacity recognise that what it is proportionate to require of companies of various sizes and business models will be different. Removing this provision would risk setting a lowest common denominator standard which does not create incentives for larger technology companies to do more to protect their users than smaller organisations. For example, it would not be proportionate for a large multinational company which employs thousands of content moderators and which invests in significant safety technologies to argue that it is required to take only the same steps to comply as a smaller provider which might have only a handful of employees and a few thousand UK users.

While the size and capacity of providers is included as part of a consideration of proportionality, let me be clear that this does not mean that smaller providers or those with less capacity do not need to meet the child safety duties and other duties in the Bill, such as the illegal content safety duties. These duties set out clear requirements for providers. If providers do not meet these duties, they will face enforcement action.

I hope that is reassuring to my noble friend Lord Bethell and to the other noble Lords with amendments in this group. I urge my noble friend to withdraw his amendment.

Lord Bethell (Con)

My Lords, I thank my noble friend the Minister for that reassurance. He put the points extremely well. I very much welcome his words from the Dispatch Box, which go a long way towards clarifying and reassuring.

This was a short and perfectly formed debate. I will not go on a tour d’horizon of everyone who has spoken but I will mention the noble Lord, Lord Allan of Hallam. He is entirely right that no one wants gratuitously to hound out businesses from the UK that contribute to the economy and to our life here. There are good regulatory principles that should be applied by all regulators. The five regulatory principles of accountability, transparency, targeting, consistency and proportionality are all in the Legislative and Regulatory Reform Act 2006. Ofcom will embrace them and abide by them. That kind of reassurance is important to businesses as they approach the new regulatory regime.

I take on board what my noble friend the Minister said in terms of the application of regulations regardless of size or capacity, and the application of these strengthened duties, such as “highly effective”, regardless of any economic or financial capacity. I feel enormously reassured by what he has said. I beg leave to withdraw my amendment.

Amendment 39 (to Amendment 38) withdrawn.
Amendment 38 agreed.
Amendment 40 had been withdrawn from the Marshalled List.
Amendments 41 and 42
Moved by
41: Clause 11, page 11, line 1, leave out from beginning to “may” in line 2 and insert “Age verification or age estimation to identify who is or is not a child user or which age group a child user is in are examples of measures which (if not required by subsection (3A))”
Member’s explanatory statement
This amendment refers to age verification and age estimation as mentioned in the preceding amendment in my name, and clarifies the relationship between Clause 11(4) and new subsection (3A) of Clause 11 inserted by that amendment.
42: Clause 11, page 12, line 6, leave out “this section” and insert “section 11”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 11 into two Clauses.
Amendments 41 and 42 agreed.
Amendment 43 not moved.
Amendments 44 and 45
Moved by
44: Clause 11, page 12, line 12, leave out “this section” and insert “section 11”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 11 into two Clauses.
45: Clause 11, page 12, line 16, leave out “subsections (3)(b)” and insert “section 11(3)(b)”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 11 into two Clauses.
Amendments 44 and 45 agreed.
Amendment 46 not moved.
Amendments 47 to 52
Moved by
47: Clause 11, page 12, line 21, leave out “subsections (3)” and insert “section 11(3)”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 11 into two Clauses.
48: Clause 11, page 12, line 24, leave out “this section” and insert “section 11”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 11 into two Clauses.
49: Clause 11, page 12, line 27, leave out from “if” to “the” in line 29 and insert “age verification or age estimation is used on the service with”
Member’s explanatory statement
This amendment provides that a provider can only conclude that children cannot access a service if age verification or age estimation is used on the service with the result that children are not normally able to access it.
50: Clause 11, page 12, line 31, after “In” insert “section 11 and”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 11 into two Clauses.
51: Clause 11, page 12, line 33, leave out “this section” and insert “section 11”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 11 into two Clauses.
52: Clause 11, divide Clause 11 into two clauses, the first (Safety duties protecting children) to consist of subsections (1) to (11) and the second (Safety duties protecting children: interpretation) to consist of subsections (12) to (19)
Member’s explanatory statement
This amendment splits up Clause 11 into two Clauses.
Amendments 47 to 52 agreed.
Amendment 53
Moved by
53: After Clause 11, insert the following new Clause—
“Assessment duties: user empowerment
(1) This section sets out the duties about assessments related to adult user empowerment which apply in relation to Category 1 services (in addition to the duties about risk assessments set out in section 8 and, in the case of Category 1 services likely to be accessed by children, section 10).
(2) A duty to carry out a suitable and sufficient assessment for the purposes of section 12(2) at a time set out in, or as provided by, Schedule 3.
(3) A duty to take appropriate steps to keep such an assessment up to date.
(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient assessment for the purposes of section 12(2) relating to the impacts of that proposed change.
(5) An assessment of a service “for the purposes of section 12(2)” means an assessment of the following matters—
(a) the user base;
(b) the incidence of relevant content on the service;
(c) the likelihood of adult users of the service encountering, by means of the service, each kind of relevant content (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;
(d) the likelihood of adult users with a certain characteristic or who are members of a certain group encountering relevant content which particularly affects them;
(e) the likelihood of functionalities of the service facilitating the presence or dissemination of relevant content, identifying and assessing those functionalities more likely to do so;
(f) the different ways in which the service is used, and the impact of such use on the likelihood of adult users encountering relevant content;
(g) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to strengthen adult users’ control over their interaction with user-generated content, and other systems and processes) may reduce or increase the likelihood of adult users encountering relevant content.
(6) In this section “relevant content” means content to which section 12(2) applies (content to which user empowerment duties set out in that provision apply).
(7) See also—
(a) section 19(8A) and (9) (records of assessments), and
(b) Schedule 3 (timing of providers’ assessments).”
Member’s explanatory statement
This amendment requires providers of Category 1 services to carry out and update as necessary an assessment about how likely it is that adult users will encounter content to which Clause 12(2) applies (suicide and self-harm content and so on - see Clause 12(10), (11) and (12)).
Amendment 54 (to Amendment 53) not moved.
Amendment 53 agreed.
Clause 12: User empowerment duties
Amendments 55 and 56 not moved.
Amendment 57
Moved by
57: Clause 12, page 13, line 9, after “(2)” insert “(“control features”)”
Member’s explanatory statement
This amendment is a technical drafting change related to the next amendment in my name.
Amendment 57 agreed.
Amendments 58 and 59 not moved.
Amendments 60 to 62
Moved by
60: Clause 12, page 13, line 10, at end insert—
“(4A) A duty to operate a service using a system or process which seeks to ensure that all registered adult users are offered the earliest possible opportunity, in relation to each control feature included in the service, to take a step indicating to the provider that—
(a) the user wishes to retain the default setting for the feature (whether that is that the feature is in use or applied, or is not in use or applied), or
(b) the user wishes to change the default setting for the feature.
(4B) The duty set out in subsection (4A)—
(a) continues to apply in relation to a user and a control feature for so long as the user has not yet taken a step mentioned in that subsection in relation to the feature;
(b) no longer applies in relation to a user once the user has taken such a step in relation to every control feature included in the service.”
Member’s explanatory statement
This amendment imposes a new duty on providers of Category 1 services to proactively ask all registered adult users whether they wish to opt in or opt out of any features offered in compliance with the duty in subsection (2), until a choice is made.
61: Clause 12, page 13, line 12, leave out from “which” to “and” in line 13 and insert “control features are offered”
Member’s explanatory statement
This amendment is a technical drafting change related to the preceding amendment in my name.
62: Clause 12, page 13, line 13, at end insert—
“(5A) A duty to summarise in the terms of service the findings of the most recent assessment of a service under section (Assessment duties: user empowerment) (assessments related to the duty set out in subsection (2)).”
Member’s explanatory statement
This amendment requires providers of Category 1 services to summarise in their terms of service the findings of their latest assessment under the new clause proposed after Clause 11 in my name.
Amendments 60 to 62 agreed.
Amendments 63 and 64 not moved.
Amendments 65 to 73
Moved by
65: Clause 12, page 13, line 24, leave out “subsection (2)” and insert “section 12(2)”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 12 into two Clauses.
66: Clause 12, page 13, line 26, leave out paragraph (a) and insert—
“(a) all the findings of the most recent assessment under section (Assessment duties: user empowerment), and”
Member’s explanatory statement
This amendment makes it clear that the findings of the latest assessment under the new Clause proposed after Clause 11 in my name are a relevant factor for the purposes of determining what it is proportionate for a provider to do to comply with the duty under Clause 12(2).
67: Clause 12, page 13, line 29, leave out “Subsection (2)” and insert “Section 12(2)”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 12 into two Clauses.
68: Clause 12, page 14, line 3, at end insert—
“(12A) The duty set out in section 12(4A) applies in relation to all registered adult users, not just those who begin to use a service after that duty begins to apply.”
Member’s explanatory statement
This amendment makes it clear that the new duty on providers to offer registered users a choice about whether to use the user empowerment tools applies to existing as well as new users.
69: Clause 12, page 14, line 4, after “In” insert “section 12 and”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 12 into two Clauses.
70: Clause 12, page 14, line 12, after “In” insert “section 12 and”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 12 into two Clauses.
71: Clause 12, page 14, line 16, after first “of” insert “section 12 and”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 12 into two Clauses.
72: Clause 12, page 14, line 21, at end insert—
“(16) See also, in relation to duties set out in section 12, section 18 (duties about freedom of expression and privacy).”
Member’s explanatory statement
This amendment inserts a signpost to Clause 18, to which the duties in Clause 12 are relevant.
73: Clause 12, divide Clause 12 into two clauses, the first (User empowerment duties) to consist of subsections (1) to (7) and the second (User empowerment duties: interpretation) to consist of subsections (8) to (16)
Member’s explanatory statement
This amendment splits up Clause 12 into two Clauses.
Amendments 65 to 73 agreed.
Clause 16: Duty about content reporting
Amendment 74
Moved by
74: Clause 16, page 19, line 26, leave out from “if” to “the” in line 28 and insert “age verification or age estimation is used on the service with”
Member’s explanatory statement
This amendment provides that a provider can only conclude that children cannot access a service if age verification or age estimation is used on the service with the result that children are not normally able to access it.
Amendment 74 agreed.
Clause 17: Duties about complaints procedures
Amendments 75 and 76
Moved by
75: Clause 17, page 21, line 2, leave out “11(3)” and insert “11(2) or (3)”
Member’s explanatory statement
This amendment is about complaints of content being blocked because of an incorrect assessment of a user’s age. A reference to Clause 11(2) is inserted, as the duty in that provision can also be complied with by using age verification or age estimation.
76: Clause 17, page 21, line 16, leave out from “if” to “the” in line 18 and insert “age verification or age estimation is used on the service with”
Member’s explanatory statement
This amendment provides that a provider can only conclude that children cannot access a service if age verification or age estimation is used on the service with the result that children are not normally able to access it.
Amendments 75 and 76 agreed.
19:00
Clause 18: Duties about freedom of expression and privacy
Amendment 77
Moved by
77: Clause 18, page 21, line 30, after “implementing,” insert “terms of service,”
Member’s explanatory statement
This amendment, and others in the name of Baroness Fox of Buckley, ensure free speech is not just considered at an abstract policy level but is included in providers’ terms of service.
Baroness Fox of Buckley (Non-Afl)

My Lords, I am rather disappointed that, while this is a large group on freedom of expression, it is dominated by amendments by myself and the noble Lord, Lord Moylan. I welcome the noble Baroness, Lady Fraser of Craigmaddie, and the noble Lord, Lord Stevenson of Balmacara, dipping their toes in the free-expression water here and I am glad that the Minister has added his name to their amendment, although it is a shame that he did not add his name to one of mine.

Earlier today we heard a lot of congratulations to the Government for listening. I have to say, it depends who you are, because the Government have not listened to all of us. It is notable that, of the hundreds of new government concessions that have taken the form of amendments on Report, none relates to free speech. Before I go through my amendments, I want to note that, when the noble Lord, Lord Moylan, and I raise concerns about free speech, it can be that we get treated as being slightly eccentric. There has been a generally supportive and generous mood from the regulars in this House. I understand that, but I worry that free speech is being seen as peripheral.

This country, our country, that we legislate for and in, has a long history of boasting that it is the home of liberty and adopts the liberal approach that being free is the default position: that free speech and the plurality and diversity of views it engenders are the cornerstone of democracy in a free society and that any deviation from that approach must require extraordinary and special justification. A comprehensive piece of law, such as the one we are dealing with, that challenges many of those norms, deserves thorough scrutiny through the prism of free speech.

When I approached this Bill, which I had been following long before I arrived in this House, I assumed that there would be packed Benches—as there are on the Illegal Migration Bill—and that everybody, including all these Law Lords, would be here quoting the European Court of Human Rights on Article 8 and Article 10. I assumed there would be complaints about Executive power grabs and so on. But it has been a bit sparse.

That is okay; I can live with that, even if it is a bit dispiriting. But I am concerned when the Government cite the mood of the Committee as being reflected in their amendments, because it has not been a very large Committee. Many of the amendments that I, the noble Lord, Lord Moylan, and others tabled about free expression represent the concerns of a wide range of policy analysts, civil rights groups, academics, lawyers, free speech campaigners and industry representatives. They have been put forward in good faith—as I continue to put them—to suggest ways of mitigating some of the grave threats to free speech in this Bill, with constructive ideas about how to tackle flaws, and raising some of the problems of unintended consequences. I have, at times, felt that those concerns were batted away with a certain indifference. The Minister is very affable and charming, but none the less it can be a bit disappointing.

Anyway, I am here to bat again. I hope that the Government now will listen very closely and consider how to avoid the UK ending up with the most restrictive internet speech laws of any western democracy at the end of this. I have a lot of different amendments in my name in this group. I wholeheartedly support the amendments in the name of the noble Lord, Lord Moylan, requiring Ofcom to assess the impact of its codes on free speech, but I will not speak to them.

I will talk about my amendments, starting with Amendments 77, 78, 79, 80 and 81. These require platforms to have particular regard to freedom of expression, not just when implementing safety measures and policies but when writing their terms of service. This is to ensure that freedom of expression is not reduced to an abstract, secondary “have regard to” notion but is visible in the drafting of terms of service. This would mean that users know their rights in clear and concrete terms. For example, a platform should be expected to justify how a particular term of service, on something such as religious hatred, will be balanced with consideration of freedom of expression and conscience, in order to allow discussions over different beliefs to take place. Users need to be able to point to specific provisions in the terms of service setting out their free speech protections.

This is all about parity between free speech and safety. Although the Government—and I welcome this—have attempted some balance, via Clause 18, to mitigate the damage the Bill does to individual rights of free expression, that clause is a rather weak, poor cousin. If companies are compelled to prevent and minimise so-called harmful content via operational safety duties, these amendments say that there should be parity for free expression: companies should be compelled to do the same for freedom of expression, with a clear and positive duty, rather than relying on Clause 64, which is framed rather negatively.

Amendment 188 takes on the issue of terms of service from a different direction, attempting to ensure that duties with regard to safety must not be allowed to restrict lawful expression or that protected by Article 10 of the European Convention on Human Rights. That states that interference in free speech rights is not lawful unless it is a last resort. I note, in case anyone is reading the amendment carefully, and for Hansard, that the amendment cites Article 8—a rather Freudian slip on my part that was not corrected by the Table Office. That is probably because privacy rights are also threatened by the Bill, but I meant Article 10 of course.

Amendment 188 addresses a genuine dilemma in terms of Ofcom enforcing safety duties via terms and conditions. These will transform private agreements between companies and users into statutory duties under Clause 65. This could mean that big tech companies would be exercising public law functions by state-backed enforcement of the suppression of lawful speech. One worry is that platforms’ terms of service are not neutral; they can change due to external political or commercial pressures. We have all been following with great interest what is happening at Twitter. They are driven by values which can be at odds with UK laws. So I hope the Minister will answer the query that this amendment poses: how is the UK able to uphold its Article 10 obligations if state regulators are legally instructed to enforce terms of service attitudes to free speech, even when they censor far more than UK domestic law requires?

Amendment 162 has a different focus and removes offences under Section 5 of the Public Order Act from the priority offences to be regulated as priority illegal content, as set out in Schedule 7. This amendment is prompted by a concern that the legislation enlists social media companies to act as a private online police force and to adjudicate on the legality of online content. This is especially fraught in terms of the legal limits on speech, where illegality is often contested and contentious—offline as well as online.

The inclusion of Section 5 would place a duty on service providers to take measures to prevent individuals ever encountering content that includes

“threatening or abusive words or behaviour, or disorderly behaviour”

that is likely to cause “harassment, alarm or distress”. It would also require service providers to minimise the length of time such content is present on the service.

I am not sure whether noble Lords have been following the dispute that broke out over the weekend. There is a film on social media doing the rounds of a trans speaker, Sarah Jane Baker, at the Trans Pride event screaming pretty hysterically “If you see a TERF, punch them in the effing face”—and I am being polite. You would think that that misogynistic threat would be the crime people might be concerned about, yet some apologists for Trans Pride claim that those women—TERFs such as myself—who are outraged and have taken the speech to mean exactly what it says are the ones who are stirring up hate.

Now, that is a bit of a mess, but asking service providers, or indeed algorithms, to untangle such disputes can surely lead only to the over-removal of online expression, or even more of a muddle. As the rule of law charity Justice points out, this could also catch content that depicts conflict or atrocities, such as those taking place in the Russia-Ukraine war. Justice asks whether the inclusion of Section 5 of the POA could lead to the removal of posts by individuals sharing stories of their own abuse or mistreatment on internet support forums.

Additionally, under Schedule 7 to the Bill, versions of the Section 5 offence could also be regulated as priority illegal content, meaning that providers would have to remove or restrict content that, for instance, encourages what is called disorderly behaviour that is likely to cause alarm. Various organisations are concerned that this could mean that content portraying protest activity that might be considered disorderly by some was removed unless the poster condemned it, or even that content which encouraged people to attend protests would be in scope.

I am not a fan of Section 5 of the Public Order Act, which criminalises words or behaviour likely to cause harassment, alarm or distress, at the best of times, but at least those offences have been and are subject to the full rigour of the criminal justice system and case law. Of course, the courts, the CPS and the police are also bound, for example by Article 10, to protect free speech. But that is very different from compelling social media companies, their staff or automated algorithms to make such complex assessments of the Section 5 threshold of illegality. Through no fault of their own, those companies are just not qualified to make such determinations, and it is obvious that this could mean that legitimate speech ends up being restricted. Dangerously, it also marks a significant departure from the UK’s rule of law in deciding what is legal or illegal speech. It has the potential to limit UK users’ ability to engage in important aspects of public life, and to prevent victims of abuse from sharing their stories, as I have described.

I turn finally to the last amendment, Amendment 275—I will keep this short, for time’s sake. I will not go into detail, but I hope that the Minister will take a look at it, see that there is a loophole, and discuss it with the department. In skeleton form, the Free Speech Union has discovered that the British Board of Film Classification runs a mobile classification network: an agreement with mobile network providers under which it advises them on what content should be filtered because it is considered suitable for adults only. This arrangement is private, not governed by statute, and as such means that even the weak free speech safeguards in this Bill can be sidestepped. This affects not only under-18s but anyone with factory settings on their phone. It led to a particularly bizarre outcome last year, when readers of the online magazine “The Conservative Woman” reported that the website was inaccessible. This small online magazine was apparently blacklisted by the BBFC because of comments below the line on its articles. The potential for such arbitrary censorship is a real concern, and the magazine cannot even appeal to the BBFC, so I ask the Minister to take this amendment back to the DCMS, which helped set up this mobile classification network, and find out what is going on.

That peculiar tale illustrates my concerns about what happens when free speech is not front and centre, even when you are concerned about safety and harm. I worry that when free speech is casually disregarded, censorship and bans can become the default, and a thoughtless option. That is why I urge the Minister before Third Reading to at least make sure that some of the issues and amendments in this group are responded to positively.

Lord Moylan (Con)

My Lords, my noble friend on the Front Bench said at various points when we were in Committee that the Bill struck an appropriate balance between protecting the rights of children and the rights of those wishing to exercise their freedom of expression. I have always found it very difficult indeed to discern that point of balance in the Bill as originally drafted, but I will say that, if there ever was such a point, it has been swamped by the hundreds of amendments tabled to the Bill by my noble friend since Committee, which push the Bill entirely in the opposite direction.

Among those amendments, I cannot find—it may be my fault, because I am just looking by myself; I have no help to find these things—a single one which seeks to redress the balance back in favour of freedom of expression. My Amendments 123, 128, 130, 141, 148 and 244 seek to do that to some extent, and I am grateful to the noble Baroness, Lady Fox of Buckley, for the support she has expressed for them.

19:15
The Bill includes various provisions requiring providers to have regard to freedom of speech. The group of amendments I am about to speak to addresses the fact that nowhere in the Bill is there any obligation placed on Ofcom to have regard to freedom of speech, unless—it is just possible—it is in that swathe of amendments that has been tabled which I have been swimming in over the weekend looking for a few pearls but finding none. I will take noble Lords through those amendments very briefly, and say what they actually do. I cannot see that there could be any objection to them from any Member of the House, but it seems that the Government are uninterested.
Amendment 123 appears in Chapter 6 of the Bill, which deals with codes of conduct, and imposes a new duty on Ofcom, in exercising the functions listed there, to
“have special regard to the importance of protecting the rights of users of a service and … interested persons to freedom of expression within the law”.
Who can object to that? Amendment 128 requires Ofcom to issue a statement when it is issuing a code of conduct, showing how it has complied with this new duty. Amendment 130 requires the Secretary of State to lay that statement before Parliament alongside the draft code of conduct, when he lays it.
Amendment 141 relates to how Ofcom responds to a direction—we are moving away from codes of conduct now—from the Secretary of State made under Clause 39. It requires the document Ofcom submits to the Secretary of State in response to that direction to specify how it has complied with the new duty. Amendment 148 requires Ofcom to issue a similar statement to accompany minor amendments to a code of conduct. Amendment 244 imposes a similar duty on Ofcom in relation to the issuance of guidance, which is separate from codes of conduct and from responding to a direction from the Secretary of State. Finally, Amendment 269, which is slightly different, relates to Ofcom’s guidance to providers on enforcement activities, because Ofcom is required to give guidance to providers on how it will conduct enforcement activities. Again, it requires Ofcom to have regard to freedom of speech in doing that.
These are not wild and woolly demands. One of the briefs I have received in relation to Report stage reads as follows:
“The proposed duties on providers of Category 1 Services … that seek to protect freedom of expression … should be replaced with a single comprehensive duty to protect the right to freedom of expression. This duty should require Category 1 Service providers to take all reasonable steps to ensure that freedom of expression is not infringed by measures taken to comply with the other duties in the Bill. This should include giving the duty to protect freedom of expression similar status and form as the duties on illegal content”.
That does not come from some strange right-wing think tank; it comes from the Equality and Human Rights Commission.
If I may briefly trespass on the House by quoting a little bit more, the commission goes further:
“The duty to protect the right to freedom of expression should be included in the list of relevant duties for which Ofcom will be required to develop a code of practice”.
I think I referred to it as a code of conduct in my remarks so far; code of practice is the correct term. This is precisely what part of one of my amendments seeks to do, so these recommendations have a very good pedigree. I cannot see, for the life of me, why the Government would want to resist them.
Before I sit down, I will turn briefly to some of the amendments in the name of the noble Baroness, Lady Fox of Buckley, to which I have added my name. Amendment 162 seeks to remove the offence in Section 5 of the Public Order Act from the list of priority illegal content. This provision criminalises
“threatening or abusive words or behaviour, or disorderly behaviour”
likely to lead to “harassment, alarm or distress”. In this House, we spent a considerable amount of time last year, in relation to the then Police, Crime, Sentencing and Courts Bill, seeking new statutory guidance—which has subsequently arrived—to the College of Policing about how this complex and difficult offence should be enforced in the non-virtual world. Here we are, in effect, simply handing it over to private companies, many of them abroad, under the strange and remote supervision of Ofcom, and assuming it is all going to work very well. It is not. The example of transgender disputes, given by the noble Baroness, is a particularly rich illustration of how difficult it is going to be for private companies to enforce it online. There is a strong case for removing it altogether from the Bill.
Amendment 188 relates to Clause 65, which is about providers’ terms of service. It requires Ofcom to enforce terms of service in a way that is compatible with our rights to freedom of expression under the European Convention on Human Rights. Noble Lords may remember that many of these companies come from countries that are not European. They do not live under the legal cosh of the European Convention on Human Rights. Why should we be enforcing terms of service that might be perfectly legal in California, or Russia, or Kazakhstan—we do not know where the next popular phenomena on the web are going to come from—without having the restriction placed on Ofcom that it cannot be done in a way that contravenes, or at least does not uphold, our rights under the European convention?
I turn to Amendment 275 and the peculiar discovery that the British Board of Film Classification is running its own parallel censorship system—they used to be called censors, so I think I can call them that fairly—in an entirely private arrangement that has no supervision from Ofcom at all. The suggestion is that perhaps, if we are going to have one system supervised by Ofcom, everything might be brought within it so that we have a degree of consistency. Again, I find it very hard to understand why the Government would resist an amendment that is so pellucidly commonsensical.
With that, I will sit down. I do not think these issues are going to go away; there is a very strong public interest in this, as there is, increasingly in recent days, in various other amendments that are going to come up later in the Bill. By pushing things through in the way we have—with the amendments the Government have conceded to those who argue for stronger enforcement and more restriction on access to the internet—it may all pass through the Commons and simply become law. I seriously have my doubts, as I have expressed in relation to, for example, Wikipedia and the threat to Welsh Wicipedia, whether some of this is going to survive first contact with reality. The amendments I propose would make it easier for it to do so.
Lord Hope of Craighead Portrait Lord Hope of Craighead (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I speak to Amendments 286 and 294, which are the last two amendments in this group, and I will explain what they are about. They are in the name of the noble Baroness, Lady Fraser of Craigmaddie, who unfortunately cannot be here this evening, to which I and the noble Lord, Lord Stevenson of Balmacara, have added our names, as has the Minister, for which we are very grateful. They serve a simple purpose: they seek to insert a definition of the phrase “freedom of expression” into the list of definitions in Clause 211 and add it to the index of defined expressions in Clause 212.

They follow an amendment which I proposed in Committee. My amendment at that stage was to insert the definition into Clause 18, where the phrase

“freedom of expression within the law”

appears. It was prompted by a point made by the Constitution Committee in its report on the Bill, which said that the House might wish to consider defining that expression in the interests of legal certainty.

The same point arose when the House was considering the then Higher Education (Freedom of Speech) Bill. Following a similar amendment by me, a government amendment on Report, to achieve the same result, was agreed to that Bill. My amendment in Committee on this Bill adopted the same wording as the government amendment to that Bill. In his response to what I said in Committee, the Minister pointed out, quite correctly, that the Higher Education (Freedom of Speech) Act and this Bill serve quite different purposes, but he did say that the Bill team—and he himself—would consider our amendment closely between then and Report.

What has happened since is the amendment we are now proposing, which has undergone some changes since Committee. They are the product of some very helpful discussions with the Bill team. The most important is that the definition placed in Clause 211 extends to the use of the expression “freedom of expression” wherever it appears in the Bill, which is obviously a sensible change. It also now includes the word “receive” as well as the word “impart”, so that it extends to both kinds of communication that are within the scope of the Bill. The words “including in electronic form”, which are in my amendment, have been removed as unnecessary, as the Bill is concerned with communications in electronic form only.

There are also two provisions in the Bill which refer to freedom of expression to which, as the definition now makes clear, this definition is not to apply. They are in Clauses 36(6)(f) and 69(2)(d). This is because the context in which the expression is used there is quite different. They require Ofcom to consult people with expertise in this right when preparing codes of practice. They are not dealing with the duties of providers, which is what the definition aims to do.

As the discussion in Committee showed, and as the noble Baroness, Lady Fox, demonstrated again this evening, we tend to use the phrases “freedom of speech” and “freedom of expression” interchangeably, perhaps without very much thought as to what they really mean and how they relate to other aspects of the idea. That is why legal certainty matters when they appear in legislation. The interests of legal certainty will be met if this definition finds a place in the Bill, and it makes it clear that the reference is to the expression referred to in Article 10(1) of the convention as it has effect for the purposes of the Human Rights Act. That is as generous and comprehensive a definition as one would wish to have for the purposes of the Bill.

I am grateful to the Minister for his support and to the Bill team for their help. When the time comes, either the noble Baroness, Lady Fraser, or I will move the amendment; it comes at the very end of the Bill, so it will be at the last moment of the last day, when we are finishing Report. I look forward to that stage, as I am sure the Minister does himself.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I want to respond to some of the comments made by the noble Baroness, Lady Fox, and the noble Lord, Lord Moylan. I have been looking forward to this debate equally, as it touches on some crucial issues. One of the mistakes I lay at the Government’s door is that the Bill was sold as somehow a balancing Bill. It is not; it is a speech-limiting Bill, as all Bills of this kind are. Its primary purpose is to prevent people in the United Kingdom encountering certain types of content.

If you support the Bill, it is because you believe that those restrictions are necessary and proportionate in the context of Article 10. Others will disagree. We cannot pretend that it is boosting free speech. The United States got it right in its First Amendment. If you want to maximise speech, you prohibit your parliament from regulating speech: “Congress shall make no law that limits speech”. As soon as you start regulating, you tend towards limitations; the question in the UK and European contexts is whether those limitations are justified and justifiable.

19:30
I happen to think that certain limitations are, and there are reasons for that—not least, as we have to remind ourselves, because the Bill does not regulate the entire internet. As we discussed when we talked about exemptions, most direct speech by an individual in the United Kingdom remains unaffected. Email is unaffected; personal websites are unaffected. The Bill regulates search and user-to-user services. If you have concerns, as perhaps the noble Baroness, Lady Fox, does, you may feel that it goes too far, but we should be careful not to equate social media with the entire internet. When thinking about one’s right to speak, all those channels matter, not just the channels we are talking about. There is a case for saying that restrictions are necessary and proportionate with respect to Article 10, in the context of a regime that regulates part—albeit an important part—of the internet.
Another thing to recognise—and this is where I perhaps depart from the noble Baroness, Lady Fox, and the noble Lord, Lord Moylan—is that we are in a sense dealing with privately managed public spaces on the internet. There is a lot of debate around this but, for me, they are functionally equivalent to other privately managed public spaces such as pubs, hotels or sports grounds. In none of those contexts do we expect all legal speech to be permissible. Rather, they all have their own norms and they enforce them. I cannot go into a sports ground and say what I like; I will get thrown out if I carry out certain actions within most of those public spaces. We are talking about privately managed public spaces; anyone can go in but, in entering that space, you have to conform to the norms of that space. As I said, I am not aware of many spaces where all legal speech is permitted.
Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

I understand the point the noble Lord is making but, if he were thrown out, sacked or treated in some other way that was incompatible with his rights to freedom of expression under Article 10 of the European convention, he would have cause for complaint and, possibly, cause for legal redress.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

That point is well made. In support of that, if the public space treated me in a discriminatory way, I would expect to have redress, but I do not think I have a right in every public space to say everything I like in the classic Article 10 sense. My right vis-à-vis the state is much broader than my right vis-à-vis any public space that I am operating in, where norms apply as well as my basic legal rights. Again, to take the pub example, if I went in and made a racist speech, I may well be thrown out of the pub even though it is sub-criminal and the police are never called; they do not need to be, as the space itself polices it.

I am making the point that terms of service are about managing these privately managed public spaces, and it would be a mistake to equate them entirely with our right to speak or the point at which the state can step in and censor us. I understand the point about state interference but it cuts both ways: both the state interfering in excessively censoring what we can say, and the state potentially interfering in the management of what is, after all, a private space. To refer back to the US First Amendment tradition, a lot of that was about freedom of religion and precisely about enabling heterodoxy. The US did not want an orthodoxy in which one set of rules applied everywhere to everybody. Rather, it wanted people to have the right to dissent, including in ways that were exclusive. You could create your own religious sect and you could not be told not to have those beliefs.

Rolling that power over to the online world, online services, as long as they are non-discriminatory, can have quite different characters. Some will be very restrictive of speech like a restrictive religious sect; some will be very open and catholic, with a small “c”, in the sense of permitting a broad range of speech. I worry about some of the amendments in case there is a suggestion that Ofcom would start to tell a heterodox community of online services that there is an orthodox way to run their terms of service; I would rather allow this to be a more diverse environment.

Having expressed some concerns, I am, though, very sympathetic to Amendment 162 on Section 5 of the Public Order Act. I have tried in our debates to bring some real experience to this. There are two major concerns about the inclusion of the Public Order Act in the Bill. One is a lack of understanding of what that means. If you look at the face of the language that has been quoted at us, and go back to that small service that does not have a bunch of lawyers on tap, it reads as though it is stopping any kind of abusive content. Maybe you will google it, as I did earlier, and get a little thing back from the West Yorkshire Police. I googled: “Is it illegal to swear in the street?”. West Yorkshire Police said, “Yes, it is”. So if you are sitting somewhere googling to find out what this Public Order Act thing means, you might end up thinking, “Crikey, for UK users, I have to stop them swearing”. There is a real risk of misinterpretation.

The second risk is that of people deliberately gaming the system; again, I have a real-life example from working in one of the platforms. I had people from United Kingdom law enforcement asking us to remove content that was about demonstrations by far-right groups. They were groups I fundamentally disagree with, but their demonstrations did not appear to be illegal. The grounds cited were that, if you allow this content to go ahead and the demonstration happens, there will be a Public Order Act offence. Once you get that on official notepaper, you have to be quite robust to say, “No, I disagree”, which we did on occasion.

I think there will be other services that receive Public Order Act letters from people who seem official and they will be tempted to take down content that is entirely legal. The critical thing here is that that content will often be political. In other parts of the Bill, we are saying that we should protect political speech, yet we have a loophole here that risks that.

I am sure the Minister will not concede these amendments, but I hope he will concede that it is important that platforms are given guidance so that they do not think that somebody getting upset about a political demonstration is sufficient grounds to remove the content as a Public Order Act offence. If you are a local police officer it is much better to get rid of that EDL demonstration, so you write to the platform and it makes your life easier, but I do not think that would be great from a speech point of view.

Finally, I turn to the point made by the noble Lord, Lord Moylan, on Amendment 188 about the ECHR Article 10 exemption. As I read it, if your terms of service are not consistent with ECHR Article 10—and I do not think they will be for most platforms—you then get an exemption from all the other duties around appeals and enforcing them correctly. It is probably a probing amendment but it is a curious way of framing it; it essentially says that, if you are more restrictive, you get more freedom in terms of the Ofcom relationship. I am just curious about the detail of that amendment.

It is important that we have this debate and understand this relationship between the state, platforms and terms of service. I for one am persuaded that the general framework of the Bill makes sense; there are necessary and proportionate restrictions. I am strongly of the view that platforms should be allowed to be heterodox in their terms of service. Ofcom’s job is very much to make sure that they are done correctly but not to interfere with the content of those terms of service beyond that which is illegal. I am persuaded that we need to be extraordinarily careful about including Public Order Act offences; that particular amendment needs a good hearing.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I have said several times when we have been debating this Bill—and I will probably say it again when we get to the group about powers—that, for me, the point of the Online Safety Bill is to address the absence of accountability for the extraordinary power that the platforms and search engines have over what we see online and, indeed, how we live and engage with each other online. Through this Bill, much greater responsibility for child safety will be placed on the platforms. That is a good thing; I have been very supportive of the measures to ensure that there are strong protections for children online.

The platforms will also have responsibility, though, for some measures to help adults protect themselves. We must not forget that, the more responsibility that platforms have to protect, the more power we could inadvertently give them to influence what is an acceptable opinion to hold, or to shape society to such an extent that they can even start to influence what we believe to be right or wrong—we are talking about a significant amount of power.

I was of the camp that was pleased when the Government removed the legal but harmful aspects of the Bill, because for me they represented a serious risk to freedom of expression. As I just described, I felt that they risked too much inadvertent power, as it were, going to the platforms. But, with the Government having done that, we have seen through the passage of the Bill some push-back, which is perfectly legitimate and understandable—I am not criticising anyone—from those who were concerned about that move. In response to that, the Government amended the Bill to provide assurances and clarifications on things like the user-empowerment tools. As I said, I do not have any problem; although I might not necessarily support some of the specific measures that were brought forward, I am okay with that as a matter of principle.

However, as was explained by my noble friend Lord Moylan and the noble Baroness, Lady Fox, there has not been a similar willingness from the Government to reassure those who remain concerned about the platforms’ power over freedom of expression. We have to bear in mind that some people’s concerns in this quarter remained even when the legal but harmful change was made—that is, the removal of legal but harmful was a positive step, but it did not go far enough for some people with concerns about freedom of expression.

I am sympathetic to the feeling behind this group, which was expressed by my noble friend and the noble Baroness, Lady Fox. I am sympathetic to many of the amendments. As the noble Lord, Lord Allan of Hallam, pointed out, Amendment 162 in relation to the Public Order Act specifically seems worthy of further consideration by the Government. But the amendments in the group that caught my attention place a specific duty on Ofcom in regard to freedom of expression when drawing up or amending codes of practice or other guidance—these amendments are in my noble friend Lord Moylan’s name. When I looked at them, I did not think that they undermined anything else that the Government brought forward through the amendments to the Bill, as he said, but I thought that they would go a long way towards reinforcing the importance of freedom of expression as part of this regulatory framework—one that we expect Ofcom to attach serious importance to.

I take on board what the noble Lord, Lord Allan, said about the framework of this legislation being primarily about safeguarding and protection. The purpose of the Bill is not to enhance freedom of expression, but, throughout its passage, that has none the less always been a concern. It is right that the Government seek to balance these two competing fundamental principles. I ask whether more can be done—my noble friend pointed to the recommendations of the Equality and Human Rights Commission and how they reinforce some of what he proposed. I would like to think that my noble friend the Minister could give some greater thought to this.

As was said, it is to the Government’s credit how much they have moved on the Bill during its passage, particularly between Committee and Report. That was quite contrary to the sense that I think a lot of us felt during the early stages of our debates. It would be a shame if, once the Bill leaves the House, it is felt that the balance is not as fine—let me put it like that—as some people feel it needs to be. I just wanted to express some support and ask my noble friend the Minister to give this proper and serious consideration.

19:45
Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I rise briefly to note that, in the exchange between the noble Lords, Lord Allan and Lord Moylan, there was this idea about where you can complain. The independent complaints mechanism would be as advantageous to people who are concerned about freedom of speech as it would be for any other reason. I join and add my voice to other noble Lords who expressed their support for the noble Baroness, Lady Fox, on Amendment 162 about the Public Order Act.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, we are dangerously on the same page this evening. I absolutely agree with the noble Baroness, Lady Kidron, about demonstrating the need for an independent complaints mechanism. The noble Baroness, Lady Stowell, captured quite a lot of the need to keep the freedom of expression aspect under close review, as we go through the Bill. The noble Baroness, Lady Fox, and the noble Lord, Lord Moylan, have raised an important and useful debate, and there are some crucial issues here. My noble friend captured it when he talked about the justifiable limitations and the context in which limitations are made. Some of the points made about the Public Order Act offences are extremely valuable.

I turn to one thing that surprised me. It was interesting that the noble Lord, Lord Moylan, quoted the Equality and Human Rights Commission, which said it had reservations about the protection of freedom of expression in the Bill. As we go through the Bill, it is easy to keep our eyes on the ground and not to look too closely at the overall impact. In its briefing, which is pretty comprehensive, paragraph 2.14 says:

“In a few cases, it may be clear that the content breaches the law. However, in most cases decisions about illegality will be complex and far from clear. Guidance from Ofcom could never sufficiently capture the full range or complexity of these offences to support service providers comprehensively in such judgements, which are quasi-judicial”.


I am rather more optimistic than that, but we need further assurance on how that will operate. Ofcom’s life would probably be easier if we did not have the Public Order Act offences in Schedule 7.

I am interested to hear what the Minister says. I am sure that there are pressures on him, from his own Benches, to look again at these issues to see whether more can be done. The EHRC says:

“Our recommendation is to create a duty to protect freedom of expression to provide an effective counterbalance to the duties”.


The noble Lord, Lord Moylan, cited this. There is a lot of reference to freedom of expression in the Bill, but not in relation to the Ofcom duties. So this could be a late contender to settle the horses, so to speak.

This is a difficult Bill; we all know that so much nuance is involved. We really hope that there is not too much difficulty in interpretation when it is put into practice through the codes. That kind of clarity is what we are trying to achieve, and, if the Minister can help to deliver that, he will deserve a monument.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

It is always nice to be nice to the Minister.

I will reference, briefly, the introduction of the amendments in the name of the noble Baroness, Lady Fraser of Craigmaddie, which I signed. They were introduced extremely competently, as you would expect, by my noble and learned kinsman Lord Hope. It is important to get the right words in the right place in Bills such as this. He is absolutely right to point out the need to be sure that we are talking about the right thing when we say “freedom of expression”—that we do mean that and not “freedom of speech”; we should not get them mixed up—and, also, to have a consistent definition that can be referred to, because so much depends on it. Indeed, this group might have run better and more fluently if we had started with this amendment, which would have then led into the speeches from those who had the other amendments in the group.

The noble Baroness is not present today, for good reasons rather than bad: her daughter is graduating and she wanted to be present at that; it is only right that she should be. She will be back to pick up other aspects of the devolution issues she has been following very closely, and I will support her at that time.

The debate on freedom of expression was extremely interesting. It raised issues that, perhaps, could have featured more fully had this been timetabled differently, as both noble Lords who introduced amendments on this subject said. I will get my retaliation in first: a lot of what has been asked for will have been done. I am sure that the Minister will say that, if you look at the amendment to Clause 1, the requirement there is that freedom of expression is given priority in the overall approach to the Bill, and therefore, to a large extent, the requirement to repeat that at various points in the Bill may not be necessary. But I will leave him to expand on that; I am sure that he will.

Other than that, the tension I referred to in an earlier discussion, in relation to what we are made to believe about the internet and the social media companies, is that we are seeing a true public square, in which expressions and opinions can be exchanged as freely and openly as they would be in a public space in the real world. But, of course, neither of those places really exists, and no one can take the analogy further than has been done already.

The change, which was picked up by the noble Baroness, Lady Stowell, in relation to losing “legal but harmful”, has precipitated an issue which will be left to social media companies to organise and police—I should have put “policing” in quotation marks. As the noble Baroness, Lady Kidron, said, the remedy for much of this will be an appeals mechanism that works both at the company level and for the issues that need rebalancing in relation to complexity or because they are not being dealt with properly. We will not know that for a couple of years, but at least that has been provided for and we can look forward to it. I look forward to the Minister’s response.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I hope that the noble Baroness, Lady Fox, and my noble friend Lord Moylan do feel that they have been listened to. It was striking, in this debate, that they had support from all corners of your Lordships’ House. I know that, at various points in Committee, they may have felt that they were in a minority, but they have been a very useful and welcome one. This debate shows that many of the arguments that they have made throughout the passage of the Bill have resonated with noble Lords from across the House.

Although I have not signed amendments in the names of the noble Baroness and my noble friend Lord Moylan, in many cases it is not because I disagree with them but because I think that what they do is already covered in the Bill. I hope to reassure them of that in what I say now.

Amendments 77 to 81 from the noble Baroness, Lady Fox, would require services to have particular regard to freedom of expression and privacy when deciding on their terms of service. Services will already need to have particular regard to users’ rights when deciding on safety systems to fulfil their duties. These requirements will be reflected in providers’ terms of service, as a result of providers’ duties to set out their safety measures in their terms of service. The framework will also include a range of measures to allow scrutiny of the formulation, clarity and implementation of category 1 providers’ own terms of service.

However, there are some points on which we disagree. For instance, we do not think that it would be appropriate for all providers to have a general duty to have a particular regard to freedom of expression when deciding on their own terms of service about content. We believe that the Bill achieves the right balance. It requires providers to have regard to freedom of expression when carrying out their safety duties, and it enables public scrutiny of terms of service, while recognising providers’ own freedom of expression rights as private entities to set the terms of service that they want. It is of course up to adults to decide which services to use based on the way those services are drawn up and the way the terms of service set out what is permissible in them.

Nothing in the Bill restricts service providers’ ability to set their own terms and conditions for legal content accessed by adults—that is worth stressing. Ofcom will not set platforms’ terms and conditions, nor will it take decisions on whether individual pieces of content should, or should not, be on a platform. Rather, it will ensure that platforms set clear terms and conditions, so that adults know what to expect online, and ensure that platforms have systems and processes in place to enforce those terms and conditions themselves.

Amendment 226 from the noble Baroness, Lady Fox, would require providers to use all relevant information that is reasonably available to them whenever they make judgments about content under their terms of service—that is, where they have included or drafted those terms of service in compliance with duties in the Bill. Her amendment would alter an existing requirement in Clause 173, which already requires providers to take this approach whenever they implement a system or process to comply with their duties and that system makes judgments about certain content. For example, Clause 173 already covers content judgments made via systems and processes that a category 1 provider implements to fulfil its Clause 65 duties to enforce its own terms of service consistently. So we feel that Clause 173 is already broad enough to achieve the objectives that the noble Baroness, Lady Fox, seeks.

My noble friend Lord Moylan’s amendments seek to require Ofcom to have special regard to the importance of protecting freedom of expression when exercising its enforcement duties and when drafting codes or guidance. As we discussed in Committee, Ofcom has existing obligations to protect freedom of expression, and the Bill will include additional measures in this regard. We are also making additional amendments to underline the importance of freedom of expression. I am grateful to the noble and learned Lord, Lord Hope of Craighead, and my noble friend Lady Fraser of Craigmaddie for their work to define “freedom of expression” in the Bill. The Bill’s new overarching statement at Clause 1, as the noble Lord, Lord Stevenson, rightly pointed out, lists “freedom of expression”, signalling that it is a fundamental part of the Bill. That is a helpful addition.

Amendment 188 in the name of the noble Baroness, Lady Fox, seeks to disapply platforms’ Clause 65 duties when platforms’ terms of service restrict lawful expression, or expression otherwise protected by Article 10 of the European Convention on Human Rights. Her amendment would mean that category 1 providers’ Clause 65 duties to enforce clear, accessible terms of service in a consistent manner would not apply to any of their terms of service, where they are making their own decisions restricting legal content. That would greatly undermine the application of these provisions in the Bill.

Article 10 of the European Convention on Human Rights concerns individuals’ and entities’ rights to receive and impart ideas without undue interference by public authorities, not private entities. As such, it is not clear how a service provider deciding not to allow a certain type of content on its platform would engage the Article 10 rights of a user.

Beyond the legal obligations regarding the treatment of certain kinds of user-generated content imposed by this Bill and by other legislation, platforms are free to decide what content they wish, or do not wish, to have on their services. Provisions in the Bill will set out important duties to ensure that providers’ contractual terms on such matters are clear, accessible and consistently enforced.

20:00
Moreover, as we have discussed before, Ofcom is bound by the Human Rights Act 1998. So, when carrying out all its functions under this Bill, including the preparation of guidance and codes, it will need to ensure that freedom of expression is protected. There is already a range of other measures in the Bill which ensure that Ofcom protects freedom of expression; for instance, it has a duty in Clause 143 to set out the steps it has taken, and the processes it operates, to ensure that its online safety functions have been exercised compatibly with Articles 8 and 10 of the European Convention on Human Rights. As such, my noble friend Lord Moylan’s amendments would be largely duplicative, since Ofcom already has an obligation to set out similar information in an annual statement.
The illegal content duties, in relation to the points raised about Section 5 of the Public Order Act in Schedule 7, remain risk-based and proportionate. Platforms must use proportionate systems and processes designed to prevent users encountering illegal content and to minimise the length of time that any priority illegal content is present on the service. We are not requiring platforms to ensure that users never encounter illegal content. Companies could take proportionate measures, such as user reporting, user empowerment and enforcing policies which prohibit threats or abuse, but the Bill also creates strong safeguards to protect freedom of expression. All services will need to have particular regard to freedom of expression when implementing safety duties. I certainly agree with the noble Lord, Lord Allan of Hallam, when he says that good and clear guidance is vital here. That is why we have put in place a requirement through Clause 174 for Ofcom to produce guidance about how to make judgments about illegal content.
Amendment 162 from the noble Baroness, Lady Fox, seeks to remove offences under Section 5 of the Public Order Act 1986 from the priority offences list. Section 5 of the Public Order Act makes it an offence to use
“threatening or abusive words or behaviour, or disorderly behaviour”
or to display any
“visible representation which is threatening or abusive”.
Given that that activity can cause harm, it is right that companies have duties to tackle it and, subject to the guidance that I have just mentioned, we think that the Bill sets that out appropriately.
The noble Baroness’s Amendment 275 would require Ofcom to ensure that content classification frameworks created by the British Board of Film Classification, which act as a reference for providers’ online safety duties, should not undermine the Bill’s safeguards for freedom of expression. If it is the case that a content classification scheme produced by the BBFC is unsuitable to be used as a reference for whether content falls within the scope of providers’ new online safety duties, Ofcom should not recommend it in its codes of practice. Ofcom has specific duties in the Bill to protect freedom of expression when drafting its codes of practice, which will ensure that any measures it recommends are designed in that light. However, I will take the point and case study she raised back to the department to see whether I can find out any further detail about what went on in that instance.
Amendments 286 and 294 would insert a definition of “freedom of expression” into the Bill. As I mentioned, I am grateful to the noble and learned Lord, Lord Hope, and my noble friend Lady Fraser for proposing these amendments, which align the definition of freedom of expression in the Bill with that in the European Convention on Human Rights. We agree with them that it will increase clarity about freedom of expression in the Bill, which is why I have added my name to their amendments and, when we come to the very end of Report—to which I look forward as well—I will be very glad to support them.
Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

My Lords, before my noble friend sits down, perhaps I could seek a point of clarification. I think I heard him say, at the beginning of his response to this short debate, that providers will be required to have terms of service which respect users’ rights. May I ask him a very straightforward question: do those rights include the rights conferred by Article 10 of the European Convention on Human Rights? Put another way, is it possible for a provider operating in the United Kingdom to have terms and conditions that abridge the rights conferred by Article 10? If it is possible, what is the Government’s defence of that? If it is not possible, what is the mechanism by which the Bill achieves that?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

As I set out, I think my noble friend and the noble Baroness, Lady Fox, are not right to point to the European Convention on Human Rights here. That concerns individuals’ and entities’ rights

“to receive and impart ideas without undue interference”

by public authorities, not private entities. We do not see how a service provider deciding not to allow certain types of content on its platform would engage the Article 10 rights of the user, but I would be very happy to discuss this further with my noble friend and the noble Baroness in case we are talking at cross-purposes.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

On that point specifically, having worked inside one of the companies, they fear legal action under all sorts of laws, but not under the European Convention on Human Rights. As the Minister explained, it is for public bodies; if people are going to take a case on Article 10 grounds, they will be taking it against a public body. There are lots of other grounds to go after a private company but not ECHR compliance.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

My Lords, I genuinely appreciate this debate. The noble Lord, Lord Clement-Jones, made what I thought was a very important point: in going through the weeds of the Bill—and some people have been involved in it for many years, looking at the detail—it can be easy to forget the free speech point. It is important that it has been raised, but it also constantly needs to be raised. That is the point: it is, as the noble Lord, Lord Allan of Hallam, admitted, a speech-restricting Bill where we are working out the balance.

I apologise to the noble and learned Lord, Lord Hope of Craighead, for not acknowledging that he has constantly emphasised the distinction between free speech and free expression. He and I will not agree on this; it is not that I do not understand but that we do not have time for that argument now. But he has been diligent in his persistence in trying to at least raise the issues, and that is important.

I was a bit surprised by the Minister’s response because, for the first time ever, since I have been here, there has been some enthusiasm across the House for one of my amendments—it really is unprecedented—Amendment 162 on the public order offences. I thought that the Minister might have noted that, because he has noted it every other time there has been a consensus across the House. I think he ought to look again at Amendment 162.

To indicate the muddle one gets into in terms of public order offences and illegality, the police force in Cheshire, where I am from, has put out a film online today saying that misgendering is a crime. It is the police who have said that. It is not a crime, and the point about these things—the difficulty we are concerned with—is that people are being asked to remove and censor material on grounds of illegality or public order offences when they should not be removing it. That is my concern: censorship.

To conclude, I absolutely agree with the noble Lord, Lord Allan of Hallam, that of course free speech does not mean saying whatever you want wherever you want. That is not free speech, and I am a free speech absolutist. Even subreddits—if people know what they are—police each other’s speech. There are norms that are set in place. That is fine with me—that multitude.

My concern is that a state body such as Ofcom is going to set norms of what is acceptable free speech that are lower than free speech laws by demanding, on pain of breaching the law, with fines and so on, that these private companies impose their own terms of service. That can then set a norm, leading them to be risk-averse and to set levels of acceptable speech that are very dangerous. For example, when you go into work, you cannot just say anything, but there are people such as Maya Forstater, who said something at work, was disciplined, lost her job and has just won more than £100,000, because she was expressing her views and opinions. The Equality Act ran to her aid and she has now won and been shown to be right. You cannot do that if your words have disappeared and been censored.

I could talk about this for a long time, as noble Lords know. I hope that at least, as the Bill progresses, even when it becomes an Act, the Government could just stamp on its head, “Don’t forget free speech”—but before then, as we end this process, they could come back with some concessions to some of the amendments that have been raised here today. That would be more than just words. I beg leave to withdraw the amendment.

Amendment 77 withdrawn.
Amendments 78 to 81 not moved.
Clause 19: Record-keeping and review duties
Amendments 82 and 83
Moved by
82: Clause 19, page 23, line 30, at end insert—
“(8A) A duty to make and keep a written record, in an easily understandable form, of all aspects of every assessment under section (Assessment duties: user empowerment) (assessments related to the adult user empowerment duty set out in section 12(2)), including details about how the assessment was carried out and its findings.”
Member’s explanatory statement
This amendment requires providers of Category 1 services to keep full records of their assessments under the new Clause proposed after Clause 11 in my name.
83: Clause 19, page 23, line 31, leave out “a risk assessment as required by subsection (2)” and insert “an assessment as required by subsection (2) or (8A)”
Member’s explanatory statement
This amendment requires providers of Category 1 services to supply OFCOM with copies of records of their assessments under the new Clause proposed after Clause 11 in my name.
Amendments 82 and 83 agreed.
Consideration on Report adjourned until not before 8.42 pm.