Debates between Baroness Kidron and Lord Allan of Hallam during the 2019 Parliament

Mon 17th Jul 2023
Wed 12th Jul 2023
Mon 10th Jul 2023 - Online Safety Bill, Lords Chamber - Report stage: Part 2
Mon 10th Jul 2023 - Online Safety Bill, Lords Chamber - Report stage: Part 1
Thu 6th Jul 2023 - Online Safety Bill, Lords Chamber - Report stage: Part 1 & Minutes of Proceedings
Thu 22nd Jun 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 2
Thu 22nd Jun 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 1
Tue 23rd May 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 1
Tue 16th May 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 1
Thu 27th Apr 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 2
Thu 27th Apr 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 1
Tue 25th Apr 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 2
Tue 25th Apr 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 1
Wed 19th Apr 2023 - Online Safety Bill, Lords Chamber - Committee stage

Online Safety Bill

Debate between Baroness Kidron and Lord Allan of Hallam
Baroness Kidron (CB)

My Lords, I pay tribute to the noble Baroness, Lady Harding, for her role in bringing this issue forward. I too welcome the government amendments. It is important to underline that adding the potential role of app stores to the Bill is neither an opportunity for other companies to fail to comply and wait for the gatekeepers to do the job nor a one-stop shop in itself. It is worth reminding ourselves that digital journeys rarely start and finish in one place. In spite of the incredible war for our attention, in which products and services attempt to keep us rapt on a single platform, it is quite important for everyone in the ecosystem to play their part.

I have two minor points. First, I was not entirely sure why the government amendment requires the Secretary of State to consult as opposed to Ofcom. Can the Minister reassure me that, whoever undertakes the consultation, it will include children and children’s organisations as well as tech companies? Secondly, like the noble Baroness, Lady Harding, I was a little surprised that the amendment does not define an app store but uses the term “the ordinary meaning of”. That wording seems to leave room for change. If there is a good reason for that—I am sure there is—then it must be stated that app stores cannot suddenly rebrand to something else and that that gatekeeper function will be kept absolutely front and centre.

Notwithstanding those comments, and associating myself with the idea that nothing should wait until 2025-26, I am very grateful to the Government for bringing this forward.

Lord Allan of Hallam (LD)

My Lords, I will make a brief contribution because I was the misery guts when this was proposed first time round. I congratulate the noble Baroness, Lady Harding, not just on working with colleagues to come up with a really good solution but on seeking me out. If I heard someone be as miserable as I was, I might try to avoid them. She did not; she came and asked me, “Why are you miserable? What is the problem here?”, and took steps to address it. Through her work with the Government, their amendments address my main concerns.

My first concern, as we discussed in Committee, was that we would be asking large companies to regulate their competitors, because the app stores are run by large tech companies. She certainly understood that concern. The second was that I felt we had not necessarily yet clearly defined the problem. There are lots of problems. Before you can come up with a solution, you need a real consensus on what problem you are trying to address. The government amendment will very much help in saying, “Let’s get really crunchy about the actual problem that we need app stores to address”.

Finally, I am a glass-half-full kind of guy as well as a misery guts—there is a contradiction there—and so I genuinely think that these large tech businesses will start to change their behaviour and address some of the concerns, such as getting age ratings correct, just by virtue of our having this regulatory framework in place. Even if today the app stores are technically outside, the fact that the sector is inside and that this amendment tells them that they are on notice will, I think and hope, have a hugely positive effect and we will get the benefits much more quickly than the timescale envisaged in the Bill. That feels like a true backstop. I sincerely hope that the people in those companies, who I am sure will be glued to our debate, will be thinking that they need to get their act together much more quickly. It is better for them to do it themselves than wait for someone to do it to them.

Online Safety Bill

Debate between Baroness Kidron and Lord Allan of Hallam
Lord Allan of Hallam (LD)

My Lords, these seem very sensible amendments. I am curious about why they have arrived only at this stage, given this was a known problem and that the Bill has been drafted over a long period. I am genuinely curious as to why this issue has been raised only now.

On the substance of the amendments, it seems entirely sensible that, given that we are now going to have 20,000 to 25,000 regulated entities in scope, some of which will never have encountered child sexual exploitation or abuse material or understood that they have a legal duty in relation to it, it will be helpful for them to have a clear set of regulations that tell them how to treat their material.

Child sexual exploitation or abuse material is toxic in both a moral and a legal sense. It needs to be treated almost literally as toxic material inside a company, and sometimes that is not well understood. People feel that they can forward material to someone else, not understanding that in doing so they will break the law. I have had experiences where well-meaning people acting in a vigilante capacity sent material to me, and at that point you have to report them to the police. There are no ifs or buts. They have committed an offence in doing so. If you work inside a company, your computer has to be quarantined, taken offline and cleaned, just as it would be for any other toxic material, because we framed the law, quite correctly, to say that we do not want to offer people the defence of saying “I was forwarding this material because I’m a good guy”. Forwarding the material is a strict liability offence, so to have regulations that explain, particularly to organisations that have never dealt with this material, exactly how they have to deal with it in order to be legally compliant will be extremely helpful.

One thing I want to flag is that there are going to be some really fundamental cross-border issues that have to be addressed. In many instances of child sexual exploitation or abuse material, the material has been shared between people in different jurisdictions. The provider may not be in a UK jurisdiction, and we have got to avoid any conflicts of laws. I am sure the Government are thinking about this, but in drafting those regulations, what we cannot do, for example, is order a provider to retain data in a way that would be illegal in the jurisdiction from which it originates or in which it has its headquarters. The same would apply vice versa. We would not expect a foreign Government to order a UK company to act in a way that was against UK law in dealing with child sexual exploitation or abuse material. This all has to be worked out. I hope the Government are conscious of that.

I think the public interest is best served if the United Kingdom, the United States and the European Union, in particular, adopt common standards around this. I do not think there is anything between us in terms of how we would want to approach child sexual exploitation or abuse material, so the extent to which we end up having common legal standards will be extraordinarily helpful.

As a general matter, to have regulations that help companies with their compliance is going to be very helpful. I am curious as to how we have got there with the amendment only at this very late stage.

Baroness Kidron (CB)

My Lords, I rise to make a slightly lesser point, but I also welcome these amendments. I want to ask the Minister where the consultation piece of this will lie and to check that all the people who have been in this space for many years will be consulted.

Online Safety Bill

Debate between Baroness Kidron and Lord Allan of Hallam
Baroness Kidron (CB)

My Lords, I thank the Minister and indeed the Secretary of State for bringing forward these amendments in the fulsome manner that they have. I appreciate it, but I know that Bereaved Families for Online Safety also appreciates it. The Government committed to bringing forward these amendments on the last day in Committee, so they have been pre-emptively welcomed and discussed at some length. One need only read through Hansard of 22 June to understand the strength of feeling about the pain that has been caused to families and the urgent need to prevent others experiencing the horror faced by families already dealing with the loss of their child.

I will speak briefly on three matters only. First, I must once again thank bereaved families and colleagues in this House and in the other place for their tireless work in pressing this issue. This is one of those issues that does not allow for celebration. As I walked from the Chamber on 22 June, I asked one of the parents how they felt. They said: “It is too late for me”. It was not said in bitterness but in acknowledgement of their profound hurt and the failure of companies voluntarily to do what is obvious, moral and humane. I ask the Government to see the sense in the other amendments that noble Lords brought forward on Report to make children safer, and to apply the same pragmatic, thoughtful approach to those as they have done on this group of amendments. It makes a huge difference.

Secondly, I need to highlight just one gap; I have written to the Secretary of State and the Minister on this. I find it disappointing that the Government did not find a way to require senior management to attend an inquest to give evidence. Given that the Government have agreed that senior managers should be subject to criminal liability under some circumstances, I do not understand their objections to summoning them to co-operate with legal proceedings. If a company submits information in response to Ofcom and at the coroner’s request the company’s senior management is invited to attend the inquest, it makes sense that someone should be required to appear to answer and follow up those questions. Again, on behalf of the bereaved families and specifically their legal representatives, who are very clear on the importance of this part of the regime, I ask the Government to reconsider this point and ask the Minister to undertake to speak to the department and the MoJ, if necessary, to make sure that, if senior managers are asked to attend court, they are mandated to do so.

Thirdly, I will touch on the additional commitments the Minister made beyond the Bill, the first of which is the upcoming Data Protection and Digital Information Bill. I am glad to report that some of the officials working on the Bill have already reached out, so I am grateful to the Minister that this is in train, but I expect it to include guidance for companies that will, at a minimum, cover data preservation orders and guidance about the privacy of other users in cases where a child has died. I think that privacy for other users is central to this being a good outcome for everybody, and I hope we are able to include that.

I am pleased to hear about the undertaking with the US regarding potential barriers, and I believe—and I would love to hear from the Minister—that the objective is to make a bilateral agreement that would allow data to be shared between the two countries in the case of a child’s death. It is a very specific requirement, not a wide-ranging one. I believe that, if we can do it on a bilateral basis, it would be easier than a broad attempt to change the data storage Act.

I turn finally to training for coroners. I was delighted that the Chief Coroner made a commitment to consider issuing non-legislative guidance and training on social media for coroners and the offer of consultation with experts, including Ofcom, the ICO and bereaved families and their representatives, but this commitment was made subject to funding. I ask the Minister to agree to discuss routes to funding from the levy via Ofcom’s digital literacy duty. I have proposed an amendment to the government amendment that would make that happen, but I would welcome the opportunity to discuss it with the Minister. Coroners must feel confident in their understanding of the digital world, and I am concerned that giving this new route to regulated companies via Ofcom without giving them training on how to use it may create a spectre of failure or further frustration and distress for bereaved families. I know there is not a person in the House who would want that to be the outcome of these welcome government amendments.

Lord Allan of Hallam (LD)

My Lords, I also welcome this group of amendments. I remember a debate led by the noble Baroness, Lady Kidron, some time ago in the Moses Room, where we discussed this, and I said at the time I thought it would get fixed in the Online Safety Bill. I said that in a spirit of hope, not knowing any of the detail, and it is really satisfying to see the detail here today. As she said, it is testimony to the families, many of whom got in touch with me at that time, who have persisted in working to find a solution for other families—as the noble Baroness said, it is too late for them, but it will make a real difference to other families—and it is so impressive that, at a time of extreme grief and justifiable anger, people have been able to channel that into seeking these improvements.

The key in the amendments, which will make that difference, is that there will be a legal order to which the platforms know they have to respond. The mechanism that has been selected—the information notice—is excellent because it will become well known to every one of the 25,000 or so platforms that operate in the United Kingdom. When they get an information notice from Ofcom, that is not something that they will have discretion over; they will need to comply with it. That will make a huge difference.

Online Safety Bill

Debate between Baroness Kidron and Lord Allan of Hallam
Lord Allan of Hallam (LD)

My Lords, I shall follow on directly from some of the comments of the noble Baroness, Lady Kidron, around privacy. I shall not repeat the arguments around children and pornography but touch on something else, which is the impact of these amendments on the vast majority of internet users, the 85%-plus who are 18 or older. Of course, when we introduce age assurance measures, they will affect everyone: we should not kid ourselves that it is only about children, because everyone will have to pass through these gateways.

I shall speak particularly to Amendments 184 and 217 on privacy. I am sure that most adults will support extra safety measures for children, but they also want to be able to access a wide range of online services with the least possible friction and the lowest risk to their own personal data. We can explore how this might work in practice by looking at something that we might all do in this Chamber. Looking round, I believe that we are all at least 18 years old, and we might find ourselves idly passing the time creating an account on a new user-to-user or search service that has been recommended. We should consider this group of amendments by how that might play out. In future, the services will have to check that we are in the United Kingdom—there is a range of ways in which they can do that. Having confirmed that, they will need to understand whether we are 18-plus or a child user so that they can tailor their service appropriately.

I hope we all agree that the services should not be asking us for passports or driving licences, for example, as that would be entirely contrary to the thrust of privacy regulations and would be a huge gateway to fraud and other problems. The most efficient way would be for them to ask us for some sort of digital certificate—a certificate that we have on our devices where we have proven to a trusted third party that we are 18-plus. The certificate does not need to contain any personal data but simply confirms that we are of age. That is very similar to the way in which secure websites work: they send a digital certificate to your browser, you verify that certificate with a trusted third party—a certificate authority—and then you can open an encrypted connection. We are reversing the flow: the service will ask the user for a certificate and then verify that before granting access. A user may in future have a setting on their device where they confirm whether they are happy for their 18-plus certificate to be given to any service or whether they would like to be asked every time; there will be a new set of privacy controls.

Building the infrastructure for this is non-trivial. Many things could go wrong but at least the kind of model I am describing has some hope of achieving widespread adoption. It is very good for the adult users as they can continue to have the frictionless experience as long as they are happy for their device to send a certificate to new services. It is good for the market of internet services if new services can bring users on easily. It is good for privacy by avoiding lots of services each collecting personal data, as most people access a multiplicity of services. Perhaps most importantly in terms of the Bill’s objectives, it is good for children if services can separate out the vast majority of their users who are 18-plus and then focus their real efforts on tailoring the services for the minority of users who will be children. The Bill will introduce a whole set of new obligations.

We should not underestimate the scale of the challenge in practice; it will work only if major internet companies are willing to play the game and get into the market of offering 18-plus certificates. Companies such as Google, Meta, Amazon, Apple and Microsoft—the ones we normally love to hate—will potentially provide the solution, as well as not-for-profits. There will be foundations for those who object to the big internet companies, but it is those big internet companies which will have the reach; they each have millions of users in the United Kingdom. This is not to fly the flag for those companies; it is simply a question of efficiency. I suspect that everyone in the Chamber uses a combination of services from those big providers. We already share with them the personal data necessary for age assurance, and there would be no additional sharing of data. If they were willing to provide a certificate, they could do so at the kind of scale necessary for the 50 million or so adult internet users in the United Kingdom to be able to get one easily and then pass it to services when they choose to access them.

There may be some discomfort with big tech playing this role, but I cannot see the kind of aggressive targets that we are setting in the amendments working unless we take advantage of those existing platforms and use them to make this work. Amendment 230 tells us that we have about 18 months, which is very soon in terms of trying to build something. We should be clear that if we are to deliver this package it will depend on persuading some of those big names in tech to create age certification schemes for UK users.

For this to have widespread adoption and a competitive market, we need it to be free of direct financial costs to individual users and to services choosing to age-verify, as we have asked them to do so. We need to think very carefully about that, as it raises a whole series of competition questions that I am sure Ofcom and the Competition and Markets Authority will have to address, not least because we will be asking companies to provide age certification free of charge that will be used by their existing and future competitors to meet their compliance requirements.

There may be some listening who think that we can rely on small age-assurance start-ups. Some of them have a really important role to play and we should be proud of our homegrown industry, but we should be realistic that they will reach scale only if they work with and through the large service providers. Many of them are already seeking those kinds of relationship.

As a test case, we might think of an application such as Signal, a messaging app that prides itself on being privacy-first. It does not want to collect any additional information from its users, which is perfectly reasonable, given where it is coming from. It will be really interesting to see how comfortable such a service will be with working with certification schemes, under which it can prove that users are over 18 by taking advantage of the data held by other services which collect significant amounts of data and have a very good idea of how old we are.

I have not focused on under-18s but, once this system is in place, application providers will be thinking very carefully about the pros and cons of allowing under-18s on at all. I know that the noble Baroness, Lady Kidron, is also concerned about this. There will be services that will think very carefully, if they find that the vast majority of their users are 18-plus, about the extent to which they want to put time and effort into tailoring them for users under 18. We do not intend that outcome from the Bill, but we need realistically to consider it.

Baroness Kidron (CB)

Just to be clear, I say that the purpose of my question to the Minister was to get at the fact that, for low-risk situations, there can be age assurance that is a lot less effective or intrusive, for that very reason.

Lord Allan of Hallam (LD)

I agree; that is very helpful. I think Amendments 74, 93 and 99 also talk about the exclusion, as the noble Baroness raised, of services from the child safety duties if they can show that they are only 18-plus. It will be quite material and critical to know at what level they can demonstrate that.

I have avoided talking about pornography services directly, but there are interesting questions around what will happen if this model develops, as it likely will. If big tech is now starting to provide age certification for the kinds of mainstream services we may all want to access, they may be much less comfortable providing that same certification to pornography providers, for reputational reasons. A mainstream provider would not want to enter that market. Ofcom will need to take a view on this. We have talked about interoperability in the framework we have created, but it is a big question for Ofcom whether it wants to steer all age certification providers also to provide 18-plus certification for pornography providers or, effectively, to allow two markets to develop—one for mainstream certification and one for certification for pornography.

I have taken a few minutes because this is a very high-risk area for the Bill. There are material risks in willing into existence a model that depends on technical infrastructure that has not yet been built. The noble Lord, Lord Bethell, referred to prior experience; one of the reasons why we have not delivered age assurance before is that the infrastructure was not there. We now want it built, so must recognise that it is quite a high-risk endeavour. That does not mean it is not worth attempting, but we must recognise the risks and work on them.

If the implementation is poor, it will frustrate adult users, which may bring the Bill into disrepute. We need to recognise that as a genuine risk. There are people out there already saying that the Bill means that every internet service in the world will ask you for your passport. If that is not the case, we need to stress that we do not expect that to happen. There are also potentially significant impacts on the market for online services available to both adults and children in the UK, depending on the design of this system.

The purpose of thinking about some of these risks today is not to create a doom-laden scenario and say that it will not work. It is entirely the opposite—to say that, if we are to move ahead into a world in which children are protected from harmful content, for which very good reasons have been articulated and a huge amount of work has gone ahead, and in which services can tailor and gear access to the age of the child, we have to be able to take the 18-plus out of that, put it into a separate box and do so in a really easy, straightforward manner. If not, the 18-plus will end up dragging down what we want to do for the underage.

I hope that explanation helps in the context of these amendments. We will need to test them against it as implementation happens over the next few months.

Online Safety Bill

Debate between Baroness Kidron and Lord Allan of Hallam
Baroness Kidron (CB)

My comments will be rather shorter. I want to make a detailed comment about Amendment 5B, which I strongly support and which is in the name of the noble Lord, Lord Allan. It refers to,

“a genuine medical, scientific or educational purpose, … the purposes of peer support”

I would urge him to put “genuine peer support”. That is very important because there is a lot of dog whistling that goes on in this area. So if the noble Lord—

Lord Allan of Hallam (LD)

My working assumption would be that that would be contestable. If somebody claimed the peer support defence and it was not genuine, that would lead to them becoming liable. So I entirely agree with the noble Baroness. It is a very helpful suggestion.

Baroness Kidron (CB)

I also want to support the noble Baroness, Lady Kennedy. The level of abuse directed at women online and the gendered nature of it have been minimised; the perpetrators have clearly felt immune to the consequences of law enforcement. What worries me a little in this discussion is the idea or conflation that anything said to a woman is an act of violence. I believe that the noble Baroness was being very specific about the sorts of language that could be caught under her suggestions. I understand from what she said that she has been having conversations with the Minister. I very much hope that something is done in this area, and that it is explored more fully, as the noble Baroness, Lady Morgan, said, in the guidance. However, I just want to make the point that online abuse is also gamified: people make arrangements to abuse people in groups in particular ways that are not direct. If they threaten violence, that is quite different to a pile-in saying that you are a marvellous human being.

Online Safety Bill

Debate between Baroness Kidron and Lord Allan of Hallam
Baroness Kidron (CB)

My Lords, I thank the noble Lord, Lord Knight, for laying out the amendment and recognise that there was a very thoughtful debate on the subject of machine-generated content on Amendment 125 in my name on a previous day of Committee.

I appreciate that the concept of labelling or watermarking machine-generated material is central to recent EU legislation, but I am equally aware that there is more than one school of thought on the efficacy of that approach among AI experts. On the one hand, as the noble Lord, Lord Knight, beautifully set out—with the help of his artificial friend—there are those who believe that visibly marking the division of real and altered material is a clue for the public to look more carefully at what they are seeing and that labelling it might provide an opportunity for both creators and digital companies to give greater weight to “human-created material”. For example, it could be that the new BBC Verify brand is given greater validity by the public, or that Google’s search results promote it above material labelled as machine-generated as a more authentic source. There are others who feel that the scale of machine-generated material will be so vast that this labelling will be impossible or that labelling will downgrade the value of very important machine-generated material in the public imagination, when in the very near future it is likely that most human activity will be a blend of generated material and human interaction.

I spent the first part of this week locked in a room with others at the Institute for Ethics in AI in Oxford debating some of these issues. While this is a very live discussion, one thing is clear: if we are to learn from history, we must act now before all is certain, and we should act with pragmatism and a level of humility. It may be that either or both sets of experts are correct.

Industry has clearly indicated that there is an AI arms race, and many companies are launching services that they do not understand the implications of. This is not my view but one told to me by a company leader, who said that the speed of distribution was so great that the testing was confined to whether deploying large language models crashed the platforms; there was no testing for safety.

The noble Lord, Lord Stevenson, says in his explanatory statement that this is a probing amendment. I therefore ask the Minister whether we might meet before Report and look once again at the gaps that might be covered by some combination of Amendment 125 and the amendment in front of us, to make certain that the Bill adequately reflects the concerns raised by the enforcement community and reflects the advice of those who best understand the latest iterations of the digital world.

The Communications Act 2003 made a horrible mistake in not incorporating digital within it; let us not do the same here. Adding explicit safety duties to AI and machine learning would not slow down innovation but would ensure that innovation is not short-sighted and dangerous for humanity. It is a small amendment for what may turn out to be an unimaginably important purpose.

Lord Allan of Hallam (LD)

My Lords, it is a pleasure to follow the noble Baroness, Lady Kidron. I will try to keep my remarks brief.

It is extremely helpful that we have the opportunity to talk about this labelling question. I see it more as a kind of aperitif for our later discussion of AI regulation writ large. Given that it is literally aperitif hour, I shall just offer a small snifter as to why I think there may be some challenges around labelling—again, perhaps that is not a surprise to the noble Baroness.

When we make rules, as a general matter we tend to assume that people are going to read them and respond in a rationalist, conformist way. In reality, particularly in the internet space, we often see that there is a mixed environment and there will be three groups. There are the people who will look at the rules and respond in that rational way to them; a large group of people will just ignore them—they will simply be unaware and not at all focused on the rules; and another group will look for opportunities to subvert them and use them to their own advantage. I want to comment particularly on that last group by reference to cutlery and call centres, two historic examples of where rules have been subverted.

On the cutlery example, I am a Sheffielder, and “Made in Sheffield” used to mean that you had made the entire knife in Sheffield. Then we had this long period when we went from knives being made in Sheffield to bringing them to Sheffield and silver-plating them, to eventually just sharpening them and putting them in boxes. That is relevant in the context of AI. Increasingly, if there is an advantage to be gained by appearing to be human, people will look at what kind of finishing you need, so: “The content may have been generated by AI but the button to post it was pushed by a human, therefore we do not think it is AI because we looked at it and posted it”. On the speech of the noble Lord, Lord Knight, does the fact that my noble friend intervened on him and the noble Lord had to use some of his own words now mean that his speech in Hansard would not have to be labelled “AI-generated” because we have now departed from it? Therefore, there is that question of individuals who will want something to appear human-made even if it was largely AI-generated, and whether they will find the “Made in Sheffield” way of bypassing it.

Interestingly, we may see the phenomenon flipping the other way, and this is where my call centres come in. If people go to a popular search engine and type in “SpinVox”, they will see the story of a tech company that promised to transcribe voicemails into written text. This was a wonderful use of technology, and it was valued on the basis that it had developed that fantastic technology. However, it turned out—or at least there were claims, which I can repeat here under privilege—that it was using call centres in low-cost, low-wage environments to type those messages out. Therefore, again, we may see, curiously, some people seeing an advantage to presenting content as AI-generated when it is actually made by humans. That is just to flag that up—as I say, it is a much bigger debate that we are going to have. It is really important that we are having it, and labelling has a role to play. However, as we think about it, I urge that we remember those communities of people who will look at whatever rules we come up with and say, “Aha! Where can I get advantage?”, either by claiming that something is human when it is generated by AI or claiming that it is generated by AI if it suits them when it was actually produced by humans.

Online Safety Bill

Debate between Baroness Kidron and Lord Allan of Hallam
Baroness Kidron (CB)

My Lords, first, I want to recognise the bravery of the families of Olly, Breck, Molly, Frankie and Sophie in campaigning for the amendments we are about to discuss. I also pay tribute to Mia, Archie, Isaac, Maia and Aime, whose families I met this morning on their way to the House. It is a great privilege to stand alongside them and witness their courage and dignity in the face of unimaginable grief. On behalf of myself, my co-signatories—the noble Lords, Lord Stevenson and Lord Clement-Jones, and the noble Baroness, Lady Morgan—and the huge number of Peers and MPs who have supported these amendments, I thank them for their work and the selflessness they have shown in their determination to ensure that other families do not suffer as they have.

This group includes Amendments 198, 199, 215 and 216, which, together, would create a pathway for coroners and, by extension, families to get access to information relevant to the death of a child from technology services. The amendments would put an end to the inhumane situation whereby coroners and families in crisis are forced to battle faceless corporations to determine whether a child’s engagement with a digital service contributed to their death. Bereaved families have a right to know what happened to their children, and coroners have a duty to ensure that lessons are learned and that those who have failed in their responsibilities are held accountable.

Since the Minister is going to be the bearer of good news this afternoon, I will take the time to make arguments for the amendments as they stand. I simply say that, while parents have been fighting for access to information, those same companies have continued to suggest friends, material and behaviours that drive children into places and spaces in which they are undermined, radicalised into despair and come to harm. In no other circumstance would it be acceptable to withhold relevant information from a court procedure. It is both immoral and a failure of justice if coroners cannot access and review all relevant evidence. For the families, it adds pain to heartbreak as they are unable to come to terms with what has happened because there is still so much that they do not know.

I am grateful to the Government for agreeing to bring forward on Report amendments that will go a very long way towards closing the loopholes that allow companies to refuse coroners’ demands and ignore parents’ entreaties. The Government’s approach is somewhat different from that in front of us, but it covers the same ground. These amendments are the result of the considerable efforts of Ministers and officials from DSIT and the Ministry of Justice, with the invaluable support of the right honourable Sajid Javid MP. I wish to note on the record the leadership of the Secretary of State, who is currently on leave, and the Minister here, the noble Lord, Lord Parkinson.

The Government’s amendments will create an express power for Ofcom to require information from services about a deceased child user’s online activity following the receipt of a Schedule 5 request from a coroner. This will vastly increase the reach and power of that coroner. Information that Ofcom can request from regulated companies under the Online Safety Bill is extremely wide and includes detailed data on what is recommended; the amount of time the child spent on the service when they accessed it; their user journey; what content they liked, shared, rewatched, paused and reported; and whether other users raised red flags about the child’s safety or well-being before their death.

Information notices prompted by a Schedule 5 request from a coroner will be backed by Ofcom’s full enforcement powers and will apply to all regulated companies. If a service fails to comply, it may be subject to enforcement action, including senior management liability and fines of up to £18 million or 10% of global turnover—vastly different from the maximum fine of £1,000 under the Coroners and Justice Act 2009. Moreover, these amendments will give coroners access to Ofcom’s expertise and understanding of how online services work and of online services’ safety duties to children. Also, there will be provisions empowering Ofcom to share information freely to assist coroners in their inquiries. Companies must provide a dedicated means of communication to manage requests for information from bereaved parents and provide written responses to those requests. I look forward to the Minister setting out that these will be operated by a team of experts and backed up by Ofcom in ensuring that the communication is adequate, timely and not obstructive. Importantly, if the communication is not adequate, bereaved families will be able to notify Ofcom.

There are a small number of outstanding questions. We remain concerned that only larger companies will be required to set out their policies on disclosure. Sadly, children are often coerced and nudged into smaller sites that have less robust safety mechanisms. Small is not safe. A further issue is to ensure that a coroner is able, via a Schedule 5 notice given to Ofcom, to compel senior management to appear at an inquest. This is a crucial ask of the legal community, who battled and failed to get companies to attend inquests, notably Wattpad at the Frankie Thomas inquest and Snap Inc at Molly Russell’s inquest. Can the Minister undertake to close these gaps before Report?

A number of matters sit outside the scope of the Online Safety Bill. I am particularly grateful to the Secretary of State for committing in writing to further work beyond the Bill to ensure that the UK’s approach is comprehensive and watertight. The Government will be exploring ways in which the Data Protection and Digital Information (No. 2) Bill can support and complement these provisions, including the potential for a code that requires data preservation if a parent or enforcement officer contacts a helpline or if there is constructive knowledge, such as when a death has been widely reported, even before a Schedule 5 notice has been delivered.

The Government are engaging with the Chief Coroner to provide training in order to ensure that coroners have the knowledge they need to carry out inquests where children’s engagement with online services is a possible factor in their death. I am concerned about the funding of this element of the Government’s plans and urge the Minister to indicate whether this could be part of Ofcom’s literacy duties and therefore benefit from the levy. Possibly most importantly, the Secretary of State has undertaken to approach the US Government to ensure that coroners can review private messages that fall outside the scope of this Bill in cases where a child’s death is being investigated. I am grateful to the noble Lord, Lord Allan, for his support in articulating the issue, and accept the invitation to work alongside the department to achieve this.

There are only two further things to say. First, delivery is in the drafting, and I hope that when he responds, the Minister will assure the House that we will see the proposed amendments well before Report so that we can ensure that this works as we have all agreed. Secondly, the Government are now looking very carefully at other amendments which deal with prevention of harm in one way or another. I share the gratitude of Bereaved Parents for Online Safety for the work that has gone into this set of amendments. However, we want to see safety by design; a comprehensive list of harms to children in the Bill, including harms caused or amplified by the design of service; principles for age assurance which ensure that the systems put in place by regulated services are measurable, secure and fit for purpose; and a proper complaints service, so that children have somewhere to turn when things go wrong. What we have been promised is a radical change of status for the coroner and for the bereaved families. What we want is fewer dead children. I beg to move.

Lord Allan of Hallam (LD)

My Lords, some of the issues that we have been dealing with in this Bill are more abstract or generic harms, but here we are responding to a specific need of families in the UK who are facing the most awful of circumstances.

I want to recognise the noble Baroness, Lady Kidron, for her direct support for many of those families, and for her persistent efforts to use policy and the tools we have available to us here to improve the situation for families who, sadly, will face similar tragedies in future. I appreciate the time that she has spent with me in the spirit of finding workable solutions. It is an alliance that might seem improbable, given our respective responsibilities, which have sometimes placed us in publicly adversarial roles. However, one of the strengths of this Committee process is that it has allowed us to focus on what is important and to find that we have more in common than separates us. Nothing could be more important than the issue we are dealing with now.

I am pleased that it looks like we will be able to use this Bill to make some significant improvements in this area to address the challenges faced by those families, some of whom are here today, challenges which add to their already heart-wrenching distress. The first challenge these families face is to find someone at an online service who is willing and able to answer their questions about their loved one’s use of that platform. This question about contacts at online platforms is not limited to these cases but comes up in other areas.

As noble Lords will know, I used to work for Facebook, where I was often contacted by all sorts of Governments asking me to find people in companies, often smaller companies, concerning very serious issues such as terrorism. Even when they were dealing with the distribution of terrorist content, they would find it very challenging. There is a generic problem around getting hold of people at platforms. A real strength of the Online Safety Bill is that it will necessarily require Ofcom to develop contacts at all online services that offer user-to-user and search services to people in the UK. The Government estimate that 25,000 entities are involved. We are talking about Ofcom building a comprehensive database of pretty much any service that matters to people in the UK.

Primarily, these contacts will be safety focused, as their main responsibility will be to provide Ofcom with evidence that the service is meeting its duties of care under the Bill, so again, they will have the right people in the right companies on their database in future. Importantly, Ofcom will have a team of several hundred people, paid for by a levy on these regulated services, to manage the contacts at the right level. We can expect that, certainly for the larger services, there may be a team of several people at Ofcom dedicated to working with them, whereas for the smaller services it may be a pooled arrangement whereby one Ofcom staff member deals with a group. However, in all cases there will be someone at the regulator with a responsibility for liaising with those companies. We do not expect Ofcom to use those contacts to resolve questions raised by individuals in the UK as a matter of course, but it makes sense to make this channel available where there is a relatively small number of highly impactful cases such as we are dealing with here.

Online Safety Bill

Debate between Baroness Kidron and Lord Allan of Hallam
Baroness Kidron (CB)

My Lords, I rise very briefly to support the amendments in the name of the noble Baroness, Lady Stowell, and the noble Lord, Lord Stevenson. Like other speakers, I put on record my support for the regulator being offered independence and Parliament having a role.

However, I want to say one very brief and minor thing about timing—I feel somewhat embarrassed after the big vision of the noble Baroness, Lady Stowell. Having had quite a lot of experience of code making over the last three years, I found that the amount of time the department was able to take in responding to the regulator became a point of power, a point of lobbying, as others have said, and a point of huge distraction. Those of us who have followed the Bill for five years and as many Secretaries of State should be concerned that none of the amendments has quite tackled the question of time.

The idea of acting within a timeframe is not without precedent; the National Security and Investment Act 2021 is just one recent example. What was interesting about that Act was that the reason given for the Secretary of State’s powers being necessary was national security—that is, a ground we all agree justifies intervention—but the reason for the time restriction was business stability. I put it to the Committee that the real prospect of children and other users being harmed requires the same consideration as business stability. Without a time limit, it is possible that inaction can be used to control the process or simply to fritter it away.

Lord Allan of Hallam (LD)

My Lords, I will make a short contribution on this substantive question of whether concerns about ministerial overreach are legitimate. Based on a decade of being on the receiving end of representations from Ministers, the short answer is yes. I want to expand on that with some examples.

My experience of working on the other side, inside a company, was that you often got what I call the cycle of outrage: something is shared on social media that upsets people; the media write a front-page story about it; government Ministers and other politicians get involved; that then feeds back into the media and the cycle spins up to a point where something must be done. The “something” is typically that the Minister summons people, such as me in my old job, and brings them into an office. That itself often becomes a major TV moment, where you are brought in, browbeaten and sent out again with your tail between your legs, and the Minister has instructed you to do something. That entire process takes place in the political rather than the regulatory domain.

I readily concede that, in many cases, something of substance needed to be addressed and there was a genuine problem. It is not that this was illegitimate, but these amendments are talking about the process for what we should do when that outrage is happening. I agree entirely with the tablers of the amendments that, to the extent that that process can be encapsulated within the regulator rather than a Minister acting on an ad hoc basis, it would be a significant improvement.

I also note that this is certainly not UK-specific, and it would happen in many countries with varying degrees of threat. I remember being summoned to the Ministry of the Interior in Italy to meet a gentleman who has now sadly passed. He brought me into his office, sat me down, pointed to his desk and said “You see that desk? That was Mussolini’s desk”. He was a nice guy and I left with a CD of his rhythm and blues band, but it was clear that I was not supposed to say no to him. He made a very clear and explicit political direction about content that was on the platform.

One big advantage of this Bill is that it has the potential to move beyond that world. It could move from individual people in companies—the noble Baroness, Lady Stowell of Beeston, made this point very powerfully—to changing the accountability model away from either platforms being entirely accountable themselves or platforms and others, including Ministers, somehow doing deals that will have an impact, as the noble Baroness, Lady Fox, and the noble Viscount, Lord Colville, said, on the freedom of expression of people across the country. We do not want that.

We want to move on in the Bill and I think we have a model which could work. The regulator will take on the outrage and go as far as it can under the powers granted in the Bill. If the regulator believes that it has insufficient powers, it will come back to Parliament and ask for more. That is the way in which the system can and should work. I think I referred to this at Second Reading; we have an opportunity to create clear accountability. Parliament instructs Ofcom, which instructs the platforms. The platforms do what Ofcom says, or Ofcom can sanction them. If Ofcom feels that its powers are deficient, it comes back to Parliament. The noble Lord, Lord Stevenson, and others made the point about scrutiny and us continually testing whether Ofcom has the powers and is exercising them correctly. Again, that is entirely beneficial and the Government should certainly be minded to accept those amendments.

With the Secretary of State powers, as drafted in the Bill and without the amendments we are considering today, we are effectively taking two steps forward and one step back on transparency and accountability. We have to ask: why take that step back when we are able to rely on Ofcom to do the job without these directions?

The noble Baroness, Lady Stowell of Beeston, made the point very clearly that there are other ways of doing this. The Secretary of State can express their view. I am sure that the Minister will be arguing that the Secretary of State’s powers in the Bill are better than the status quo because at least what the Secretary of State says will be visible; it will not be a back-room deal. The noble Baroness, Lady Stowell of Beeston, has proposed a very good alternative, where the Secretary of State makes visible their intentions, but not in the form of an order—rather in the form of advice. The public—it is their speech we are talking about—then have the ability to see whether they agree with Ofcom, the companies or the Secretary of State if there is any dispute about what should happen.

It is certainly the case that visible instructions from the Secretary of State would be better, but the powers as they are still leave room for arm-twisting. I can imagine a future scenario in which future employees of these platforms are summoned to the Secretary of State. But now the Secretary of State would have a draft order sitting there. The draft order is Mussolini’s desk. They say to the people from the platforms, “Look, you can do what I say, or I am going to send an order to Ofcom”. That takes us back to this world in which the public are not seeing the kind of instructions being given.

I hope that the Government will accept that some amendment is needed here. All the ones that have been proposed suggest different ways of achieving the same objective. We are trying to protect future Secretaries of State from an unhealthy temptation to intervene in ways that they should not.

Online Safety Bill

Debate between Baroness Kidron and Lord Allan of Hallam
Lord Allan of Hallam (LD)

My Lords, I had to miss a few sessions of the Committee but I am now back until the end. I remind fellow Members of my interests: I worked for one of the largest platforms for a decade, but I have no current interests. It is all in the register if people care to look. I want to contribute to this debate on the basis of that experience of having worked inside the platforms.

I start by agreeing with the noble Baroness, Lady Kidron, the noble Lord, Lord Stevenson, and my noble friend Lord Clement-Jones. The thrust of their amendments—the idea that something will be needed here—is entirely correct. We have created in the Online Safety Bill a mechanism that we in this Committee know is intended primarily to focus on systems and how Ofcom regulates them, but what the public out there hear is that we are creating a mechanism that will meet their concerns—and their concerns will not end with systems. As the noble Baroness, Lady Newlove, eloquently described, their concerns in some instances will be about specific cases and the question will be: who will take those up?

If there is no other mechanism and no way to signpost people to a place where they can seek redress, they will come to Ofcom. That is something we do not want. We want Ofcom to be focused on the big-ticket items of dealing with systems, not bogged down in dealing with thousands of individual complaints. So we can anticipate a situation in which we will need someone to be able to deal with those individual complaints.

I want to focus on making that workable, because the volume challenge might not be as people expect. I have seen from having worked on the inside that there is a vast funnel of reports, where people report content to platforms. Most of those reports are spurious or vexatious; that is the reality. Platforms have made their reporting systems easy, as we want them to do —indeed, in the Bill we say, “Make sure you have really easy-to-use reporting systems”—but one feature of that is that people will use them simply to express a view. Over the last couple of weeks, all the platforms will have been inundated with literally millions of reports about Turkish politicians. These will come from the supporters of either side, reporting people on the other side—claiming that they are engaged in hate speech or pornography or whatever. They will use whatever tool they can. That is what we used to see day in, day out: football teams or political groups that report each other. The challenge is to separate out the signal—the genuinely serious reports of where something is going wrong—from the vast amount of noise, of people simply using the reporting system because they can. For the ombudsman, the challenge will be that signal question.

Breaking that down, from the vast funnel of complaints coming in, we have a smaller subset that are actionable. Some of those will be substantive, real complaints, where the individual simply disagrees with the decision. That could be primarily for two reasons. The first is that the platform has made a bad decision and failed to enforce its own policies. For example, you reported something as being pornographic, and it obviously was, but the operator was having a bad day—they were tired, it was late in the day and they pressed “Leave up” instead of “Take down”. That happens on a regular basis, and 1% of errors like that across a huge volume means a lot of mistakes being made. Those kinds of issues, where there is a simple operator error, should get picked up by the platforms’ own appeal mechanisms. That is what they are there for, and the Bill rightly points to that. A second reviewer should look at it. Hopefully they are a bit fresher, understand that a mistake was made and can simply reverse it. Those operator error reports can be dealt with internally.

The second type would be where the platform enforces policies correctly but, from the complainant’s point of view, the policies are wrong. It may be a more pro-free speech platform where the person says, “This is hate speech”, but the platform says, “Well, according to our rules, it is not. Under our terms of service, we permit robust speech of this kind. Another platform might not, but we do”. In that case, the complainant is still unhappy but the platform has done nothing wrong—unless the policies the platform is enforcing are out of step with the requirements under the Online Safety Bill, in which case the complaint should properly come to Ofcom. Based on the individual complaint, a complainant may have something material for Ofcom. They are saying that they believe the platform’s policies and systems are not in line with the guidance issued by Ofcom—whether on hate speech, pornography or anything else. That second category of complaint would come to Ofcom.

The third class concerns the kind of complaint that the noble Baroness, Lady Newlove, described. In some ways, this is the hardest. The platform has correctly enforced its policies but, in a particular case, the effect is deeply unfair, problematic and harmful for an individual. The platform simply says, “Look, we enforced the policies. They are there. This piece of content did not violate them”. Any outsider looking at it would say, “There is an injustice here. We can clearly see that an individual is being harmed. A similar piece of content might not be harmful to another individual, but to this individual it is”. In those circumstances, groups such as the South West Grid for Learning, with which I work frequently, perform an invaluable task. We should recognise that there is a network of non-governmental organisations in the United Kingdom that do this day in, day out. Groups such as the Internet Watch Foundation and many others have fantastic relations and connections with the platforms and regularly bring exceptional cases to them.

Baroness Kidron (CB)

We are glad to have the noble Lord back. I want also to put on the record that the South West Grid for Learning is very supportive of this amendment.

Lord Allan of Hallam (LD)

It has let me know as well. In a way, the amendment seeks to formalise what is already an informal mechanism. I was minded initially to support Amendment 56 in the name of my noble friend Lord Clement-Jones and the noble Lord, Lord Stevenson.

This landscape is quite varied. We have to create some kind of outlet, as the noble Baroness, Lady Kidron, rightly said. That parent or individual will want to go somewhere, so we have to send them somewhere. We want that somewhere to be effective, not to get bogged down in spurious and vexatious complaints. We want it to have a high signal-to-noise ratio—to pull out the important complaints and get them to the platforms. That will vary from platform to platform. In some ways, we want to empower Ofcom to look at what is and is not working and to be able to say, “Platform A has built up an incredible set of mechanisms. It’s doing a good job. We’re not seeing things falling through the cracks in the same way as we are seeing with platform B. We are going to have to be more directive with platform B”. That very much depends on the information coming in and on how well the platforms are doing their job already.

I hope that the Government are thinking about how these individual complaints will be dealt with and about the demand that will be created by the Bill. How can we have effective mechanisms for people in the United Kingdom who genuinely have hard cases and have tried, but where there is no intermediary for the platform they are worried about? In many cases, I suspect that these will be newer or smaller platforms that have arrived on the scene and do not have established relationships. Where are these people to go? Who will help them, particularly in cases where the platform may not systemically be doing anything wrong? Its policies are correct and it is enforcing them correctly, but any jury of peers would say that an injustice is being done. Either an exception needs to be made or there needs to be a second look at that specific case. We are not asking Ofcom to do this in the rest of the legislation.

--- Later in debate ---
Lord Allan of Hallam (LD)

From experience, I think it is true that companies get defensive and seek to defend the indefensible on occasion. I agree with the noble Baroness on that, but I will balance it a little as I also work with people who were agonising over not wanting to make a bad situation worse. They were genuinely struggling and seeking to do the right thing. That is where the experts come in. If someone would say to them, “Look, take this stuff down; that is always better”, it would make their lives easier. If they said, “Please leave it up”, they could follow that advice. Again, that would make their lives easier. On the excuses, I agree that sometimes they are defending the indefensible, but also there are people agonising over the right thing to do and we should help them.

Baroness Kidron (CB)

I absolutely agree. Of course, good law is a good system, not a good person.

I turn to the comments that I was going to make. Uncharacteristically, I am a little confused about this issue and I would love the Minister’s help. My understanding on reading the Bill very closely is that self-harm and suicide content that meets a legal definition will be subject to the priority illegal content duties. In the case of children, we can safely anticipate that content of this kind will be named primary priority content. Additionally, if such content is against the terms of service of a regulated company, it can be held responsible to those terms. It will have to provide a user empowerment tool on category 1 services so that it can be toggled out if an adult user wishes. That is my understanding of where this content has already been dealt with in the Bill. To my mind, this leaves the following ways in which suicide and self-harm material, which is the subject of this group of amendments, is not covered by the Bill. That is what I would like the Minister to confirm, and I absolutely stand by to be corrected.

In the case of adults, if self-harm and suicide material does not meet a bar of illegal content and the service is not category 1, there is no mechanism to toggle it out. Ofcom has no power to require a service to ensure tools to toggle self-harm and suicide material out by default. This means that self-harm and suicide material can be as prevalent as they like—pushed, promoted and recommended, as I have just explained—if it is not contrary to the terms of service, so long as it does not reach the bar of illegal content.

Search services are not subject to these clauses—I am unsure about that. In the case of both children and adults, if self-harm and suicide material is on blogs or services with limited functionality, it is out of scope of the Bill and there is absolutely nothing Ofcom can do. For non-category 1 services—the majority of services which claim that an insignificant number of children access their site and thus that they do not have to comply with the child safety duties—there are no protections for a child against this content.

I put it like that because I believe that each of the statements I just made could have been fixed by amendments already discussed during the past six days in Committee. We are currently planning to leave many children without the protection of the safety duties, to leave vulnerable adults without even the cover of default protections against material that has absolutely no public interest and to leave companies to decide whether to promote or use this material to fuel user engagement—even if it costs well-being and lives.

I ask the Minister to let me know if I have misunderstood, but I think it is really quite useful to see what is left once the protections are in place, rather than always concentrating on the protections themselves.

Online Safety Bill
Baroness Kidron (CB)

I support these amendments, which were set out wonderfully by the noble Lord, Lord Stevenson. I want to raise a point made on Tuesday when the noble Baroness, Lady Merron, said that only 3% of people read terms of service and I said that 98% of people do not read them, so one of us is wrong, but I think the direction of travel is clear. She also used a very interesting phrase about prominence, and I want to use this opportunity to ask the Minister whether there is some lever whereby Ofcom can insist on prominence for certain sorts of material—a hierarchy of information, if you like—because these are really important pieces of information, buried in the wrong place so that even 2% or 3% of people may not find them.

Lord Allan of Hallam (LD)

My Lords, I am very pleased that the noble Lord, Lord Stevenson, has given us the opportunity to talk about terms of service, and I will make three points again, in a shorter intervention than on the previous group.

First, terms of service are critical as the impact of terms of service will generally be much greater in terms of the amount of intervention that occurs on content than it will ever be under the law. Terms of service create, in effect, a body of private law for a community, and they are nearly always a superset of the public law—indeed, it is very common for the first items of a terms of service to say, “You must not do anything illegal”. This raises the interesting question of “illegal where?”—what it generally means is that you must not do anything illegal in the jurisdiction in which the service provider is established. The terms of service will say, “Do not do anything illegal”, and then they will give a whole list of other things, as well as illegality, that you cannot do on the platform, and I think this is right because they have different characteristics.

Online Safety Bill
Lord Allan of Hallam (LD)

I point out that one of the benefits of end-to-end encryption is that it precisely stops companies doing things such as targeted advertising based on the content of people’s communications. Again, I think there is a very strong and correct trend to push companies in that direction.

Baroness Kidron (CB)

I thank the noble Lord for the intervention. For those noble Lords who are not following the numbers, Amendment 285, which I support, would prevent general monitoring. Apart from anything else, I am worried about equivalence and other issues in relation to general monitoring. Apart from a principled position against it, I think to be explicit is helpful.

Ofcom needs to be very careful, and that is what Amendment 190 sets out. It asks whether the alternatives have been thought about, whether the conditions have been thought about, and whether the potential impact has been thought about. That series of questions is essential. I am probably closer to the community that wants to see more powers and more interventions, but I would like that to be in a very monitored and regulated form.

I thank the noble Lord for his contribution. Some of these amendments must be supported because it is worrying for us as a country to have—what did the noble Lord call it?—ambiguity about whether something is possible. I do not think that is a useful ambiguity.

Online Safety Bill
Lord Allan of Hallam (LD)

The Twitter scenario, and other scenarios of mixed sites, are some of the most challenging that we have to deal with. But I would say, straightforwardly, “Look, 13% is a big chunk, but the primary purpose of Twitter is not the delivery of pornography”. I use Twitter on a daily basis and I have never seen pornography on it. I understand that it is there and that people can go for it, and that is an issue, but I think people out there would say that for most people, most of the time, the primary purpose of Twitter is not pornography.

What we want to do—in answer to the noble Lord’s second point—is create an incentive for people to be recategorised in the right direction. There is an assumption here that it is all going to be about gaming the system. I actually think that there is an opportunity here for genuine changes. There will be a conversation with Twitter. It will be interesting, given Twitter’s current management—apparently it is run by a dog, so there will be a conversation with the dog that runs Twitter. In that conversation, the regulator, Ofcom, on our behalf, will be saying, “You could change your terms of service and get rid of pornography”. Twitter will say yes or no. If it says no, Ofcom will say, “Well, here are all the things we expect you to do in order to wall off that part of the site”.

That is a really healthy and helpful conversation to have with Twitter. I expect it is listening now and already thinking about how it will respond. But it would expect that kind of treatment and conversation to be different; and I think the public would expect that conversation to be a different and better conversation than just saying “Twitter, you’re Pornhub. We’re just going to treat you like Pornhub”.

That is the distinction. As I say, we have an opportunity to get people to be more robust about either limiting or removing pornography, and I fear that the amendments we have in front of us would actually undermine rather than enhance that effort.

Baroness Kidron (CB)

At the centre of this is the question of whether we are trying to block the entire service or block at the level of porn content. It is the purpose of a set of amendments in the names of the noble Lord, Lord Bethell, myself and a number of other noble Lords to do exactly the latter. But I have to say to the noble Baroness that I am very much in sympathy with, first, putting porn behind an age gate; secondly, having a commencement clause; and, thirdly and very importantly—this has not quite come up in the conversation—saying that harms must be on the face of the Bill and that porn is not the only harm. I say, as a major supporter of the Bereaved Families for Online Safety, that “Porn is the only harm children face” would be a horrendous message to come from this House. But there is nothing in the noble Baroness’s amendments, apart from where the action happens, that I disagree with.

I also felt that the noble Baroness made an incredibly important point when she went into detail on Amendment 125A. I will have to read her speech in order to follow it, because it was so detailed, but the main point she made is salient and relates to an earlier conversation: the reason we have Part 5 is that the Government have insisted on this ridiculous thing about user-to-user and search, instead of doing it where harm is. The idea that you have Part 5, which is to stop the loophole of sites that do not have user-to-user, only to find that they can add user-to-user functionality and be another type of site, is quite ludicrous. I say to the Committee and the Minister, who I am sure does not want me to say it, “If you accept Amendment 2, you’d be out of that problem”—because, if a site was likely to be accessed by children and it had harm and we could see the harm, it would be in scope. That is the very common-sense approach. We are where we are, but let us be sensible about making sure the system cannot be gamed, because that would be ludicrous and would undermine everybody’s efforts—those of the Government and of all the campaigners here.

I just want to say one more thing because I see that the noble Lord, Lord Moylan, is back in his place. I want to put on the record that age assurance and identity are two very separate things. I hope that, when we come to debate the package of harms—unfortunately, we are not debating them all together; we are debating harms first, then AV—we get to the bottom of that issue because I am very much in the corner of the noble Lord and the noble Baroness, Lady Fox, on this. Identity and age assurance must not be considered the same thing by the House, and definitely not by the legislation.

Online Safety Bill
Lord Allan of Hallam (LD)

The noble Baroness makes a very good point; they are making efforts. There is a role for app stores to play but I hope she would accept that it is qualitatively different from that played by a search engine or a user-to-user service. If we were to decide, in both instances, that we want app stores to have a greater role in online safety and a framework that allows us to look at blogs and other forms of content, we should go ahead and do that. All I am arguing is that we have a Bill that is carefully constructed around two particular concepts, a user-to-user service and a search engine, and I am not sure it will stretch that far.

Baroness Kidron (CB)

I want to reassure the noble Lord: I have his blog in front of me and he was quite right—there were not a lot of children on that site. It is a very good blog, which I read frequently.

I want to make two points. First, age-rating and age-gating are two different things, and I think the noble Lord has conflated them. There is a lot of age-rating going on, and it is false information. We need good information, and we have not managed to get it by asking nicely. Secondly, I slightly dispute his idea that we have a very structured Bill regarding user-to-user and so on. We have a very structured Bill from a harms perspective that describes the harms that must be prevented—and then we got to commercial porn, and we can also get to these other things.

I agree with the noble Lord’s point about freedom of speech, but we are talking about a fixed set of harms that will, I hope, be in the Bill by the end. We can then say that if a service meets the test of being likely to be accessed by children, and known harm is there, that is what we are looking at. We are certainly not looking at the noble Lord’s blog.

Lord Allan of Hallam (LD)

I appreciate the intervention by the noble Baroness; I hope through this grit we may conjure up a pearl of some sort. The original concept of the Bill, as championed by the noble Baroness, would have been a generalised set of duties of care which could have stretched much more broadly. It has evolved in a particular direction and become ever more specific and tailored to those three services: user-to-user, search, and pornography services. Having arrived at that point, it is difficult to then open it back up and stretch it to reach other forms of service.

My intention in intervening in this debate is to raise some of those concerns because I think they are legitimate. I may be at the more sceptical end of the political world, but I am at the more regulation-friendly end of the tech community. This is said in a spirit of trying to create a Bill that will actually work. I have done the work, and I know how hard Ofcom’s job will be. That sums up what I am trying to say: my concern is that we should not give Ofcom an impossible job. We have defined something quite tight—many people still object to it, think it is too loose and do not agree with it—but I think we have something reasonably workable. I am concerned that, however tempting it is, by re-opening Pandora’s box we may end up creating something less workable.

That does not mean we should forget about app stores and non-user-to-user content, but we need to think of a way of dealing with those which does not necessarily just roll over the mechanism we have created in the Online Safety Bill to other forms of application.

--- Later in debate ---
Lord Allan of Hallam (LD)

My Lords, I rise to speak in support of Amendment 9, tabled by the noble Lord, Lord Moylan, and in particular the proposed new paragraph 10A to Schedule 1. I hope I will find myself more in tune with the mood of the Committee on this amendment than on previous ones. I would be interested to know whether any noble Lords believe that Ofcom should be spending its limited resources supervising a site like Wikipedia under the new regime, as it seems to me patently obvious that that is not what we intend; it is not the purpose of the legislation.

The noble Lord, Lord Moylan, is right to remind us that one of the joys of the internet is that you buy an internet connection, plug it in and there is a vast array of free-to-use services which are a community benefit, produced by the community for the community, with no harm within them. What we do not want to do is interfere with or somehow disrupt that ecosystem. The noble Baroness, Lady Fox, is right to remind us that there is a genuine risk of people withdrawing from the UK market. We should not sidestep that. People who try to be law-abiding will look at these requirements and ask themselves, “Can I meet them?” If the Wikimedia Foundation that runs Wikipedia does not think it can offer its service in a lawful way, it will have to withdraw from the UK market. That would be to the detriment of children in the United Kingdom, and certainly not to their benefit.

There are principle-based and practical reasons why we do not want Ofcom to be operating in this space. The principle-based one is that it makes me uncomfortable that a Government would effectively tell their regulator how to manage neutral information sites such as Wikipedia. There are Governments around the world who seek to do that; we do not want to be one of those.

The amendment attempts to define this public interest, neutral, informational service. It happens to be user-to-user but it is not like Facebook, Instagram or anything similar. I would feel much more comfortable making it clear in law that we are not asking Ofcom to interfere with those kinds of services. The practical reason is the limited time Ofcom will have available. We do not want it to be spending time on things that are not important.

Definitions are another example of how, with the internet, it can often be extremely hard to draw bright lines. Functionalities bleed into each other. That is not necessarily a problem, until you try to write something into law; then, you find that your definition unintentionally captures a service that you did not intend to capture, or unintentionally misses out a service that you did intend to be in scope. I am sure the Minister will reject the amendment because that is what Ministers do; but I hope that, if he is not willing to accept it, he will at least look at whether there is scope within the Bill to make it clear that Wikipedia is intended to be outside it.

Paragraph 4 of Schedule 1 refers to “limited functionality services”. That is a rich vein to mine. It is clear that the intention is to exclude mainstream media, for example. It refers to “provider content”. In this context, Encyclopaedia Britannica is not in scope but Wikipedia is, the difference being that Wikipedia is constructed by users, while Encyclopaedia Britannica is regarded as being constructed by a provider. The Daily Mail is outside scope; indeed, all mainstream media are outside scope. Anyone who declares themselves to be media—we will debate this later on—is likely to be outside scope.

Such provider exemption should be offered to other, similar services, even if they happen to be constructed from the good will of users as opposed to a single professional author. I hope the Minister will be able to indicate that the political intent is not that we should ask Ofcom to spend time and energy regulating Wikipedia-like services. If so, can he point to where in the legislation we might get that helpful interpretation, in order to ensure that Ofcom is focused on what we want it to be focused on and not on much lower priority issues?

Baroness Kidron (CB)

I will speak to a couple of the amendments in this group. First, small is not safe, and these platforms cannot necessarily be seen in isolation. For example, there is an incel group that has only 4,000 active users, but it posts a great deal on YouTube, where it reaches 24.2 million users. So we have to be clear that small and safe are not the same thing.

However, I am sympathetic to the risk-based approach. I should probably have declared an interest as someone who has given money to Wikipedia on several occasions to keep it going. I ask the Minister for some clarity on the systems and processes of the Bill, and whether the risk profile of Wikipedia—which does not entice you in and then follow you for the next six months once you have looked at something—is far lower than that of something very small that gets hold of you and keeps on going. I say that particularly in relation to children, but I feel it for myself also.

--- Later in debate ---
Lord Allan of Hallam (LD)

My Lords, I will speak to the amendments in the name of the noble Lord, Lord Moylan, on moderation, which I think are more important than he has given himself credit for—they go more broadly than just Wikipedia.

There is a lot of emphasis on platform moderation, but the reality is that most moderation of online content is done by users, either individually or in groups, acting as groups in the space where they operate. The typical example, which many Members of this House have experienced, is when you post something and somebody asks, “Did you mean to post that?”, and you say, “Oh gosh, no”, and then delete it. A Member in the other place has recently experienced a rather high-profile example of that through the medium of the newspaper. On a much smaller scale, it is absolutely typical that people take down content every day, either because they regret it or, quite often, because their friends, families or communities tell them that it was unwise. That is the most effective form of moderation, because it is the way that people learn to change their behaviour online, as opposed to the experience of a platform removing content, which is often experienced as the big bad hand of the platform. The person does not learn to change their behaviour, so, in some cases, it can reinforce bad behaviour.

Community moderation, not just on Wikipedia but across the internet, is an enormous public good, and the last thing that we want to do in this legislation is to discourage people from doing it. In online spaces, that is often a volunteer activity: people give up their time to try to keep a space safe and within the guidelines they have set for that space. The noble Lord, Lord Moylan, has touched on a really important area: in the Bill, we must be absolutely clear to those volunteers that we will not create all kinds of new legal operations and liabilities on them. These are responsible people, so, if they are advised that they will incur all kinds of legal risk when trying to comply with the Online Safety Bill, they will stop doing the moderation—and then we will all suffer.

On age-gating, we will move to a series of amendments where we will discuss age assurance, but I will say at the outset, as a teaser to those longer debates, that I have sympathy with the points made by the noble Lord, Lord Moylan. He mentioned pubs—we often talk about real-world analogies. In most of the public spaces we enter in the real world, nobody does any ID checking or age checking; we take it on trust, unless and until you carry out an action, such as buying alcohol, which requires an age check.

It is legitimate to raise this question, because where we fall in this debate will depend on how we see public spaces. I see a general-purpose social network as equivalent to walking into a pub or a town square, so I do not expect to have my age and ID checked at the point at which I enter that public space. I might accept that my ID is checked at a certain point where I carry out various actions. Others will disagree and will say that the space should be checked as soon as you go into it—that is the boundary of the debate we will have across a few groups. As a liberal, I am certainly on the side that says that it is incumbent on the person wanting to impose the extra checks to justify them. We should not just assume that extra checks are cost-free and beneficial; they have a cost for us all, and it should be imposed only where there is a reasonable justification.

Baroness Kidron (CB)

Far be it from me to suggest that all the amendments tabled by the noble Lord, Lord Moylan, are in the wrong place, but I think that Amendment 26 might have been better debated with the other amendments on age assurance.

On community moderation, I underscore the point that Ofcom must have a risk profile as part of its operations. When we get to that subject, let us understand what Ofcom intends to do with it—maybe we should instruct Ofcom a little about what we would like it to do with it for community moderation. I have a lot of sympathy—though I do not think it is a get-out clause—with seeing some spaces as less risky or, at least, with determining what risky looks like in online spaces, which is a different question. This issue belongs in the risk profile: it is not about taking things out of scope; we have to build it into the Bill we have.

On age assurance and AV, I do not think that today is the day to discuss it in full. I disagree with the point that, because we are checking kids, we have to check ourselves—that is not where the technology is. Without descending into technical arguments, as the noble Lord, Lord Moylan, asked us not to, we will bring some of those issues forward.

The noble Lords, Lord Bethell and Lord Stevenson, and the right reverend Prelate the Bishop of Oxford have a package of amendments which are very widely supported across the Committee. They have put forward a schedule of age assurance that says what the rules of the road are. We must stop pretending that age assurance is something that is being invented now in this Bill. If you log into a website with your Facebook login, it shares your age—and that is used by 42% of people online. However, if you use an Apple login, it does not share your age, so I recommend using Apple—but, interestingly, it is harder to find that option on websites, because websites want to know your age.

So, first, we must not treat age assurance as if it has just been invented. Secondly, we need to start to have rules of the road, and ask what is acceptable, what is proportionate, and when we will have zero tolerance. Watching faces around the Committee, I say that I will accept zero tolerance for pornography and some other major subjects, but, for the most part, age assurance is something that we need to have regulated. Currently, it is being done to us rather than in any way that is transparent or agreed, and that is very problematic.

--- Later in debate ---
Lord Allan of Hallam (LD)

My Lords, it is risky to stand between people and their dinner, but I rise very briefly to welcome these amendments. We should celebrate the good stuff that happens in Committee as well as the challenging stuff. The risk assessments are, I think, the single most positive part of this legislation. Online platforms already do a lot of work trying to understand what risks are taking place on their platforms, which never sees the light of day except when it is leaked by a whistleblower and we then have a very imperfect debate around it.

The fact that platforms will have to do a formal risk assessment and share it with a third-party regulator is huge progress; it will create a very positive dynamic. The fact that the public will be able to see those risk assessments and make their own judgments about which services to use—according to how well they have done them—is, again, a massive public benefit. We should welcome the fact that risk assessments are there and the improvements that this group of amendments makes to them. I hope that was short enough.

Baroness Kidron (CB)

I also welcome these amendments, but I have two very brief questions for the Minister. First, in Amendment 27A, it seems that the child risk assessment is limited only to category 1 services and will be published only in the terms of service. As he probably knows, 98% of people do not read terms of service, so I wondered where else we might find this, or whether there is a better way of dealing with it.

My second question is to do with Amendments 64A and 88A. It seems to me—forgive me if I am wrong—that the Bill previously stipulated that all regulated search and user services had to make and keep a written record of any measure taken in compliance with a relevant duty, but now it seems to have rowed back to only category 1 and 2A services. I may be wrong on that, but I would like to check it for the record.

Online Safety Bill
Baroness Kidron (CB)

My Lords, I draw attention to my interests in the register, which I declared in full at Second Reading. It is an absolute pleasure to follow the noble Lord, Lord Stevenson, and, indeed, to have my name on this amendment, along with those of fellow members of the pre-legislative committee. It has been so long that it almost qualifies as a reunion tour.

This is a fortuitous amendment on which to start our deliberations, as it sets out the very purpose of the Bill—a North Star. I want to make three observations, each of which underlines its importance. First, as the pre-legislative committee took evidence, it was frequently remarked by both critics and supporters that it was a complicated Bill. We have had many technical briefings from DSIT and Ofcom, and they too refer to the Bill as “complicated”. As we took advice from colleagues in the other place, expert NGOs, the tech sector, academics and, in my own case, the 5Rights young advisory group, the word “complicated” repeatedly reared its head. This is a complex and ground-breaking area of policy, but there were other, simpler structures and approaches that have been discarded.

Over the five years, with ever-changing leadership and political pressures, the Bill has ballooned with caveats and a series of very specific, and in some cases peculiar, clauses—so much so that we start today with a Bill that even those of us who are paying very close attention are often told we do not understand. That should make the House very nervous.

It is a complicated Bill with intersecting and dependent clauses—grey areas from which loopholes emerge—and it is probably a big win for the deepest pockets. The more complicated the Bill is, the more it becomes a bonanza for the legal profession. As the noble Lord, Lord Stevenson, suggests, the Minister is likely to argue that the contents of the amendment are already in the Bill, but the fact that the word “complicated” is firmly stuck to its reputation and structure is the very reason to set out its purpose at the outset, simply and unequivocally.

Secondly, the OSB is a framework Bill, with vast amounts of secondary legislation and a great deal of work to be implemented by the regulator. At a later date we will discuss whether the balance between the Executive, the regulator and Parliament is exactly as it should be, but as the Bill stands it envisages a very limited future role for Parliament. If I might borrow an analogy from my previous profession, Parliament’s role is little more than that of a background extra.

I have some experience of this. In my determination to follow all stages of the age-appropriate design code, I found myself earlier this week in the Public Gallery of the other place to hear DSIT Minister Paul Scully, at Second Reading of the Data Protection and Digital Information (No. 2) Bill, pledge to uphold the AADC and its provisions. I mention this in part to embed it on the record—that is true—but primarily to make this point: over six years, there have been two Information Commissioners and double figures of Secretaries of State and Ministers. There have been many moments at which the interpretation, status and purpose of the code has been put at risk, at least once to a degree that might have undermined it altogether. At these moments, each time the issue was resolved by establishing the intention of Parliament beyond doubt. Amendment 1 moves Parliament from background extra to star of the show. It puts the intention of Parliament front and centre for the days, weeks, months and years ahead in which the work will still be ongoing—and all of us will have moved on.

The Bill has been through a long and fractured process in which the pre-legislative committee had a unique role. Many attacks on the Bill have been made by people who have not read it. Child safety was incorrectly cast as the enemy of adult freedom. While some wanted to apply the existing and known concepts and terms of public interest, protecting the vulnerable, product safety and the established rights and freedoms of UK citizens, intense lobbying has seen them replaced by untested concepts and untried language over which the tech sector has once again emerged as judge and jury. This has further divided opinion.

In spite of all the controversy, when published, the recommendations of the committee report received almost universal support from all sides of the debate. So I ask the Minister not only to accept the committee’s view that the Bill needs a statement of purpose, the shadow of which will provide shelter for the Bill long into the future, but to undertake to look again at the committee report in full. In its pages lies a landing strip of agreement for many of the things that still divide us.

This is a sector that is 100% engineered and almost all privately owned, and within it lie solutions to some of the greatest problems of our age. It does not have to be as miserable, divisive and exploitative as this era of exceptionalism has allowed it to be. As the Minister is well aware, I have quite a lot to say about proposed new subsection (1)(b),

“to provide a higher level of protection for children than for adults”,

but today I ask the Minister to tell us which of these paragraphs (a) to (g) are not the purpose of the Bill and, if they are not, what is.

Lord Allan of Hallam (LD)

My Lords, I am pleased that we are starting our Committee debate on this amendment. It is a pleasure to follow the noble Lord, Lord Stevenson, and the noble Baroness, Lady Kidron.

In this Bill, as has already been said, we are building a new and complex system and we can learn some lessons from designing information systems more generally. There are three classic mistakes that you can make. First, you can build systems to fit particular tools. Secondly, you can overcommit beyond what you can actually achieve. Thirdly, there is feature creep, through which you keep adding things on as you develop a new system. A key defence against these mistakes is to invest up front in producing a really good statement of requirements, which I see in Amendment 1.

On the first mistake: as we go through the debate, there is a genuine risk that we get bogged down in the details of specific measures that the regulator might or might not include in its rules and guidance, and that we lose sight of our goals. Developing a computer system around a particular tool—for example, building everything with Excel macros or with Salesforce—invariably ends in disaster. If we can agree on the goals in Amendment 1 and on what we are trying to achieve, that will provide a sound framework for our later debates as we try to consider the right regulatory technologies that will deliver those goals.

The second cardinal error is overcommitting and underdelivering. Again, it is very tempting when building a new system to promise the customer that it will be all-singing, all-dancing and can be delivered in the blink of an eye. Of course, the reality is that in many cases, things prove to be more complex than anticipated, and features sometimes have to be removed while timescales for delivering what is left are extended. A wise developer will instead aim to undercommit and overdeliver, promising to produce a core set of realistic functions and hoping that, if things go well, they will be able to add in some extra features that will delight the customer as an unexpected bonus.

This lesson is also highly relevant to the Bill, as there is a risk of giving the impression to the public that more can be done quicker than may in fact be possible. Again, Amendment 1 helps us to stay grounded in a realistic set of goals once we put those core systems in place. The fundamental and revolutionary change here is that we will be insisting that platforms carry out risk assessments and share them with a regulator, who will then look to them to implement actions to mitigate those risks. That is fundamental. We must not lose sight of that core function and get distracted by some of the bells and whistles that are interesting, but which may take the regulator’s attention away from its core work.

We also need to consider what we mean by “safe” in the context of the Bill and the internet. An analogy that I have used in this context, which may be helpful, is to consider how we regulate travel by car and aeroplane. Our goal for air travel is zero accidents, and we regulate everything down to the nth degree: from the steps we need to take as passengers, such as passing through security and presenting identity documents, to detailed and exacting safety rules for the planes and pilots. With car travel, we have a much higher degree of freedom, being able to jump in our private vehicles and go where we want, when we want, pretty much without restrictions. Our goal for car travel is to make it incrementally safer over time; we can look back and see how regulation has evolved to make vehicles, roads and drivers safer year on year, and it continues to do so. Crucially, we do not expect car travel to be 100% safe, and we accept that there is a cost to this freedom to travel that, sadly, affects thousands of people each year, including my own family and, I am sure, many others in the House. There are lots of things we could do to make car travel even safer that we do not put into regulation, because we accept that the cost of restricting freedom to travel is too high.

Without over-labouring this analogy, I ask that we keep it in mind as we move through Committee—whether we are asking Ofcom to implement a car-like regime whereby it is expected to make continual improvements year on year as the state of online safety evolves, or we are advocating an aeroplane-like regime whereby any instance of harm will be seen as a failure by the regulator. The language in Amendment 1 points more towards a regime of incremental improvements, which I believe is the right one. It is in the public interest: people want to be safer online, but they also want the freedom to use a wide range of internet services without excessive government restriction, and they accept some risk in doing so.

I hope that the Minister will respond positively to the intent of Amendment 1 and that we can explore in this debate whether there is broad consensus on what we hope the Bill will achieve and how we expect Ofcom to go about its work. If there is not, then we should flush that out now to avoid later creating confused or contradictory rules based on different understandings of the Bill’s purpose. I will keep arguing throughout our proceedings for us to remain focused on giving the right goals to Ofcom and allowing it considerable discretion over the specific tools it needs, and for us to be realistic in our aims so that we do not overcommit and underdeliver.

Finally, the question of feature creep is very much up to us. There will be a temptation to add things into the Bill as it goes through. Some of those things are essential; I know that the noble Baroness, Lady Kidron, has some measures that I would also support. This is the right time to do that, but there will be other things that would be merely “nice to have”, and putting them in might detract from those core mechanisms. I hope we are able to maintain our discipline as we go through these proceedings to ensure we deliver the right objectives, which are incredibly well set out in Amendment 1, which I support.