All 29 Lord Parkinson of Whitley Bay contributions to the Online Safety Act 2023


Wed 1st Feb 2023
Wed 19th Apr 2023 - Online Safety Bill, Lords Chamber - Committee stage & Committee stage
Tue 25th Apr 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 1
Tue 25th Apr 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 2
Thu 27th Apr 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 1
Thu 27th Apr 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 2
Tue 2nd May 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 1
Tue 2nd May 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 2
Tue 9th May 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 1
Tue 9th May 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 2
Thu 11th May 2023
Tue 16th May 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 1
Tue 16th May 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 2
Tue 23rd May 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 1
Tue 23rd May 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 2
Thu 25th May 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 1
Thu 25th May 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 2
Thu 22nd Jun 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 1
Thu 22nd Jun 2023 - Online Safety Bill, Lords Chamber - Committee stage: Part 2
Thu 6th Jul 2023 - Online Safety Bill, Lords Chamber - Report stage: Part 1 & Report stage: Minutes of Proceedings
Thu 6th Jul 2023 - Online Safety Bill, Lords Chamber - Report stage: Part 2
Thu 6th Jul 2023 - Online Safety Bill, Lords Chamber - Report stage: Part 3
Mon 10th Jul 2023 - Online Safety Bill, Lords Chamber - Report stage: Part 1
Mon 10th Jul 2023 - Online Safety Bill, Lords Chamber - Report stage: Part 2
Wed 12th Jul 2023
Mon 17th Jul 2023
Wed 19th Jul 2023
Wed 6th Sep 2023
Tue 19th Sep 2023 - Online Safety Bill, Lords Chamber - Consideration of Commons amendments

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Moved by
Lord Parkinson of Whitley Bay

That the Bill be now read a second time.

The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I am very glad to be here to move the Second Reading of the Online Safety Bill. I know that this is a moment which has been long awaited in your Lordships’ House and noble Lords from across the House share the Government’s determination to make the online realm safer.

That is what this Bill seeks to do. As it stands, over three-quarters of adults in this country express concern about going online; similarly, the proportion of parents who feel that the benefits of their children being online outweigh the risks has fallen in recent years, from two-thirds in 2015 to barely over half in 2019. This is a terrible indictment of a means through which people of all ages are living increasing proportions of their lives, and it must change.

All of us have heard the horrific stories of children who have been exposed to dangerous and deeply harmful content online, and the tragic consequences of such experiences both for them and their families. I am very grateful to the noble Baroness, Lady Kidron, who arranged for a number of noble Lords, including me, to see some of the material which was pushed relentlessly at Molly Russell whose family have campaigned bravely and tirelessly to ensure that what happened to their daughter cannot happen to other young people. It is with that in mind, at the very outset of our scrutiny of this Bill, that I would like to express my gratitude to all those families who continue to fight for change and a safer, healthier online realm. Their work has been central to the development of this Bill. I am confident that, through it, the Government’s manifesto commitment to make the UK the safest place in the world to be online will be delivered.

This legislation establishes a regulatory regime which has safety at its heart. It is intended to change the mindset of technology companies so that they are forced to consider safety and risk mitigation when they begin to design their products, rather than as an afterthought.

All companies in scope will be required to tackle criminal content and activity online. If it is illegal offline, it is illegal online. All in-scope platforms and search services will need to consider in their risk assessments the likelihood of illegal content or activity taking place on their sites and put in place proportionate systems and processes to mitigate those risks. Companies will also have to take proactive measures against priority offences. This means platforms will be required to take proportionate steps to prevent people from encountering such content.

Not only that, but platforms will also need to mitigate the risk of the platform being used to facilitate or commit such an offence. Priority offences include, inter alia: terrorist material, child sexual abuse and exploitation, so-called revenge pornography and material encouraging or assisting suicide. In practice, this means that all in-scope platforms will have to remove this material quickly and will not be allowed to promote it in their algorithms.

Furthermore, for non-priority illegal content, platforms must have effective systems in place for its swift removal once this content has been flagged to them. Gone will be the days of lengthy and arduous complaints processes and platforms feigning ignorance of such content. They can and will be held to account.

As I have previously mentioned, the safety of children is of paramount importance in this Bill. While all users will be protected from illegal material, some types of legal content and activity are not suitable for children and can have a deeply damaging impact on their mental health and their developing sense of the world around them.

All in-scope services which are likely to be accessed by children will therefore be required to assess the risks to children on their service and put in place safety measures to protect child users from harmful and age-inappropriate content. This includes content such as that promoting suicide, self-harm or eating disorders which does not meet a criminal threshold; pornography; and damaging behaviour such as bullying.

The Bill will require providers specifically to consider a number of risk factors as part of their risk assessments. These factors include how functionalities such as algorithms could affect children’s exposure to content harmful to children on their service, as well as children’s use of higher risk features on the service such as livestreaming or private messaging. Providers will need to take robust steps to mitigate and effectively manage any risks identified.

Companies will need to use measures such as age verification to prevent children from accessing content which poses the highest risk of harm to them, such as online pornography. Ofcom will be able to set out its expectations about the use of age assurance solutions, including age verification tools, through guidance. This guidance will also be able to refer to relevant standards. The Bill also now makes it clear that providers may need to use age assurance to identify the age of their users to meet the necessary child safety duties and effectively enforce age restrictions on their service.

The Government will set out in secondary legislation the priority categories of content harmful to children so that all companies are clear on what they need to protect children from. Our intention is to have the regime in place as soon as possible after Royal Assent, while ensuring the necessary preparations are completed effectively and service providers understand clearly what is expected. We are working closely with Ofcom and I will keep noble Lords appraised.

My ministerial colleagues in another place worked hard to strengthen these provisions and made commitments to introduce further provisions in your Lordships’ House. With regard to increased protections for children specifically, the Government will bring forward amendments at Committee stage to name the Children’s Commissioner for England as a statutory consultee for Ofcom when it is preparing a code of practice, ensuring that the experience of children and young people is accounted for during implementation.

We will also bring forward amendments to specify that category 1 companies—the largest and most risky platforms—will be required to publish a summary of their risk assessments for both illegal content and material that is harmful to children. This will increase transparency about illegal and harmful content on in-scope services and ensure that Ofcom can do its job regulating effectively.

We recognise the great suffering experienced by many families linked to children’s exposure to harmful content and the importance of this Bill in ending that. We must learn from the horrific events from the past to secure a safe future for children online.

We also understand that, unfortunately, people of any age may experience online abuse. For many adults, the internet is a positive source of entertainment and information and a way to connect with others; for some, however, it can be an arena for awful abuse. The Bill will therefore offer adult users a triple shield of protection when online, striking the right balance between protecting the right of adult users to access legal content freely, and empowering adults with the information and tools to manage their own online experience.

First, as I have outlined, all social media firms and search services will need to tackle illegal content and activity on their sites. Secondly, the Bill will require category 1 services to set clear terms of service regarding the user-generated content they prohibit and/or restrict access to, and to enforce those terms of service effectively. All the major social media platforms such as Meta, Twitter and TikTok say that they ban abuse and harassment online. They all say they ban the promotion of violence and violent threats, yet this content is still easily visible on those sites. People sign up to these platforms expecting one environment, and are presented with something completely different. This must stop.

As well as ensuring the platforms have proper systems to remove banned content, the Bill will also put an end to services arbitrarily removing legal content. The largest platforms, category 1 services, must ensure that they remove or restrict access to content, or ban or suspend users, only where that is expressly allowed in their terms of service, or where they otherwise have a legal obligation to do so.

This Bill will make sure that adults have the information they need to make informed decisions about the sites they visit, and that platforms are held to their promises to users. Ofcom will have the power to hold platforms to their terms of service, creating a safer and more transparent environment for all.

Thirdly, category 1 services will have a duty to provide adults with tools they can use to reduce the likelihood that they encounter certain categories of content, if they so choose, or to alert them to the nature of that content. This includes content which encourages, promotes, or provides instructions for suicide, self-harm or eating disorders. People will also have the ability to filter out content from unverified users if they so wish. This Bill will mean that adult users will be empowered to make more informed choices about what services they use, and to have greater control over whom and what they engage with online.

It is impossible to speak about the aspects of the Bill which protect adults without, of course, mentioning freedom of expression. The Bill needs to strike a careful balance between protecting users online, while maintaining adults’ ability to have robust—even uncomfortable or unpleasant—conversations within the law if they so choose. Freedom of expression within the law is fundamental to our democracy, and it would not be right for the Government to interfere with what legal speech is permitted on private platforms. Instead, we have developed an approach based on choice and transparency for adult users, bounded by major platforms’ clear commercial incentives to provide a positive experience for their users.

Of course, we cannot have robust debate without being accurately informed of the current global and national landscape. That is why the Bill includes particular protections for recognised news publishers, content of democratic importance, and journalistic content. We have been clear that sanctioned news outlets such as RT, formerly Russia Today, must not benefit from these protections. We will therefore bring forward an amendment in your Lordships’ House explicitly to exclude entities subject to sanctions from the definition of a recognised news publisher.

Alongside the safety duties for children and the empowerment tools for adults, platforms must also have effective reporting and redress mechanisms in place. They will need to provide accessible and effective mechanisms for users to report content which is illegal or harmful, or where it breaches terms and conditions. Users will need to be given access to effective mechanisms to complain if content is removed without good reason.

The Bill will place a duty on platforms to ensure that those reporting mechanisms are backed up by timely and appropriate redress mechanisms. Currently, internet users often do not bother to report harmful content they encounter online, because they do not feel that their reports will be followed up. That too must change. If content has been unfairly removed, it should be reinstated. If content should not have been on the site in question, it should be taken down. If a complaint is not upheld, the reasons should be made clear to the person who made the report.

There have been calls—including from the noble Lord, Lord Stevenson of Balmacara, with whom I look forward to working constructively, as we have done heretofore—to use the Bill to create an online safety ombudsman. We will listen to all suggestions put forward to improve the Bill and the regime it ushers in with an open mind, but as he knows from our discussions, of this suggestion we are presently unconvinced. Ombudsman services in other sectors are expensive, often underused and primarily relate to complaints which result in financial compensation. We find it difficult to envisage how an ombudsman service could function in this area, where user complaints are likely to be complex and, in many cases, do not have the impetus of financial compensation behind them. Instead, the Bill ensures that, where providers’ user-reporting and redress mechanisms are not sufficient, Ofcom will have the power to take enforcement action and require the provider to improve its user-redress provisions to meet the standard required of them. I look forward to probing elements of the Bill such as this in Committee.

This regulatory framework could not be effective if Ofcom, as the independent regulator, did not have a robust suite of powers to take enforcement action against companies which do not comply with their new duties or which fail to take the appropriate steps to protect people from harm. I believe the chairman of Ofcom, the noble Lord, Lord Grade of Yarmouth, is in his place. I am glad that he has been and will be following our debates on this important matter.

Through the Bill, Ofcom will have wide-ranging information-gathering powers to request any information from companies which is relevant to its safety functions. Where necessary, it will be able to ask a suitably skilled person to undertake a report on a company’s activity—for example, on its use of algorithms. If Ofcom decides to take enforcement action, it can require companies to take specific steps to come back into compliance.

Ofcom will also have the power to impose substantial fines of up to £18 million, or 10% of annual qualifying worldwide revenue, whichever is higher. For the biggest technology companies, this could easily amount to billions of pounds. These are significant measures, and we have heard directly from companies that are already changing their safety procedures to ensure they comply with these regulations.

If fines are not sufficient, or not deemed appropriate because of the severity of the breach, Ofcom will be able to apply for a court order allowing it to undertake business disruption measures. This could be blocking access to a website or preventing it making money via payment or advertising services. Of course, Ofcom will be able to take enforcement action against any company that provides services to people in the UK, wherever that company is located. This is important, given the global nature of the internet.

As the Bill stands, individual senior managers can be held criminally liable and face a fine for failing to ensure their platform complies with Ofcom’s information notice. Further, individual senior managers can face jail, a fine or both for failing to prevent the platform committing the offences of providing false information, encrypting information or destroying information in response to an information notice.

The Government have also listened to and acknowledged the need for senior managers to be made personally liable for a wider range of failures of compliance. We have therefore committed to tabling an amendment in your Lordships’ House which will be carefully designed to capture instances where senior managers have consented to or connived in ignoring enforceable requirements, risking serious harm to children. We are carefully designing this amendment to ensure that it can hold senior managers to account for their actions regarding the safety of children, without jeopardising the UK’s attractiveness as a place for technology companies to invest in and grow. We intend to base our offence on similar legislation recently passed in the Republic of Ireland, as well as looking carefully at relevant precedent in other sectors in the United Kingdom.

I have discussed the safety of children, adults, and everyone’s right to free speech. It is not possible to talk about this Bill without also discussing its protections for women and girls, who we know are disproportionately affected by online abuse. As I mentioned, all services in scope will need to seek out and remove priority illegal content proactively. There are a number of offences which disproportionately affect women and girls, such as revenge pornography and cyberstalking, which the Bill requires companies to tackle as a priority.

To strengthen protections for women in particular, we will be listing controlling or coercive behaviour as a priority offence. Companies will have to take proactive measures to tackle this type of illegal content. We will also bring forward an amendment to name the Victims’ Commissioner and the domestic abuse commissioner as statutory consultees for the codes of practice. This means there will be a requirement for Ofcom to consult both commissioners ahead of drafting and amending the codes of practice, ensuring that victims, particularly victims and survivors of domestic abuse, are better protected. The Secretary of State and our colleagues have been clear that women’s and girls’ voices must be heard clearly in developing this legislation.

I also want to take this opportunity to acknowledge the concerns voiced over the powers for the Secretary of State regarding direction in relation to codes of practice that currently appear in the Bill. That is a matter on which my honourable friend Paul Scully and I were pressed by your Lordships’ Communications and Digital Committee when we appeared before it last week. As we explained then, we remain committed to ensuring that Ofcom maintains its regulatory independence, which is vital to the success of this framework. As we are introducing ground-breaking regulation, our aim is to balance the need for the regulator’s independence with appropriate oversight by Parliament and the elected Government.

We intend to bring forward two changes to the existing power: first, replacing the “public policy” wording with a defined list of reasons that a direction can be made; and secondly, making it clear that this element of the power can only be used in exceptional circumstances. I would like to reassure noble Lords—as I sought to reassure the Select Committee—that the framework ensures that Parliament will always have the final say on codes of practice, and that strong safeguards are in place to ensure that the use of this power is transparent and proportionate.

Before we begin our scrutiny in earnest, it is also necessary to recognise that this Bill is not just establishing a regulatory framework. It also updates the criminal law concerning communication offences. I want to thank the Law Commission for its important work in helping to strengthen criminal law for victims. The inclusion of the new offences for false and threatening communications offers further necessary protections for those who need it most. In addition, the Bill includes new offences to criminalise cyberflashing and epilepsy trolling. We firmly believe that these new offences will make a substantive difference to the victims of such behaviour. The Government have also committed to adding an additional offence to address the encouragement or assistance of self-harm communications and offences addressing intimate image abuse online, including deepfake pornography. Once these offences are introduced, all companies will need to treat this content as illegal under the framework and take action to prevent users from encountering it. These new offences will apply in respect of all victims of such activity, children as well as adults.

This Bill has been years in the making. I am proud to be standing here today as the debate begins in your Lordships’ House. I realise that noble Lords have been waiting long and patiently for this moment, but I know that they also appreciate that considerable work has already been done to ensure that this Bill is proportionate and fair, and that it provides the change that is needed.

A key part of that work was conducted by the Joint Committee, which conducted pre-legislative scrutiny of the Bill, drawing on expertise from across both Houses of Parliament, from all parties and none. I am very glad that all the Members of your Lordships’ House who served on that committee are speaking in today’s debate: the noble Baroness, Lady Kidron; the noble Lords, Lord Stevenson of Balmacara and Lord Knight of Weymouth, who have very helpfully been called to service on the Opposition Front Bench; the noble Lord, Lord Clement-Jones, who speaks for the Liberal Democrats; as well as my noble friends Lord Black of Brentwood and Lord Gilbert of Panteg.

While I look forward to the contributions of all Members of your Lordships’ House, and will continue the open-minded, collaborative approach established by my right honourable friend the Secretary of State and her predecessors—listening to all ideas which are advanced to make this Bill as effective as it can be—I urge noble Lords who are not yet so well-versed in its many clauses and provisions, or who might be disinclined to accept at first utterance the points I make from this Dispatch Box, to consult those noble Lords before bringing forward their amendments in later stages of the Bill. I say that not to discourage noble Lords from doing so, but in the spirit of ensuring that what they do bring forward, and our deliberations on them, will be pithy, focused, and conducive to making this Bill law as swiftly as possible. In that spirit, I shall draw my already too lengthy remarks to a close. I beg to move.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

My Lords, I am grateful to the very many noble Lords who have spoken this afternoon and this evening. They have spoken with passion—we heard that in the voices of so many—about their own experiences, the experiences of their families and the experiences of far too many of our fellow subjects, who have harrowing examples of the need for this Bill. But noble Lords have also spoken with cool-headed precision and forensic care about the aspects of the Bill that demand our careful scrutiny. Both hearts and heads are needed to make this Bill worth the wait.

I am very grateful for the strong consensus that has come through in noble Lords’ speeches on the need to make this Bill law and to do so quickly, and therefore to do our work of scrutiny diligently and speedily. I am grateful for the very generous and public-spirited offer the noble Lord, Lord Stevenson, has just issued. I, too, would like to make this not a party-political matter; it is not and has not been in the speeches we have heard today. The work of your Lordships’ House is to consider these matters in detail and without party politics intruding, and it would be very good if we could proceed on the basis of collaboration, co-operation and, on occasion, compromise.

In that spirit, I should say at the outset that I share the challenge faced by the noble Lords, Lord Clement-Jones and Lord Stevenson. Given that so many speakers have chosen to contribute, I will not be able to cover or acknowledge everyone who has spoken. I shall undoubtedly have to write on many of the issues to provide the technical detail that the matters they have raised deserve. It is my intention to write to noble Lords and invite them to join a series of meetings to look in depth at some of the themes and areas between now and Committee, so that as a group we can have well-informed discussions in Committee. I shall write with details suggesting some of those themes, and if noble Lords feel that I have missed any, or particular areas they would like to continue to talk about, please let me know and I will be happy to facilitate those.

I want to touch on a few of the issues raised today. I shall not repeat some of the points I made in my opening speech, given the hour. Many noble Lords raised the very troubling issue of children accessing pornography online, and I want to talk about that initially. The Government share the concerns raised about the lack of protections for children from this harmful and deeply unsuitable content. That is why the Bill introduces world-leading protections for children from online pornography. The Bill will cover all online sites offering pornography, including commercial pornography sites, social media, video-sharing platforms and fora, as well as search engines, which play a significant role in enabling children to access harmful and age-inappropriate content online. These companies will have to prevent children accessing pornography or face huge fines. To ensure that children are protected from this content, companies will need to put in place measures such as age verification, or demonstrate that the approach they are taking delivers the same level of protection for children.

While the Bill does not mandate that companies use specific technologies to comply with these new duties, in order to ensure that the Bill is properly future-proofed, we expect Ofcom to take a robust approach to sites which pose the highest risk of harm to children, including sites hosting online pornography. That may include directing the use of age verification technologies. Age verification is also referred to in the Bill. This is to make clear that these are measures that the Government expect to be used for complying with the duties under Part 3 and Part 5 to protect children from online pornography. Our intention is to have the regime operational as soon as possible after Royal Assent, while ensuring that the necessary preparations are completed effectively and that service providers understand what is expected of them. We are working very closely with Ofcom to ensure this.

The noble Lord, Lord Morrow, and others asked about putting age verification in the Bill more clearly, as was the case with the Digital Economy Act. The Online Safety Bill includes references to age assurance and age verification in the way I have just set out. That is to make clear that these are measures which the Government expect to be used for complying with the duties where proportionate to do so. While age assurance and age verification are referred to in the Bill, the Government do not mandate the use of specific approaches or technologies. That is similar to the approach taken in the Digital Economy Act, which did not mandate the use of a particular technology either.

I think my noble friend Lord Bethell prefers the definition of pornography in Part 3 of the Digital Economy Act. There is already a robust definition of “pornographic content” in this Bill which is more straightforward for providers and Ofcom to apply. That is important. The definition we have used is similar to the definition of pornographic content used in existing legislation such as the Coroners and Justice Act 2009. It is also in line with the approach being taken by Ofcom to regulate UK-established video-sharing platforms, meaning that the industry will already have familiarity with this definition and that Ofcom will already have experience in regulating content which meets this definition. That means it can take action more swiftly. However, I have heard the very large number of noble Lords who are inclined to support the work that my noble friend is doing in the amendments he has proposed. I am grateful for the time he has already dedicated to conversations with the Secretary of State and me on this and look forward to discussing it in more detail with him between now and Committee.

A number of noble Lords, including the noble Baronesses, Lady Finlay of Llandaff and Lady Kennedy of The Shaws, talked about algorithms. All platforms will need to undertake risk assessments for illegal content. Services likely to be accessed by children will need to undertake a children’s risk assessment to ensure they understand the risks associated with their services. That includes taking into account in particular the risk of algorithms used by their service. In addition, the Bill includes powers to ensure that Ofcom is able effectively to assess whether companies are fulfilling their regulatory requirements, including in relation to the operating of their algorithms. Ofcom will have the power to require information from companies about the operation of their algorithms and the power to investigate non-compliance as well as the power to interview employees. It will have the power to require regulated service providers to undergo a skilled persons report and to audit company systems and processes, including in relation to their algorithms.

The noble Baroness, Lady Kidron, rightly received many tributes for her years of work in relation to so many aspects of this Bill. She pressed me on bereaved parents’ access to data and, as she knows, it is a complex issue. I am very grateful to her for the time she has given to the meetings that the Secretary of State and I have had with her and with colleagues from the Ministry of Justice on this issue, which we continue to look at very carefully. We acknowledge the distress that some parents have indeed experienced in situations such as this and we will continue to work with her and the Ministry of Justice very carefully to assess this matter, mindful of its complexities which, of course, were something the Joint Committee grappled with as well.

The noble Baroness, Lady Featherstone, my noble friend Lady Wyld and others focused on the new cyberflashing offence and suggested that a consent-based approach would be preferable. The Law Commission looked at that in drawing up its proposals for action in this area. The Law Commission’s report raised concerns about the nature of consent in instant messaging conversations, particularly where there are misjudged attempts at humour or intimacy that could particularly affect young people. There is a risk, which we will want to explore in Committee, of overcriminalising young people. That is why the Government have brought forward proposals based on the Law Commission’s work. If noble Lords are finding it difficult to see the Law Commission’s reports, I am very happy to draw them to their attention so that they can benefit from the consultation and thought it conducted on this difficult issue.

The noble Baroness, Lady Gohir, talked about the impact on body image of edited images in advertising. Through its work on the online advertising programme, DCMS is considering how the Government should approach advertisements that contribute to body image concerns. A consultation on this programme closed in June 2022. We are currently analysing the responses to the consultation and developing policy. Where there is harmful user-generated content related to body image that risks having an adverse physical or psychological impact on children, the Online Safety Bill will require platforms to take action against that. Under the Bill’s existing risk assessment duties, regulated services are required to consider how media literacy can be used to mitigate harm for child users. That could include using content provenance technology, which can empower people to identify when content has been digitally altered in ways such as the noble Baroness mentioned.

A number of noble Lords focused on the changes made in relation to the so-called “legal but harmful” measures to ensure that adults have the tools they need to curate and control their experience online. In particular, noble Lords suggested that removing the requirement for companies to conduct risk assessments in relation to a list of priority content harmful to adults would reduce protections available for users. I do not agree with that assessment. The new duties will empower adult users to make informed choices about the services they use and to protect themselves on the largest platforms. The new duties will require the largest platforms to enforce all their terms of service regarding the moderation of user-generated content, not just the categories of content covered in a list in secondary legislation. The largest platforms already prohibit the most abusive and harmful content. Under the new duties, platforms will be required to keep their promises to users and take action to remove it.

There was rightly particular focus on vulnerable adult users. The noble Baronesses, Lady Hollins and Lady Campbell of Surbiton, and others spoke powerfully about that. The Bill will give vulnerable adult users, including people with disabilities, greater control over their online experience too. When using a category 1 service, they will be able to reduce their exposure to online abuse and hatred by having tools to limit the likelihood of their encountering such content or to alert them to the nature of it. They will also have greater control over content that promotes, encourages or provides instructions for suicide, self-harm and eating disorders. User reporting and redress provisions must be easy to access by all users, including people with a disability and adults with caring responsibilities who are providing assistance. Ofcom is of course subject to the public sector equality duty as well, so when performing its duties, including writing its codes of practice, it will need to take into account the ways in which people with protected characteristics, including people with disabilities, can be affected. I would be very happy to meet the noble Baronesses and others on this important matter.

The noble Lords, Lord Hastings of Scarisbrick and Lord Londesborough, and others talked about media literacy. The Government fully recognise the importance of that in achieving online safety. As well as ensuring that companies take action to keep users safe through this Bill, we are taking steps to educate and empower them to make safe and informed choices online. First, the Bill strengthens Ofcom’s existing media literacy functions. Media literacy is included in Ofcom’s new transparency reporting and information-gathering powers. In response to recommendations from the Joint Committee, the legislation also now specifies media literacy in the risk-assessment duties. In July 2021, DCMS published the online media literacy strategy, which sets out our ambition to improve national media literacy. We have committed to publishing annual action plans in each financial year until 2024-25, setting out our plans to deliver that. Furthermore, in December of that year, Ofcom published Ofcom’s Approach to Online Media Literacy, which includes an ambitious range of work focusing on media literacy.

Your Lordships’ House is, understandably, not generally enthusiastic about secondary legislation and secondary legislative powers, so I was grateful for the recognition by many tonight of the importance of providing for them in certain specific instances through this Bill. As the noble Lord, Lord Brooke of Alverthorpe, put it, there may be loopholes that Parliament wishes to close, and quickly. My noble friend Lord Inglewood spoke of the need for “living legislation”, and it is important to stress, as many have, that this Bill seeks to be technology-neutral—not specifying particular technological approaches that may quickly become obsolete—in order to cater for new threats and challenges as yet not envisaged. Some of those threats and challenges were alluded to in the powerful speech of my noble friend Lord Sarfraz. I know noble Lords will scrutinise those secondary powers carefully. I can tell my noble friend that the Bill does apply to companies that enable users to share content online or interact with each other, as well as to search services. That includes a broad range of services, including the metaverse. Where haptics enable user interaction, companies must take action. The Bill is also clear that content generated by bots is in scope where it interacts with user-generated content such as on Twitter, but not if the bot is controlled by or on behalf of the service, such as providing customer services for a particular site.

Given the range of secondary powers and the changing technological landscape, a number of noble Lords understandably focused on the need for post-legislative scrutiny. The Bill has undoubtedly benefited from pre-legislative scrutiny. As I said to my noble friend Lady Stowell of Beeston in her committee last week, we remain open-minded on the best way of doing that. We must ensure that once this regime is in force, it has the impact we all want it to have. Ongoing parliamentary scrutiny will be vital in ensuring that is the case. We do not intend to legislate for a new committee, not least because it is for Parliament itself to decide what committees it sets up. But I welcome further views on how we ensure that we have effective parliamentary scrutiny, and I look forward to discussing that in Committee. We have also made it very clear that the Secretary of State will undertake a review of the effectiveness of the regime between two and five years after it comes into force, producing a report that will then be laid in Parliament, thus providing a statutory opportunity for Parliament to scrutinise the effectiveness of the legislation.

My noble friend and other members of her committee followed up with a letter to me about the Secretary of State’s powers. I shall reply to that letter in detail and make that available to all noble Lords to see ahead of Committee. This is ground-breaking legislation, and we have to balance the need for regulatory independence with the appropriate oversight for Parliament and the Government. In particular, concerns were raised about the Secretary of State’s power of direction in Clause 39. Ofcom’s independence and expertise will be of utmost importance here, but the very broad nature of online harms means that there may be subjects that go beyond its expertise and remit as a regulator. That was echoed by Ofcom itself when giving evidence to the Joint Committee: it noted that there will clearly be some issues in respect of which the Government have access to expertise and information that the regulator does not, such as national security.

The framework in the Bill ensures that Parliament will always have the final say on codes of practice, and the use of the affirmative procedure will further ensure that there is an increased level of scrutiny in the exceptional cases where that element of the power is used. As I said, I know that we will look at that in detail in Committee.

My noble friend Lord Black of Brentwood, quoting Stanley Baldwin, talked about the protections for journalistic content. He and others are right that the free press is a cornerstone of British democracy; that is why the Bill has been designed to protect press and media freedom and why it includes robust provisions to ensure that people can continue to access diverse news sources online. Category 1 companies will have a new duty to safeguard all journalistic content shared on their platform, which includes citizen journalism. Platforms will need to put systems and processes in place to protect journalistic content, and they must enforce their terms of service consistently across all moderation and in relation to journalistic content. They will also need to put in place expedited appeals processes for producers of journalistic content.

The noble Baroness, Lady Anderson of Stoke-on-Trent, spoke powerfully about the appalling abuse and threats of violence she sustained in her democratic duties, and the noble Baroness, Lady Foster, spoke powerfully of the way in which that is putting off people, particularly women, from going into public life. The noble Baroness, Lady Anderson, asked about a specific issue: the automatic deletion of material and the implications for prosecution. We have been mindful of the scenario where malicious users post threatening content which they then delete themselves, and of the burden on services that retaining that information in bulk would cause. We have also been mindful of the imperative to ensure that illegal content cannot be shared and amplified online by being left there. The retention of data for law enforcement purposes is strictly regulated, particularly through the Investigatory Powers Act, which the noble Lord, Lord Anderson of Ipswich, is reviewing at the request of the Home Secretary. I suggest that the noble Baroness and I meet to speak about that in detail, mindful of that ongoing review and the need to bring people to justice.

The noble Baroness, Lady Chakrabarti, asked about sex for rent. Existing offences can be used to prosecute that practice, including Sections 52 and 53 of the Sexual Offences Act 2003, both of which are listed as priority offences in Schedule 7 to the Bill. As a result, all in-scope services must take proactive measures to prevent people being exposed to such content.

The noble Lord, Lord Davies of Brixton, and others talked about scams. The largest and most popular platforms and search engines—category 1 and category 2A services in the Bill—will have a duty to prevent paid-for fraudulent adverts appearing on their services, making it harder for fraudsters to advertise scams online. We know that that can be a particularly devastating crime. The online advertising programme builds on this duty in the Bill and will look at the role of the whole advertising system in relation to fraud, as well as the full gamut of other harms which are caused.

My noble friend Lady Fraser talked about the devolution aspects, which we will certainly look at. Internet services are a reserved matter for the UK Government. The list of priority offences in Schedule 7 can be updated only by the Secretary of State, subject to approval by this Parliament.

The right reverend Prelate the Bishop of Manchester asked about regulatory co-operation, and we recognise the importance of that. Ofcom has existing and strong relationships with other regulators, such as the ICO and the CMA, which has been supported and strengthened by the establishment of the Digital Regulation Cooperation Forum in 2020. We have used the Bill to strengthen Ofcom’s ability to work closely with, and to disclose information to, other regulatory bodies. Clause 104 ensures that Ofcom can do that, and the Bill also requires Ofcom to consult the Information Commissioner.

I do not want to go on at undue length—I am mindful of the fact that we will have detailed debates on all these issues and many more in Committee—but I wish to conclude by reiterating my thanks to all noble Lords, including the many who were not able to speak today but to whom I have already spoken outside the Chamber. They all continue to engage constructively with this legislation to ensure that it meets our shared objectives of protecting children and giving people a safe experience online. I look forward to working with noble Lords in that continued spirit.

My noble friend Lady Morgan of Cotes admitted to being one of the cavalcade of Secretaries of State who have worked on this Bill; I pay tribute to her work both in and out of office. I am pleased that my right honourable friend the Secretary of State was here to observe part of our debate today and, like all noble Lords, I am humbled that Ian Russell has been here to follow our debate in its entirety. The experience of his family and too many others must remain uppermost in our minds as we carry out our duty on the Bill before us; I know that it will be. We have an important task before us, and I look forward to getting to it.

Bill read a second time.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, let me start by saying how pleased I, too, am that we are now in Committee. I thank all noble Lords for giving up their time to attend the technical briefings that officials in my department and I have held since Second Reading and for the collaborative and constructive nature of their contributions in those discussions.

In particular, not least because today is his birthday, I pay tribute to the noble Lord, Lord Stevenson of Balmacara, for his tireless work on the Bill—from his involvement in its pre-legislative scrutiny to his recall to the Front Bench in order to see the job through. We are grateful for his diligence and, if I may say so, the constructive and collaborative way in which he has gone about it. He was right to pay tribute both to my noble friend Lord Gilbert of Panteg, who chaired the Joint Committee, and to the committee’s other members, including all the other signatories to this amendment. The Bill is a better one for their work, and I repeat my thanks to them for it. In that spirit, I am grateful to the noble Lord for bringing forward this philosophical opening amendment. As noble Lords have said, it is a helpful place for us to start and refocus our thoughts as we begin our line-by-line scrutiny of this Bill.

Although I agree with the noble Lord’s broad description of his amendment’s objectives, I am happy to respond to the challenge that lies behind it and put the objectives of this important legislation clearly on the record at the outset of our scrutiny. The Online Safety Bill seeks to bring about a significant change in online safety. The main purposes of the Bill are: to give the highest levels of protection to children; to protect users of all ages from being exposed to illegal content; to ensure that companies’ approach focuses on proactive risk management and safety by design; to protect people who face disproportionate harm online including, for instance, because of their sex or their ethnicity or because they are disabled; to maintain robust protections for freedom of expression and privacy; and to ensure that services are transparent and accountable.

The Bill will require companies to take stringent measures to tackle illegal content and protect children, with the highest protections in the Bill devoted to protecting children; as the noble Baroness, Lady Benjamin, my noble friend Lord Cormack and others have again reminded us today, that is paramount. Children’s safety is prioritised throughout this Bill. Not only will children be protected from illegal content through its illegal content duties but its child safety duties add an additional layer of protection so that children are protected from harmful or inappropriate content such as grooming, pornography and bullying. I look forward to contributions from the noble Baroness, Lady Kidron, and others who will, I know, make sure that our debates are properly focused on that.

Through their duties of care, all platforms will be required proactively to identify and manage risk factors associated with their services in order to ensure both that users do not encounter illegal content and that children are protected from harmful content. To achieve this, they will need to design their services to reduce the risk of harmful content or activity occurring and take swift action if it does.

Regulated services will need to prioritise responding to online content and activity that present the highest risk of harm to users, including where this is linked to something classified as a protected characteristic under the terms of the Equality Act 2010. This will ensure that platforms protect users who are disproportionately affected by online abuse—for example, women and girls. When undertaking child safety and illegal content risk assessments, providers must consider whether certain people face a greater risk of harm online and ensure that those risks are addressed and mitigated.

The Bill will place duties relating to freedom of expression and privacy on both Ofcom and all in-scope companies. Those companies will have to consider and implement safeguards for freedom of expression when fulfilling their duties. Ofcom will need to carry out its new duties in a way that protects freedom of expression. The largest services will also have specific duties to protect democratic and journalistic content.

Ensuring that services are transparent about the risks on their services and the actions they are taking to address them is integral to this Bill. User-to-user services must set out in their terms of service how they are complying with their illegal and child safety duties. Search services must do the same in public statements. In addition, government amendments that we tabled yesterday will require the biggest platforms to publish summaries of their illegal and their child safety risk assessments, increasing transparency and accountability, and Ofcom will have a power to require information from companies to assess their compliance with providers’ duties.

Finally, the Bill will also increase transparency and accountability relating to platforms with the greatest influence over public discourse. They will be required to ensure that their terms of service are clear and properly enforced. Users will be able to hold platforms accountable if they fail to enforce those terms.

The noble Baroness, Lady Kidron, asked me to say which of the proposed new paragraphs (a) to (g), to be inserted by Amendment 1, are not the objectives of this Bill. Paragraph (a) sets out that the Bill must ensure that services

“do not endanger public health or national security”.

The Bill will certainly have a positive impact on national security, and a core objective of the Bill is to ensure that platforms are not used to facilitate terrorism. Ofcom will issue a stand-alone code on terrorism, setting out how companies can reduce the risk of their services being used to facilitate terrorist offences, and remove such content swiftly if it appears. Companies will also need to tackle the new foreign interference offence as a priority offence. This will ensure that the Bill captures state-sponsored disinformation, which is of most concern—that is, attempts by foreign state actors to manipulate information to interfere in our society and undermine our democratic, political and legal processes.

The Bill will also have a positive impact on public health but I must respectfully say that that is not a primary objective of the legislation. In circumstances where there is a significant threat to public health, the Bill already provides powers for the Secretary of State both to require Ofcom to prioritise specified objectives when carrying out its media literacy activity and to require companies to report on the action they are taking to address the threat. Although the Bill may lead to additional improvements—I am sure that we all want to see them—for instance, by increasing transparency about platforms’ terms of service relating to public health issues, making this a primary objective on a par with the others mentioned in the noble Lord’s amendment risks making the Bill much broader and more unmanageable. It is also extremely challenging to prohibit such content, where it is viewed by adults, without inadvertently capturing useful health advice or legitimate debate and undermining the fundamental objective of protecting freedom of expression online—a point to which I am sure we will return.

The noble Lord’s amendment therefore reiterates many objectives that are interwoven throughout the legislation. I am happy to say again on the record that I agree with the general aims it proposes, but I must say that accepting it would be more difficult than the noble Lord and others who have spoken to it have set out. Accepting this amendment, or one like it, would create legal uncertainty. I have discussed with the officials sitting in the Box—the noble Baroness, Lady Chakrabarti, rightly paid tribute to them—the ways in which such a purposive statement, as the noble Lord suggests, could be made; we discussed it between Second Reading and now.

I appreciate the care and thought with which the noble Lord has gone about this—mindful of international good practice in legislation and through discussion with the Public Bill Office and others, to whom he rightly paid tribute—but any deviation from the substantive provisions of the Bill and the injection of new terminology risk creating uncertainty about the proper interpretation and application of those provisions. We have heard that again today; for example, the noble Baroness, Lady Fox, said that she was not clear what the meaning of certain words may be while my noble friend Lady Stowell made a plea for simplicity in legislation. The noble Lord, Lord Griffiths, also gave an eloquent exposition of the lexicographical befuddlement that can ensue when new words are added. All pointed to some confusion; indeed, there have been areas of disagreement even in what I am sure the noble Lord, Lord Stevenson, thinks was a very consensual summary of the purposes of the Bill.

That legal uncertainty could provide the basis for an increased number of judicial reviews or challenges to the decisions taken under the Bill and its framework, creating significant obstacles to the swift and effective implementation of the new regulatory framework, which I know is not something that he or other noble Lords would want. As noble Lords have noted, this is a complicated Bill, but adding further statements and new terminology to it, for however laudable a reason, risks adding to that complication, which can only benefit those with, as the noble Baroness, Lady Kidron, put it, the deepest pockets.

However, lest he think that I and the Government have not listened to his pleas or those of the Joint Committee, I highlight, as my noble friend Lady Stowell did, that the Joint Committee’s original recommendation was that these objectives

“should be for Ofcom”.

The Government took that up in Schedule 4 to the Bill, and in Clause 82(4), which set out objectives for the codes and for Ofcom respectively. At Clause 82(4) the noble Lord will see the reference to

“the risk of harm to citizens presented by content on regulated services”

and

“the need for a higher level of protection for children than for adults”.

I agree with the noble Baroness, Lady Chakrabarti, that it is not impossible to add purposive statements to Bills and nor is it unprecedented. I echo her tribute to the officials and lawyers in government who have worked on this Bill and given considerable thought to it. She has had the benefit of sharing their experience and the difficulties of writing tightly worded legislation. In different moments of her career, she has also had the benefit of picking at the loose threads in legislation and poking at the holes in it. That is the purpose of lawyers who question the thoroughness with which we have all done our work. I will not call them “pesky lawyers”, as she did—but I did hear her say it. I understand the point that she was making in anticipation but reassure her that she has not pre-empted the points that I was going to make.

To the layperson, legislation is difficult to understand, which is why we publish Explanatory Notes, on which the noble Baroness and others may have had experience of working before. I encourage noble Lords, not just today but as we go through our deliberations, to consult those as well. I hope that noble Lords will agree that they are more easily understood, but if they do not do what they say and provide explanation, I will be very willing to listen to their thoughts on it.

So, while I am not going to give the noble Lord, Lord Stevenson, the birthday present of accepting his amendment, I hope that the clear statement that I gave at the outset from this Dispatch Box, which is purposive as well, about the objectives of the Bill, and my outline of how it tries to achieve them, is a sufficient public statement of our intent, and that it achieves what I hope he was intending to get on the record today. I invite him to withdraw his amendment.

Lord Stevenson of Balmacara (Lab)

Well, my Lords, it has been a very good debate, and we should be grateful for that. In some senses, I should bank that; we have got ourselves off to a good start for the subsequent debates and discussions that we will have on the nearly 310 amendments that we must get through before the end of the process that we have set out on.

However, let us pause for a second. I very much appreciated the response, not least because it was very sharp and very focused on the amendment. It would have been tempting to go wider and wider, and I am sure that the Minister had that in mind at some point, but he has not done that. The first substantial point that he made seemed to be a one-pager about what this Bill is about. Suitably edited and brought down to manageable size, it would fit quite well into the Bill. I am therefore a bit puzzled as to why he cannot make the jump, intellectually or otherwise, from having that written for him and presumably working on it late at night with candles so that it was perfect—because it was pretty good; I will read it very carefully in Hansard, but it seemed to say everything that I wanted to say and covered most of the points that everybody else thought of to say, in a way that would provide clarity for those seeking it.

The issue we are left with was touched on by the noble Baroness, Lady Stowell, in her very perceptive remarks. Have we got this pointing in the right direction? We should think about it as a way for the Government to get out of this slightly ridiculous shorthand of the safest place to be online, to a statement to themselves about what they are trying to do, rather than an instruction to Ofcom—because that is where it gets difficult and causes problems with the later stages. This is really Parliament and government agreeing to say this, in print, rather than just through reading Hansard. That then reaches back to where my noble friend Lady Chakrabarti is, and it helps the noble Baroness, Lady Harding, with her very good point, that this will not work if people do not even bother to get through the first page.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Finally, let me say this in anticipation of the Minister perhaps suggesting that this might be a good idea but we are far down the road with the Bill and Ofcom is ready to go and we want to get on with implementing it, so maybe let us not do this now but perhaps in another piece of legislation. Personally, I am interested in having a conversation about the sequence of implementation. It might be that we can implement the regime that Ofcom is good to go on but with the powers there in the Bill for it to cover app stores and some other wider internet services, according to a road map that it sets out and that we in Parliament can scrutinise. However, my general message is, as the noble Baroness, Lady Kidron, said, that we should get this right in this legislation and grab the opportunity, particularly with app stores, to bring other internet services in—given that we consume so much through applications—and to provide a safer environment for our children.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I share noble Lords’ determination to deliver the strongest protections for children and to develop a robust and future-proofed regulatory regime. However, it will not be possible to solve every problem on the internet through this Bill, nor through any piece of legislation, flagship or otherwise. The Bill has been designed to confer duties on the services that pose the greatest risk of harm—user-to-user services and search services—and where there are proportionate measures that companies can take to protect their users.

As the noble Baroness, Lady Kidron, and others anticipated, I must say that these services act as a gateway for users to discover and access other online content through search results and links shared on social media. Conferring duties on these services will therefore significantly reduce the risk of users going on to access illegal or harmful content on non-regulated services, while keeping the scope of the Bill manageable and enforceable.

As noble Lords anticipated, there is also a practical consideration for Ofcom in all this. I know that many noble Lords are extremely keen to see this Bill implemented as swiftly as possible; so am I. However, as the noble Lord, Lord Allan, rightly pointed out, making major changes to the Bill’s scope at this stage would have significant implications for Ofcom’s implementation timelines. I say this at the outset because I want to make sure that noble Lords are aware of those implications as we look at these issues.

I turn first to Amendments 2, 3, 5, 92 and 193, tabled by the noble Baroness, Lady Kidron. These aim to expand the number of services covered by the Bill to incorporate a broader range of services accessed by children and a broader range of harms. I will cover the broader range of harms more fully in a separate debate when we come to Amendment 93, but I am very grateful to the noble Baroness for her constructive and detailed discussions on these issues over the past few weeks and months.

These amendments would bring new services into scope of the duties beyond user-to-user and search services. This could include services which enable or promote commercial harms, including consumer businesses such as online retailers. As I have just mentioned in relation to the previous amendments, bringing many more services into scope would delay the implementation of Ofcom’s priorities and risk detracting from its work overseeing existing regulated services where the greatest risk of harm exists—we are talking here about the services run by about 2.5 million businesses in the UK alone. I hope noble Lords will appreciate from the recent communications from Ofcom how challenging the implementation timelines already are, without adding further complication.

Amendment 92 seeks to change the child-user condition in the children’s access assessment to the test in the age-appropriate design code. The test in the Bill is already aligned with the test in that code, which determines whether a service is likely to be accessed by children, in order to ensure consistency for providers. The current child-user condition determines that a service is likely to be accessed by children where it has a significant number or proportion of child users, or where it is of a kind likely to attract a significant number or proportion of child users. This will already bring into scope services of the kind set out in this amendment, such as those which are designed or intended for use by children, or where children form a—

Baroness Kidron (CB)

I am sorry to interrupt. Will the Minister take the opportunity to say what “significant” means, because that is not aligned with the ICO code, which has different criteria?

Lord Parkinson of Whitley Bay (Con)

If I can finish my point, this will bring into scope services of the kind set out in the amendments, such as those designed or intended for use by children, or where children form a substantive and identifiable user group. The current condition also considers the nature and content of the service and whether it has a particular appeal for children. Ofcom will be required to consult the Information Commissioner’s Office on its guidance to providers on fulfilling this test, which will further support alignment between the Bill and the age-appropriate design code.

On the meaning of “significant”, a significant number of children means a significant number in itself or a significant proportion of the total number of UK-based users on the service. In the Bill, “significant” has its ordinary meaning, and there are many precedents for it in legislation. Ofcom will be required to produce and publish guidance for providers on how to make the children’s access assessment. Crucially, the test in the Bill provides more legal certainty and clarity for providers than the test outlined in the code. “Substantive” and “identifiable”, as suggested in this amendment, do not have such a clear legal meaning, so this amendment would give rise to the risk that the condition is more open to challenge from providers and more difficult to enforce. On the other hand, as I said, “significant” has an established precedent in legislation, making it easier for Ofcom, providers and the courts to interpret.

The noble Lord, Lord Knight, talked about the importance of future-proofing the Bill and emerging technologies. As he knows, the Bill has been designed to be technology neutral and future-proofed, to ensure that it keeps pace with emerging technologies. It will apply to companies which enable users to share content online or to interact with each other, as well as to search services. Search services using AI-powered features will be in scope of the search duties. The Bill is also clear that content generated by AI bots is in scope where it interacts with user-generated content, such as bots on Twitter. The metaverse is also in scope of the Bill. Any service which enables users to interact as the metaverse does will have to conduct a child access test and comply with the child safety duties if it is likely to be accessed by children.

Lord Knight of Weymouth (Lab)

I know it has been said that the large language models, such as that used by ChatGPT, will be in scope when they are embedded in search, but are they in scope generally?

Lord Parkinson of Whitley Bay (Con)

They are in scope where they are used by companies that enable users to share content online and interact with each other, or in the context of search. They apply in the context of the other duties set out in the Bill.

Amendments 19, 22, 298 and 299, tabled by my noble friend Lady Harding of Winscombe, seek to impose child safety duties on application stores. I am grateful to my noble friend and others for the collaborative approach that they have shown and for the time that they have dedicated to discussing this issue since Second Reading. I appreciate that she has tabled these amendments in the spirit of facilitating a conversation, which I am willing to continue to have as the Bill progresses.

As my noble friend knows from our discussions, there are challenges with bringing application stores—or “app stores” as they are popularly called—into the scope of the Bill. Introducing new duties on such stores at this stage risks slowing the implementation of the existing child safety duties, in the way that I have just outlined. App stores operate differently from user-to-user and search services; they pose different levels of risk and play a different role in users’ experiences online. Ofcom would therefore need to recruit different people, or bring in new expertise, to supervise effectively a substantially different regime. That would take time and resources away from its existing priorities.

We do not think that that would be a worthwhile new route for Ofcom, given that placing child safety duties on app stores is unlikely to deliver any additional protections for children using services that are already in the scope of the Bill. Those services must already comply with their duties to keep children safe or will face enforcement action if they do not. If companies do not comply, Ofcom can rely on its existing enforcement powers to require app stores to remove applications that are harmful to children. I am happy to continue to discuss this matter with my noble friend and the noble Lord, Lord Knight, in the context of the differing implementation timelines, as he has asked.

Lord Stevenson of Balmacara (Lab)

The Minister just said something that was material to this debate. He said that Ofcom has existing powers to prevent app stores from providing material that would have caused problems for the services to which they allow access. Can he confirm that?

Lord Parkinson of Whitley Bay (Con)

Perhaps the noble Lord could clarify his question; I was too busy finishing my answer to the noble Lord, Lord Knight.

Lord Stevenson of Balmacara (Lab)

It is a continuation of the point raised by the noble Baroness, Lady Harding, and it seems that it will go part of the way towards resolving the differences that remain between the Minister and the noble Baroness, which I hope can be bridged. Let me put it this way: is it the case that Ofcom either now has powers or will have powers, as a result of the Bill, to require app stores to stop supplying children with material that is deemed in breach of the law? That may be the basis for understanding how you can get through this. Is that right?

Lord Parkinson of Whitley Bay (Con)

Services already have to comply with their duties to keep children safe. If they do not comply, Ofcom has enforcement powers set out in the Bill which it can use to require app stores to remove applications that are harmful to children. We think this already addresses the point, but I am happy to continue discussing it offline with the noble Lord, my noble friend and others who want to explore how. As I say, we think this is already covered. A more general duty here would risk distracting from Ofcom’s existing priorities.

Lord Allan of Hallam (LD)

My Lords, on that point, my reading of Clauses 131 to 135, where the Bill sets out the business disruption measures, is that they could be used precisely in that way. It would be helpful for the Minister responding later to clarify that Ofcom would use those business disruption measures, as the Government explicitly anticipate, were an app store, in a rogue way, to continue to list a service that Ofcom has said should not be made available to people in the United Kingdom.

Lord Parkinson of Whitley Bay (Con)

I will be very happy to set that out in more detail.

Amendments 33A and 217A in the name of the noble Lord, Lord Storey, would place a new duty on user-to-user services that predominantly enable online gaming. Specifically, they would require them to have a classification certificate stating the age group for which they are suitable. We do not think that is necessary, given that there is already widespread, voluntary uptake of approval classification systems in online gaming.

--- Later in debate ---
Baroness Merron (Lab)

My Lords, it has certainly been an interesting debate, and I am grateful to noble Lords on all sides of the Committee for their contributions and considerations. I particularly thank the noble Lords who tabled the amendments which have shaped the debate today.

In general, on these Benches, we believe that the Bill offers a proportionate approach to tackling online harms. We feel that granting some of the exemptions proposed in this group would be unintentionally counterproductive and would raise some unforeseen difficulties. The key here—and it has been raised by a number of noble Lords, including the noble Baronesses, Lady Harding and Lady Kidron, and, just now, the noble Lord, Lord Clement-Jones, who talked about the wider considerations of the Joint Committee and factors that should be taken into account—is that we endorse a risk-based approach. In this debate, it is very important that we take ourselves back to that, because that is the key.

My view is that using other factors, such as funding sources or volunteer engagement in moderation, cuts right across this risk-based approach. To refer to Amendment 4, it is absolutely the case that platforms with fewer than 1 million UK monthly users have scope to create considerable harm. Indeed, noble Lords will have seen that later amendments call for certain small platforms to be categorised on the basis of the risk—and that is the important word—that they engender, rather than the size of the platform, which, unfortunately, is something of a crude measure. The point that I want to make to the noble Baroness, Lady Fox, is that it is not about the size of the businesses and how they are categorised but what they actually do. The noble Baroness, Lady Kidron, rightly said that small is not safe, for all the reasons that were explained, including by the noble Baroness, Lady Harding.

Amendment 9 would exempt small and medium-sized enterprises and certain other organisations from most of the Bill’s provisions. I am in no doubt about the well-meaning nature of this amendment, tabled by the noble Lord, Lord Moylan, and supported by the noble Lord, Lord Vaizey. Indeed, there may well be an issue about how start-ups and entrepreneur unicorns cope with the regulatory framework. We should attend to that, and I am sure that the Minister will have something to say about it. But I also expect that the Minister will outline why this would actually be unhelpful in combating many of the issues that this Bill is fundamentally designed to deal with if we were to go down the road of these exclusions.

In particular, granting exemptions simply on the basis of a service’s size could lead to a situation where user numbers are capped or perhaps even where platforms are deliberately broken up to avoid regulation. This would have an effect that none of us in this Chamber would want to see because it would embed harmful content and behaviour rather than helping to reduce them.

Referring back to the comments of the noble Lord, Lord Moylan, I agree with the noble Lord, Lord Vaizey, in his reflection. I, too, have not experienced the two sides of the Chamber that the noble Lord, Lord Moylan, described. I feel that the Chamber has always been united on the matter of child safety and in understanding the ramifications for business. It is the case that good legislation must always seek a balance, but, to go back to the point about excluding small and medium-sized enterprises, to call them a major part of the British economy is a bit of an understatement when they account for 99.9% of the business population. In respect of the exclusion of community-based services, including Wikipedia—and we will return to this in the next group—there is nothing for platforms to fear if they have appropriate systems in place. Indeed, there are many gains to be had for community-based services such as Wikipedia from being inside the system. I look forward to the further debate that we will have on that.

I turn to Amendment 9A in the name of my noble friend Lord Knight of Weymouth, who is unable to participate in this section of the debate. It probes how the Bill’s measures would apply to specialised search services. Metasearch engines such as Skyscanner have expressed concern that the legislation might impose unnecessary burdens on services that pose little risk of hosting the illegal content targeted by the Bill. Perhaps the Minister, in his response, could confirm whether or not such search engines are in scope. That would perhaps be helpful to our deliberations today.

While we on these Benches are not generally supportive of exemptions, the reality is that there are a number of online search services that return content that would not ordinarily be considered harmful. Sites such as Skyscanner and Expedia, as we all know, allow people to search for and book flights and other travel services such as car hire. Obviously, as long as appropriate due diligence is carried out on partners and travel agents, the scope for users to encounter illegal or harmful material appears to be minimal and returns us to the point of having a risk-based approach. We are not necessarily advocating for a carve-out from the Bill, but it would perhaps be helpful to our deliberations if the Minister could outline how such platforms will be expected to interact with the Ofcom-run online safety regime.

Lord Parkinson of Whitley Bay (Con)

My Lords, I am sympathetic to arguments that we must avoid imposing disproportionate burdens on regulated services, but I cannot accept the amendments tabled by the noble Baroness, Lady Fox, and others. Doing so would greatly reduce the strong protections that the Bill offers to internet users, particularly to children. I agree with the noble Baroness, Lady Merron, that that has long been the shared focus across your Lordships’ House as we seek to strike the right balance through the Bill. I hope to reassure noble Lords about the justification for the existing balance and scope, and the safeguards built in to prevent undue burdens to business.

I will start with the amendments tabled by the noble Baroness, Lady Fox of Buckley—Amendments 4, 6 to 8, 12, 288 and 305—which would significantly narrow the definition of services in scope of regulation. The current scope of the Bill reflects evidence of where harm is manifested online. There is clear evidence that smaller services can pose a significant risk of harm from illegal content, as well as to children, as the noble Baroness, Lady Kidron, rightly echoed. Moreover, harmful content and activity often range across a number of services. While illegal content or activity may originate on larger platforms, offenders often seek to move to smaller platforms with less effective systems for tackling criminal activity in order to circumvent those protections. Exempting smaller services from regulation would likely accelerate that process, resulting in illegal content being displaced on to smaller services, putting users at risk.

These amendments would create significant new loopholes in regulation. Rather than relying on platforms and search services to identify and manage risk proactively, they would require Ofcom to monitor smaller harmful services, which would further annoy my noble friend Lord Moylan. Let me reassure the noble Baroness, however, that the Bill has been designed to avoid disproportionate or unnecessary burdens on smaller services. All duties on services are proportionate to the risk of harm and the capacity of companies. This means that small, low-risk services will have minimal duties imposed on them. Ofcom’s guidance and codes of practice will set out how they can comply with their duties, in a way that I hope is even clearer than the Explanatory Notes to the Bill, but certainly allowing for companies to have a conversation and ask for areas of clarification, if that is still needed. They will ensure that low-risk services do not have to undertake unnecessary measures if they do not pose a risk of harm to their users.

--- Later in debate ---
Lord Moylan (Con)

My Lords, while my noble friend is talking about the possibility of excessive and disproportionate burden on businesses, can I just ask him about the possibility of excessive and disproportionate burden on the regulator? He seems to be saying that Ofcom is going to have to maintain, and keep up to date regularly, 25,000 risk assessments—this is on the Government’s own assessment, produced 15 months ago, of the state of the market then—even if those assessments carried out by Ofcom result in very little consequence for the regulated entity.

We know from regulation in this country that regulators already cannot cope with the burdens placed on them. They become inefficient, sclerotic and unresponsive; they have difficulty in recruiting staff of the same level and skills as the entities that they regulate. We have a Financial Services and Markets Bill going through at the moment, and the FCA is a very good example of that. Do we really think that this is a sensible burden to place on a regulator that is actually able to discharge it?

Lord Parkinson of Whitley Bay (Con)

The Bill creates a substantial new role for Ofcom, but it has already substantially recruited and prepared for the effective carrying out of that new duty. I do not know whether my noble friend was in some of the briefings with officials from Ofcom, but it is very happy to set out directly the ways in which it is already discharging, or preparing to discharge, those duties. The Government have provided it with further resource to enable it to do so. It may be helpful for my noble friend to have some of those discussions directly with the regulator, but we are confident that it is ready to discharge its duties, as set out in the Bill.

I was about to say that we have already had a bit of discussion on Wikipedia. I am conscious that we are going to touch on it again in the debate on the next group of amendments so, at the risk of being marked down for repetition, which is a black mark on that platform, I shall not pre-empt what I will say shortly. But I emphasise that the Bill does not impose prescriptive, one-size-fits-all duties on services. The codes of practice from Ofcom will set out a range of measures that are appropriate for different types of services in scope. Companies can follow their own routes to compliance, so long as they are confident that they are effectively managing risks associated with illegal content and, where relevant, harm to children. That will ensure that services that already use community moderation effectively can continue to do so—such as Wikipedia, which successfully uses that to moderate content. As I say, we will touch on that more in the debate on the next group.

Amendment 9, in the name of my noble friend Lord Moylan, is designed to exempt small and medium-sized enterprises working to benefit the public from the scope of the Bill. Again, I am sympathetic to the objective of ensuring that the Bill does not impose undue burdens on small businesses, and particularly that it should not inhibit services from providing valuable content of public benefit, but I do not think it would be feasible to exempt service providers deemed to be

“working to benefit the public”.

I appreciate that this is a probing amendment, but the wording that my noble friend has alighted on highlights the difficulties of finding something suitably precise and not contestable. It would be challenging to identify which services should qualify for such an exemption.

Taking small services out of scope would significantly undermine the framework established by the Bill, as we know that many smaller services host illegal content and pose a threat to children. Again, let me reassure noble Lords that the Bill has been designed to avoid disproportionate or unnecessary regulatory burdens on small and low-risk services. It will not impose a disproportionate burden on services or impede users’ access to valuable content on smaller services.

Amendment 9A in the name of the noble Lord, Lord Knight of Weymouth, is designed to exempt “sector specific search services” from the scope of the Bill, as the noble Baroness, Lady Merron, explained. Again, I am sympathetic to the intention here of ensuring that the Bill does not impose a disproportionate burden on services, but this is another amendment that is not needed as it would exempt search services that may pose a significant risk of harm to children, or because of illegal content on them. The amendment aims to exempt specialised search services—that is, those that allow users to

“search for … products or services … in a particular sector”.

It would exempt specialised search services that could cause harm to children or host illegal content—for example, pornographic search services or commercial search services that could facilitate online fraud. I know the noble Lord would not want to see that.

The regulatory duties apply only where there is a significant risk of harm and the scope has been designed to exclude low-risk search services. The duties therefore do not apply to search engines that search a single database or website, for example those of many retailers or other commercial websites. Even where a search service is in scope, the duties on services are proportionate to the risk of harm that they pose to users, as well as to a company’s size and capacity. Low-risk services, for example, will have minimal duties. Ofcom will ensure that these services can quickly and easily comply by publishing risk profiles for low-risk services, enabling them easily to understand their risk levels and, if necessary, take steps to mitigate them.

The noble Lord, Lord McCrea, asked some questions about the 200 most popular pornographic websites. If I may, I will respond to the questions he posed, along with others that I am sure will come in the debate on the fifth group, when we debate the amendments in the names of the noble Lord, Lord Morrow, and the noble Baroness, Lady Ritchie of Downpatrick, because that will take us on to the same territory.

I hope that provides some assurance to my noble friend Lord Moylan, the noble Baroness, Lady Fox, and others, and that they will be willing not to press their amendments in this group.

Baroness Fox of Buckley (Non-Afl)

My Lords, I thank people for such a wide-ranging and interesting set of contributions. I take comfort from the fact that so many people understood what the amendments were trying to do, even if they did not fully succeed in that. I thought it was quite interesting that in the first debate the noble Lord, Lord Allan of Hallam, said that he might be a bit isolated on the apps, but I actually agreed with him—which might not do his reputation any good. However, when he said that, I thought, “Welcome to my world”, so I am quite pleased that this has not all been shot down in flames before we started. My amendment really was a serious attempt to tackle something that is a real problem.

The Minister says that the Bill is designed to avoid disproportionate burdens on services. All I can say is, “Sack the designer”. It is absolutely going to have a disproportionate burden on a wide range of small services, which will not be able to cope, and that is why so many of them are worried about it. Some 80% of the companies that will be caught up in this red tape are small and micro-businesses. I will come to the small business point in a moment.

The noble Baroness, Lady Harding, warned us that small tech businesses become big tech businesses. As far as I am concerned, that is a success story—it is what I want; is it not what we all want? Personally, I think economic development and growth is a positive thing—I do not want them to fail. However, I do not think it will ever happen; I do not think that small tech businesses will ever grow into big tech businesses if they face a disproportionate burden in the regulatory sense, as I have tried to describe. That is what I am worried about, and it is not a positive thing to be celebrated.

I stress that it is not small tech and big tech. There are also community sites, based on collective moderation. Wikipedia has had a lot of discussion here. For a Bill that stresses that it wants to empower users, we should think about what it means when these user-moderated community sites are telling us that they will not be able to carry on and get through. That is what they are saying. It was interesting that the noble Lord, Lord Clement-Jones, said that he relies on Wikipedia—many of us do, although please do not believe what it says about me. There are all of these things, but then there was a feeling that, well, Reddit is a bit dodgy. The Bill is not meant to be deciding which ones to trust in quite that way, or people’s tastes.

I was struck that the noble Baroness, Lady Kidron, said that small is not safe, and used the incel example. I am not emphasising that small is safe; I am saying that the small entities will not survive this process. That is my fear. I do not mean that the big ones are nasty and dangerous and the small ones are cosy, lovely and Wikipedia-like. I am suggesting that smaller entities will not be able to survive the regulatory onslaught. That is the main reason I raised this.

The noble Baroness, Lady Merron, said that these entities can cause great harm. I am worried about a culture of fear, in which we demonise tens of thousands of innocent tech businesses and communities and end up destroying them when we do not intend to. I tried to put in the amendment an ability for Ofcom, if there are problematic sites that are risky, to deal with them. As the Minister kept saying, low-risk search engines have been exempted. I am suggesting that low-risk small and micro-businesses are exempted, which is the majority of them. That is what I am suggesting, rather than that we assume they are all guilty and then they have to get exempted.

Interestingly, the noble Lord, Lord McCrea, asked how many pornography sites are in scope and which pornographic websites have a million or fewer users. I am glad I do not know the answer to that, otherwise people might wonder why I did. The point is that there are always going to be sites that are threatening or a risk to children, as we are discussing. But we must always bear in mind—this was the important point that the noble Lord, Lord Moylan, made—that in our absolute determination to protect children via this Bill we do not unintentionally damage society as a whole. Adult access to free speech, for example, is one of my concerns, as are businesses and so on. We should not have that as an outcome.

--- Later in debate ---
Lord Stevenson of Balmacara (Lab)

Like others, I had prepared quite extensive notes to respond to what I thought the noble Lord was going to say about his amendments in this group, and I have not been able to find anything left that I can use, so I am going to have to extemporise slightly. I think it is very helpful to have a little non-focused discussion about what we are about to talk about in terms of age, because there is a snare and a delusion in quite a lot of it. I was put in mind of that in the discussions on the Digital Economy Act, which of course precedes the Minister but is certainly still alive in our thinking: in fact, we were talking about it earlier today.

The problem I see is that we have to find a way of squaring two quite different approaches. One is to prevent those who should not see certain material from being able to see it, because it is illegal for them to do so. The other is to find a way of ensuring that we do not end up with an age-gated internet, which I am grateful to find that we are all, I think, agreed about: that is very good to know.

Age is very tricky, as we have heard, and it is not the only consideration we have to bear in mind in wondering whether people should be able to gain access to areas of the internet which we know will be bad and difficult for them. That leads us, of course, to the question about legal but harmful, now resolved—or is it? We are going to have this debate about age assurance and what it is. What is age verification? How do they differ? How does it matter? Is 18 a fixed and final point at which we are going to say that childhood ends and adulthood begins, and therefore one is open for everything? It is exactly the point made earlier about how to care for those who should not be exposed to material which, although legal for them by a number called age, is not appropriate for them in any of the circumstances which, clinically, we might want to bring to bear.

I do not think we are going to resolve these issues today—I hope not. We are going to talk about them for ever, but at this stage I think we still need a bit of thinking outside a box which says that age is the answer to a lot of the problems we have. I do not think it is, but whether the Bill is going to carry that forward I have my doubts. How we get that to the next stage, I do not know, but I am looking forward to hearing the Minister’s comments on it.

Lord Parkinson of Whitley Bay (Con)

My Lords, I agree that this has been a rather unfortunate grouping and has led to a slightly strange debate. I apologise if it is the result of advice given to my noble friend. I know there has been some degrouping as well, which has led to slightly odd combinations today. However, as promised, I shall say a bit more about Wikipedia in relation to my noble friend’s Amendments 10 and 11.

The effect of these amendments would be that moderation actions carried out by users—in other words, community moderation of user-to-user and search services—would not be in scope of the Bill. The Government support the use of effective user or community moderation by services where this is appropriate for the service in question. As I said on the previous group, as demonstrated by services such as Wikipedia, this can be a valuable and effective means of moderating content and sharing information. That is why the Bill does not impose a one-size-fits-all requirement on services, but instead allows services to adopt their own approaches to compliance, so long as these are effective. The noble Lord, Lord Allan of Hallam, dwelt on this. I should be clear that duties will not be imposed on individual community moderators; the duties are on platforms to tackle illegal content and protect children. Platforms can achieve this through, among other things, centralised or community moderation. Ultimately, however, it is they who are responsible for ensuring compliance and it is platforms, not community moderators, who will face enforcement action if they fail to do so.

--- Later in debate ---
Moved by
12A: Clause 6, page 5, line 11, at end insert “(2) to (8)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 9 below (because the new duty to summarise illegal content risk assessments in the terms of service is only imposed on providers of Category 1 services).
--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

My Lords, this group of government amendments relates to risk assessments; it may be helpful if I speak to them now as the final group before the dinner break.

Risk management is at the heart of the Bill’s regulatory framework. Ofcom and services’ risk assessments will form the foundation for protecting users from illegal content and content which is harmful to children. They will ensure that providers thoroughly identify the risks on their own websites, enabling them to manage and mitigate the potential harms arising from them. Ofcom will set out the risks across the sector and issue guidance to companies on how to conduct their assessments effectively. All providers will be required to carry out risk assessments, keep them up to date and update them before making a significant change to the design or operation of their service which could put their users at risk. Providers will then need to put in place measures to manage and mitigate the risks they identify in their risk assessments, including any emerging risks.

Given how crucial the risk assessments are to this framework, it is essential that we enable them to be properly scrutinised by the public. The government amendments in this group will place new duties on providers of the largest services—that is, category 1 and 2A services—to publish summaries of their illegal and child safety risk assessments. Through these amendments, providers of these services will also have a new duty to send full records of their risk assessments to Ofcom. This will increase transparency about the risk of harm on the largest platforms, clearly showing how risk is affected by factors such as the design, user base or functionality of their services. These amendments will further ensure that the risk assessments can be properly assessed by internet users, including by children and their parents and guardians, by ensuring that summaries of the assessments are publicly available. This will empower users to make informed decisions when choosing whether and how to use these services.

It is also important that Ofcom is fully appraised of the risks identified by service providers. That is why these amendments introduce duties for both category 1 and 2A services to send their records of these risk assessments, in full, to Ofcom. This will make it easier for Ofcom to supervise compliance with the risk assessment duties, as well as other duties linked to the findings of the risk assessments, rather than having to request the assessments from companies under its information-gathering powers.

These amendments also clarify that companies must keep a record of all aspects of their risk assessments, which strengthens the existing record-keeping duties on services. I hope that noble Lords will welcome these amendments. I beg to move.

Lord Allan of Hallam (LD)

My Lords, it is risky to stand between people and their dinner, but I rise very briefly to welcome these amendments. We should celebrate the good stuff that happens in Committee as well as the challenging stuff. The risk assessments are, I think, the single most positive part of this legislation. Online platforms already do a lot of work trying to understand what risks are taking place on their platforms, which never sees the light of day except when it is leaked by a whistleblower and we then have a very imperfect debate around it.

The fact that platforms will have to do a formal risk assessment and share it with a third-party regulator is huge progress; it will create a very positive dynamic. The fact that the public will be able to see those risk assessments and make their own judgments about which services to use—according to how well they have done them—is, again, a massive public benefit. We should welcome the fact that risk assessments are there and the improvements that this group of amendments makes to them. I hope that was short enough.

--- Later in debate ---
Baroness Merron (Lab)

My Lords, I am grateful to the Minister for introducing this group, and we certainly welcome this tranche of government amendments. We know that there are more to come both in Committee and as we proceed to Report, and we look forward to seeing them.

The amendments in this group, as other noble Lords have said, amount to a very sensible series of changes to services’ risk-assessment duties. This perhaps begs the question of why they were not included in earlier drafts of the Bill, but we are glad to see them now.

There is, of course, the issue of precisely where some of the information will appear, as well as the wider status of terms of service. I am sure those issues will be discussed in later debates. It is certainly welcome that the department is introducing stronger requirements around the information that must be made available to users; it will all help to make this a stronger and more practical Bill.

We all know that users need to be able to make informed decisions, and it will not be possible if they are required to view multiple statements and various documents. It seems that the requirements for information to be provided to Ofcom go to the very heart of the Bill, and I suggest that the proposed system will work best if there is trust and transparency between the regulator and those who are regulated. I am sure that there will be further debate on the scope of risk assessments, particularly on issues that were dropped from previous iterations of the Bill, and certainly this is a reasonable starting point today.

I will try to be as swift as possible as I raise a few key issues. One is about avoiding warnings that are at such a high level of generality that they get put on to everything. Perhaps the Minister could indicate how Ofcom will ensure that the summaries are useful and accessible to the reader. The test, of course, should be that a summary is suitable and sufficient for a prospective user to form an assessment of the likely risk they would encounter when using the service, taking into account any special vulnerabilities that they might have. That needs to be the test; perhaps the Minister could confirm that.

Is the terms of service section the correct place to put a summary of the illegal content risk assessment? Research suggests, unsurprisingly, that only 3% of people read terms before signing up—although I recall that, in an earlier debate, the Minister confessed that he had read all the terms and conditions of his mobile phone contract, so he may be one of the 3%. It is without doubt that any individual should be supported in their ability to make choices, and the duty should perhaps instead be to display a summary of the risks with due prominence, to ensure that anyone who is considering signing up to a service is really able to read it.

I also ask the Minister to confirm that, despite the changes to Clause 19 in Amendment 16B, the duty to keep records of risk assessments will continue to apply to all companies, but with an enhanced responsibility for category 1 companies.

Lord Parkinson of Whitley Bay (Con)

I am grateful to noble Lords for their questions on this, and particularly grateful to the noble Lord, Lord Allan, and the noble Baroness, Lady Kidron, for their chorus of welcome. Where we are able to make changes, we will of course bring them forward, and I am glad to be able to bring forward this tranche now.

As the noble Lord, Lord Allan, said, ensuring the transparency of services’ risk assessments will further ensure that the framework of the Bill delivers its core objectives relating to effective risk management and increased accountability regarding regulated services. As we have discussed, it is imperative that these providers take a thorough approach to identifying risks, including emerging risks. The Government believe that it is of the utmost importance that the public are able effectively to scrutinise the risk assessments of the largest in-scope services, so that users can be empowered to make informed decisions about whether and how to use their services.

On the questions from the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, about why it is just category 1 and category 2A services, we estimate that there will be around 25,000 UK service providers in scope of the Bill’s illegal and child safety duties. Requiring all these companies to publish full risk assessments and proactively to send them to Ofcom could undermine the Bill’s risk-based and proportionate approach, as we have discussed in previous groups on the burdens to business. A large number of these companies are likely to be low risk and it is unlikely that many people will seek out their risk assessments, so requiring all companies to publish them would be an excessive regulatory burden.

There would also be an expectation that Ofcom would proactively monitor a whole range of services, even ones that posed a minimal risk to users. That in turn could distract Ofcom from taking a risk-based approach in its regulation by overwhelming it with paperwork from thousands of low-risk services. If Ofcom wants to see records of the risk assessments of providers that are not category 1 or category 2A services, it has extensive information-gathering powers that it can use to require a provider to send it such records.

The noble Baroness, Lady Merron, was right to say that I read the terms of my broadband supply—I plead guilty to the nerdiness of doing that—but I have not read all the terms and conditions of every application and social medium I have downloaded, and I agree that many people do skim through them. They say the most commonly told lie on the planet at the moment is “I agree to the terms and conditions”, and the noble Baroness is right to point to the need for these to be intelligible, easily accessible and transparent—which of course we want to see.

In answer to her other question, the record-keeping duty will apply to all companies, but the requirement to publish is only for category 1 and category 2A companies.

The noble Baroness, Lady Kidron, asked me about Amendment 27A. If she will permit me, I will write to her with the best and fullest answer to that question.

I am grateful to noble Lords for their questions on this group of amendments.

Amendment 12A agreed.
Moved by
12B: Clause 6, page 5, line 16, at end insert “(2) to (6)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 19 below (because the new duty to supply records of risk assessments to OFCOM is only imposed on providers of Category 1 services).

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, first, I will address Amendments 12BA, 183A and 183B, tabled by the noble Baroness, Lady Ritchie of Downpatrick, with whom I was grateful to discuss them earlier today, and the noble Lord, Lord Morrow; I am grateful to his noble friend, the noble Lord, Lord Browne of Belmont, for speaking to them on his behalf.

These amendments seek to apply the duties in Part 5 of the Bill, which are focused on published pornographic content, to user-generated pornography. Amendments 183A and 183B are focused particularly on making sure that children are protected from user-to-user pornography in the same way as from published pornography, including through the use of age verification. I reassure the noble Baroness and the noble Lord that the Government share their concerns; there is clear evidence about the impact of pornography on young people and the need to protect children from it.

This is where I come to the questions posed earlier by the noble Lord, Lord McCrea of Magherafelt and Cookstown. The research we commissioned from the British Board of Film Classification assessed the functionality of and traffic to the UK’s top 200 most visited pornographic websites. The findings indicated that 128 of the top 200 most visited pornographic websites—that is just under two-thirds, or 64%—would have been captured by the proposed scope of the Bill at the time of the Government’s initial response to the online harms White Paper, and that represents 85% of the traffic to those 200 websites.

Since then, the Bill’s scope has been broadened to include search services and pornography publishers, meaning that children will be protected from pornography wherever it appears online. The Government expect companies to use age-verification technologies to prevent children accessing services which pose the highest risk to children, such as online pornography. Age-assurance technologies and other measures will be used to provide children with an age-appropriate experience on their service.

As noble Lords know, the Bill does not mandate that companies use specific approaches or technologies when keeping children safe online as it is important that the Bill is future-proofed: what is effective today might not be so effective in the future. Moreover, age verification may not always be the most appropriate or effective approach for user-to-user companies to comply with their duties under the Bill. For instance, if a user-to-user service, such as a social medium, does not allow pornography under its terms of service, measures such as strengthening content moderation and user reporting would be more appropriate and effective for protecting children than age verification. That would allow content to be better detected and removed, instead of restricting children from a service that is designed to be appropriate for their use—as my noble friend Lady Harding of Winscombe puts it, avoiding the situation where children are removed from these services altogether.

While I am sympathetic to the aims of these amendments, I assure noble Lords that the Bill already has robust, comprehensive protections in place to keep children safe from all pornographic content, wherever or however it appears online. This amendment is therefore unnecessary because it duplicates the existing provisions for user-to-user pornography in the child safety duties in Part 3.

It is important to be clear that, wherever they are regulated in the Bill, companies will need to ensure that children cannot access pornographic content online. This is made clear, for user-to-user content, in Clause 11(3); for search services, in Clause 25(3); and for published pornographic content in Clause 72(2). Moving the regulation of pornography from Part 3 to Part 5 would not be a workable or desirable option because the framework is effective only if it is designed to reflect the characteristics of the services in scope.

Part 3 has been designed to address the particular issues arising from the rapid growth in platforms that allow the sharing of user-generated content but are not the ones choosing to upload that content. The scale and speed of dissemination of user-generated content online demands a risk-based and proportionate approach, as Part 3 sets out.

It is also important that these companies understand the risks to children in the round, rather than focusing on one particular type of content. Risks to children will often be a consequence of the design of these services—for instance, through algorithms, which need to be tackled holistically.

I know that the noble Baroness is concerned about whether pornography will indeed be designated as primary priority content for the purposes of the child safety duties in Clauses 11(3) and 25(3). The Government fully intend this to be the case, which means that user-to-user services will need to have appropriate systems to prevent children accessing pornography, as defined in Clause 70(2).

The approach taken in Part 3 is very different from services captured under Part 5, which are publishing content directly, know exactly where it is located on their site and already face legal liability for the content. In this situation the service has full control over its content, so a risk-based approach is not appropriate. It is reasonable to expect that service to prevent children accessing pornography. We do not therefore consider it necessary or effective to apply the Part 5 duties to user-to-user pornographic content.

I also assure the noble Baroness and the noble Lord that, in a case where a provider of user-to-user services is directly publishing pornographic content on its own service, it will already be subject to the Part 5 duties in relation to that particular content. Those duties in relation to that published pornographic content will be separate from and in addition to their Part 3 duties in relation to user-generated pornographic content.

This means that, no matter where published pornographic content appears, the obligation to ensure that children are not normally able to encounter it will apply to all in-scope internet service providers that publish pornographic content. This is made clear in Clause 71(2) and is regardless of whether they also offer user-to-user or search services.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I am sorry, but can the Minister just clarify that? Is he saying that it is possible to be covered by both Part 3 and Part 5, so that where a Part 5 service has user-generated content it is also covered by Part 3? Can he clarify that you cannot just escape Part 5 by adding user-generated content?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Yes, that is correct. I was trying to address the points raised by the noble Baroness, but the noble Lord is right. On the point about whether people might seek to be treated differently by allowing comments or reviews on their content: they would be treated in the same way. That is the concern which motivates the noble Baroness’s amendment to narrow the definition. There is no risk that a publisher of pornographic content could evade its Part 5 duties by enabling comments or reviews on its content. That would be the case whether or not those reviews contained words, non-verbal indications that a user liked something, emojis or any other form of user-generated content.

That is because the Bill has been designed to confer duties on different types of content. Any service with provider pornographic content will need to comply with the Part 5 duties to ensure that children cannot normally encounter such content. If they add user-generated functionality—

Lord Bethell Portrait Lord Bethell (Con)
- Hansard - - - Excerpts

I am sorry to come back to the same point, but let us take the Twitter example. As a publisher of pornography, does Twitter then inherit Part 5 responsibilities in as much as it is publishing pornography?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

It is covered in the Bill as Twitter. I am not quite sure what my noble friend is asking me. The harms that he is worried about are covered in different ways. Twitter or another social media service that hosts such content would be hosting it, not publishing it, so would be covered by Part 3 in that instance.

Lord Bethell Portrait Lord Bethell (Con)
- Hansard - - - Excerpts

Maybe my noble friend the Minister could write to me to clarify that point, because it is quite a significant one.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Perhaps I will speak to the noble Lord afterwards and make sure I have his question right before I do so.

I hope that answers the questions from the noble Baroness, Lady Ritchie, and that on that basis, she will be happy to withdraw her amendment.

Baroness Ritchie of Downpatrick Portrait Baroness Ritchie of Downpatrick (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, this has been a very wide-ranging debate, concentrating not only on the definition of pornography but on the views of noble Lords in relation to how it should be regulated, and whether it should be regulated, as the noble Baroness, Lady Kidron, the noble Lords, Lord Bethell and Lord Browne, and I myself believe, or whether it should be a graduated response, which seems to be the view of the noble Lords, Lord Allan and Lord Clement-Jones.

I believe that all pornography should be treated the same. There is no graduated response. It is something that is pernicious and leads to unintended consequences for many young people, so it needs to be regulated in all its forms. I think that is the point that the noble Lord, Lord Bethell, was making. I believe that these amendments should have been debated along with those of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, because then we could have an even wider-ranging debate, and I look forward to that in the further groups in the days to come. The focus should be on the content, not on the platform, and the content is about pornography.

I agree with the noble Baroness, Lady Kidron, that porn is not the only harm, and I will be supporting her amendments. I believe that they should be in the Bill because if we are serious about dealing with these issues, they have to be in there.

I do not think my amendments are suggesting that children will be removed from social media. I agree that it is a choice to remove pornography or to age-gate. Twitter is moving to subscriber content anyway, so it can do it; the technology is already available to do that. I believe you just age-gate the porn content, not the whole site. I agree with the noble Lord, Lord Clement-Jones, as I said. These amendments should have been debated in conjunction with those of the noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron, as I believe that the amendments in this group are complementary to those, and I think I already said that in my original submission.

I found the Minister’s response interesting. Obviously, I would like time to read Hansard. I think certain undertakings were given, but I want to see clearly spelled out where they are and to discuss with colleagues across the House where we take these issues and what we come back with on Report.

I believe that these issues will be debated further in Committee when the amendments from the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, are debated. I hope that in the intervening period the Minister will have time to reflect on the issues raised today about Parts 3 and 5 and the issue of pornography, and that he will be able to help us in further sessions in assuaging the concerns that we have raised about pornography. There is no doubt that these issues will come back. The only way that these issues can be dealt with, that pornography can be dealt with and that all our children throughout the UK can be protected is through proper regulation.

I think we all need further reflection. I will see, along with colleagues, whether it is possible to come back on Report. In the meantime, I beg leave to withdraw the amendment.

--- Later in debate ---
Moved by
12C: Clause 6, page 5, line 23, at end insert “(2) to (10)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 11 below (because the new duty to summarise children’s risk assessments in the terms of service is only imposed on providers of Category 1 services).

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Baroness Merron Portrait Baroness Merron (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I welcome this debate, which revisits some of the areas discussed in earlier debates about the scope of the Bill, as many noble Lords said. It allows your Lordships’ House to consider what has to be the primary driver for assessment. In my view and as others said, it ought to be about risk, which has to be the absolute driver in all this. As the noble Baroness, Lady Harding, said, businesses do not remain static: they start at a certain size and then change. Of course, we hope that many of the businesses we are talking about will grow, so this is about preparation for growth and the reality of doing business.

As we discussed, there certainly are cases where search providers may, by their very nature, be almost immune from presenting users with content that could be considered either harmful or illegal under this legislative framework. The new clause proposed by the noble Lord, Lord Moylan—I am grateful to him for allowing us to explore these matters—and its various consequential amendments, would limit the duty to prevent access to illegal content to core category 2A search providers, rather than all search providers, as is currently the case under Clause 23(3).

The argument that I believe the noble Lord, Lord Moylan, put forward is that the illegal content duty is unduly wide, placing a disproportionate and otherwise unacceptable burden on smaller and/or supposedly safer search providers. He clearly said he was not saying that small was safe—that is now completely understood—but he also said that absolute safety is not achievable. As the noble Baroness, Lady Kidron, said, that is indeed so. If this legislation is too complex and creates the wrong provisions, we will clearly be a long way away from our ambition, which here has to be to have in place the best legislative framework, one that everyone can work with and that provides the maximum opportunity for safety and what we all seek to achieve.

Of course, the flip side of the argument about an unacceptable burden on smaller, or on supposedly safer, search providers may be that they would in fact have very little work to do to comply with the illegal content duty, at least in the short term. But the duty would act as an important safeguard, should the provider’s usual systems prove ineffective with the passage of time. Again, that point was emphasised in this and the previous debate by the noble Baroness, Lady Harding.

We look forward to the Minister’s response to find out which view he and his department subscribe to or, indeed, whether they have another view they can bring to your Lordships’ House. But, on the face of it, the current arrangements do not appear unacceptably onerous.

Amendment 157 in the name of the noble Lord, Lord Pickles, and introduced by the noble Baroness, Lady Deech, deals with search through a different approach, by inserting requirements about search services’ publicly available statements into Clause 65. In the debate, the noble Baroness and the noble Lord, Lord Weir, raised very important, realistic examples of where search engines can take us, including to material that encourages racism directed at Jews and other groups and encourages hatred of various groups, including Jews. The amendment talks about issues such as the changing of algorithms or the hiding of content and the need to ensure that the terms of providers’ publicly available statements are applied consistently.

I look forward to hearing from the Minister in response to Amendment 157, as it certainly moves us beyond questions of scope and towards discussion of the conduct of platforms when harm is identified.

Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - -

My Lords, I must first apologise for my slightly dishevelled appearance as I managed to spill coffee down my shirt on my way to the Chamber. I apologise for that—as the fumes from the dried coffee suffuse the air around me. It will certainly keep me caffeinated for the day ahead.

Search services play a critical role in users’ online experience, allowing them easily to find and access a broad range of information online. Their gateway function, as we have discussed previously, means that they also play an important role in keeping users safe online because they have significant influence over the content people encounter. The Bill therefore imposes stringent requirements on search services to tackle the risks from illegal content and to protect children.

Amendments 13, 15, 66 to 69 and 73 tabled by my noble friend Lord Moylan seek to narrow the scope of the Bill so that its search safety duties apply only to the largest search services—categorised in the Bill as category 2A services—rather than to all search services. Narrowing the scope in this way would have an adverse impact on the safety of people using search services, including children. Search services, including combined services, below the category 2A threshold would no longer have a duty to minimise the risk of users encountering illegal content or children encountering harmful content in or via search results. This would increase the likelihood of users, including children, accessing illegal content and children accessing harmful content through these services.

The Bill already takes a targeted approach and the duties on search services will be proportionate to the risk of harm and the capacity of companies. This means that services which are smaller and lower-risk will have a lighter regulatory burden than those which are larger and higher-risk. All search services will be required to conduct regular illegal content risk assessments and, where relevant, children’s risk assessments, and then implement proportionate mitigations to protect users, including children. Ofcom will set out in its codes of practice specific steps search services can take to ensure compliance and must ensure that these are proportionate to the size and capacity of the service.

The noble Baroness, Lady Kidron, and my noble friend Lady Harding of Winscombe asked how search services should conduct their risk assessments. Regulated search services will have a duty to conduct regular illegal content risk assessments, and where a service is likely to be accessed by children it will have a duty to conduct regular children’s risk assessments, as I say. They will be required to assess the level and nature of the risk of individuals encountering illegal content on their service, to implement proportionate mitigations to protect people from illegal content, and to monitor them for effectiveness. Services likely to be accessed by children will also be required to assess the nature and level of risk of their service specifically for children to identify and implement proportionate mitigations to keep children safe, and to monitor them for effectiveness as well.

Companies will also need to assess how the design and operation of the service may increase or reduce the risks identified and Ofcom will have a duty to issue guidance to assist providers in carrying out their risk assessments. That will ensure that providers have, for instance, sufficient clarity about what an appropriate risk assessment looks like for their type of service.

The noble Lord, Lord Allan, and others asked about definitions, and I congratulate noble Lords on avoiding the obvious “To be, or not to be” pun in the debate we have just had. The noble Lord, Lord Allan, is right in the definition he set out. On the rationale for it, it is simply that we have designated as category 1 the largest and riskiest services and as category 2 the smaller and less risky ones, splitting them between 2A, search services, and 2B, user-to-user services. We think that is a clear framework. The definitions are set out a bit more in the Explanatory Notes but that is the rationale.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I am grateful to the Minister for that clarification. I take it then that the Government’s working assumption is that all search services, including the biggest ones, are by definition less risky than the larger user-to-user services. It is just a clarification that that is their thinking that has informed this.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

As I said, the largest and riskiest sites may involve some which have search functions, so the test of large and most risky applies. Smaller and less risky search services are captured in category 2A.

Amendment 157 in the name of my noble friend Lord Pickles, and spoken to by the noble Baroness, Lady Deech, seeks to apply new duties to the largest search services. I agree with the objectives in my noble friend’s amendment of increasing transparency about search services’ operations and enabling users to hold them to account. It is not, however, an amendment I can accept, because it would duplicate existing duties while imposing new duties which we do not think are appropriate for search services.

As I say, the Bill will already require search services to set out how they are fulfilling their illegal content and child safety duties in publicly available statements. The largest search services—category 2A—will also be obliged to publish a summary of their risk assessments and to share this with Ofcom. That will ensure that users know what to expect on those search services. In addition, they will be subject to the Bill’s requirements relating to user reporting and redress. These will ensure that search services put in place effective and accessible mechanisms for users to report illegal content and content which is harmful to children.

My noble friend’s amendment would ensure that the requirements to comply with its publicly available statements applied to all actions taken by a search service to prevent harm, not just those relating to illegal content and child safety. This would be a significant expansion of the duties, resulting in Ofcom overseeing how search services treat legal content which is accessed by adults. That runs counter to the Government’s stated desire to avoid labelling legal content which is accessed by adults as harmful. It is for adult users themselves to determine what legal content they consider harmful. It is not for us to put in place measures which could limit their access to legal content, however distasteful. That is not to say, of course, that where material becomes illegal in its nature we do not share the determination of the noble Baroness, my noble friend and others to make sure that it is properly tackled. The Secretary of State and Ministers have had extensive meetings with groups making representations on this point and I am very happy to continue speaking to my noble friend, the noble Baroness and others if they would welcome it.

I hope that that provides enough reassurance for the amendment to be withdrawn at this stage.

--- Later in debate ---
Moved by
13A: Clause 6, page 5, line 35, after “service” insert “is not a Category 2A service and”
Member’s explanatory statement
This technical amendment ensures that the duties imposed on providers of combined services in relation to the search engine are correct following the changes to clause 20 arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only.
--- Later in debate ---
Finally, I know this is unpopular as far as the Government are concerned, but is there not a concern that we are driving a coach and horses through some of our well thought-through and important issues relating to human rights? The EHRC’s paper says that the provisions in Clause 110 may be disproportionate and an infringement of millions of individuals’ rights to privacy where those individuals are not suspected of any wrongdoing. This is not a right or wrong issue; it is a question of proportionality. We need to balance that. I do not know whether we have heard the Minister set out exactly why the measures in the Bill meet that set of conditions, so I would be grateful if he could talk about that or, if not, write to us. If we are in danger of heading into issues which are raised by Article 8 of the ECHR—I know the noble Lord opposite may not be a huge supporter of it, but it is an important part of our current law, and senior Ministers have said how important it will be in the future—surely we must have safeguards which will protect it.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, this has indeed been a very good debate on a large group of amendments. We have benefited from two former Ministers, the noble Lord, Lord McNally, and my noble friend Lord Kamall. I hope it is some solace to my noble friend that, such a hard act is he to follow, his role has been taken on by two of us on the Front Bench—myself at DCMS and my noble friend Lord Camrose at the new Department for Science, Innovation and Technology.

The amendments in this group are concerned with the protection of user privacy under the Bill and the maintenance of end-to-end encryption. As noble Lords have noted, there has been some recent coverage of this policy in the media. That reporting has not always been accurate, and I take this opportunity to set the record straight in a number of areas and seek to provide the clarity which the noble Lord, Lord Stevenson of Balmacara, asked for just now.

Encryption plays a crucial role in the digital realm, and the UK supports its responsible use. The Bill does not ban any service design, nor will it require services materially to weaken any design. The Bill contains strong safeguards for privacy. Broadly, its safety duties require platforms to use proportionate systems and processes to mitigate the risks to users resulting from illegal content and content that is harmful to children. In doing so, platforms must consider and implement safeguards for privacy, including ensuring that they are complying with their legal responsibilities under data protection law.

With regard to private messaging, Ofcom will set out how companies can comply with their duties in a way that recognises the importance of protecting users’ privacy. Importantly, the Bill is clear that Ofcom cannot require companies to use proactive technology, such as automated scanning, on private communications in order to comply with their safety duties.

In addition to these cross-cutting protections, there are further safeguards concerning Ofcom’s ability to require the use of proactive technology, such as content identification technology on public channels. That is in Clause 124(6) of the Bill. Ofcom must consider a number of matters, including the impact on privacy and whether less intrusive measures would have the equivalent effect, before it can require a proactive technology.

The implementation of end-to-end encryption in a way that intentionally blinds companies to criminal activity on their services, however, has a disastrous effect on child safety. The National Center for Missing & Exploited Children in the United States of America estimates that more than half its reports could be lost if end-to-end encryption were implemented without preserving the ability to tackle child sexual abuse—a conundrum with which noble Lords grappled today. That is why our new regulatory framework must encourage technology companies to ensure that their safety measures keep pace with this evolving and pernicious threat, including minimising the risk that criminals are able to use end-to-end encrypted services to facilitate child sexual abuse and exploitation.

Given the serious risk of harm to children, the regulator must have appropriate powers to compel companies to take the most effective action to tackle such illegal and reprehensible content and activity on their services, including in private communications, subject to stringent legal safeguards. Under Clause 110, Ofcom will have a stand-alone power to require a provider to use, or make best endeavours to develop, accredited technology to tackle child sexual exploitation and abuse, whether communicated publicly or privately, by issuing a notice. Ofcom will use this power as a last resort only when all other measures have proven insufficient adequately to address the risk. The only other type of harm for which Ofcom can use this power is terrorist content, and only on public communications.

The use of the power in Clause 110 is subject to additional robust safeguards to ensure appropriate protection of users’ rights online. Ofcom will be able to require the use of technology accredited as being highly accurate only in specifically detecting illegal child sexual exploitation and abuse content, ensuring a minimal risk that legal content is wrongly identified. In addition, under Clause 112, Ofcom must consider a number of matters, including privacy and whether less intrusive means would have the same effect, before deciding whether it is necessary and proportionate to issue a notice.

The Bill also includes vital procedural safeguards in relation to Ofcom’s use of the power. If Ofcom concludes that issuing a notice is necessary and proportionate, it will need to publish a warning notice to provide the company an opportunity to make representations as to why the notice should not be issued or why the detail contained in it should be amended. In addition, the final notice must set out details of the rights of appeal under Clause 149. Users will also be able to complain to and seek action from a provider if the use of a specific technology results in their content incorrectly being removed and if they consider that technology is being used in a way that is not envisaged in the terms of service. Some of the examples given by the noble Baroness, Lady Fox of Buckley, pertain in this instance.

The Bill also recognises that in some cases there will be no available technology compatible with the particular service design. As I set out, this power cannot be used by Ofcom to require a company to take any action that is not proportionate, including removing or materially weakening encryption. That is why the Bill now includes an additional provision for this scenario, to allow Ofcom to require technology companies to use their best endeavours to develop or find new solutions that work on their services while meeting the same high standards of accuracy and privacy protection. Given the ingenuity and resourcefulness of the sector, it is reasonable to ask it to do everything possible to protect children from abuse and exploitation. I echo the comments made by the noble Lord, Lord Allan, about the work being done across the sector to do that.

More broadly, the regulator must uphold the right to privacy under its Human Rights Act obligations when implementing the new regime. It must ensure that its actions interfere with privacy only where it is lawful, necessary and proportionate to do so. I hope that addresses the question posed by the noble Lord, Lord Stevenson. In addition, Ofcom will be required to consult the Information Commissioner’s Office when developing codes of practice and relevant pieces of guidance.

I turn now to Amendments 14—

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

Before the Minister does so, can he give a sense of what he means by “best endeavours” for those technology companies? If it is not going to be general monitoring of what is happening as the message moves from point to point—we have had some discussions about the impracticality and issues attached to monitoring at one end or the other—what, theoretically, could “best endeavours” possibly look like?

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I am hesitant to give too tight a definition, because we want to remain technology neutral and make sure that we are keeping an open mind to developing changes. I will think about that and write to the noble Lord. The best endeavours will inevitably change over time as new technological solutions present themselves. I point to the resourcefulness of the sector in identifying those, but I will see whether there is anything more I can add.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

While the Minister is reflecting, I note that the words “best endeavours” are always a bit of a worry. The noble Lord, Lord Allan, made the good point that once it is on your phone, you are in trouble and you must report it, but the frustration of many people outside this Chamber, if it has been on a phone and you cannot deal with it, is what comes next to trace the journey of that piece of material without breaking encryption. I speak to the tech companies very often—indeed, I used to speak to the noble Lord, Lord Allan, when he was in his position at what was then Facebook—but that is the question that we would like answered in this Committee, because the answer that “It is nothing to do with us” is where we stop with our sympathy.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

The noble Baroness’s intervention has given me an opportunity to note that I am about to say a little more on best endeavours, which will not fully answer the question from the noble Lord, Lord Knight, but I hope fleshes it out a little more.

I do that in turning to Amendments 14, 108 and 205, which seek to clarify that companies will not be required to undertake fundamental changes to the nature of their service, such as the removal or weakening of end-to-end encryption. As I previously set out, the Bill does not require companies to weaken or remove any design and there is no requirement for them to do so as part of their risk assessments or in response to a notice. Instead, companies will need to undertake risk assessments, including consideration of risks arising from the design of their services, before taking proportionate steps to mitigate and manage these risks. Where relevant, assessing the risks arising from end-to-end encryption will be an integral part of this process.

This risk management approach is well established in almost every other industry and it is right that we expect technology companies to take user safety into account when designing their products and services. We understand that technologies used to identify child sexual abuse and exploitation content, including on private communications, are in some cases nascent and complex. They continue to evolve, as I have said. That is why Ofcom has the power through the Bill to issue a notice requiring a company to make best endeavours to develop or source technology.

This notice will include clear, proportionate and enforceable steps that the company must take, based on the relevant information of the specific case. Before issuing a warning notice, Ofcom is expected to enter into informal consultation with the company and/or to exercise information-gathering powers to determine whether a notice is necessary and proportionate. This consultation period will assist in establishing what a notice to develop a technology may require and appropriate steps for the company to take to achieve best endeavours. That dialogue with Ofcom is part of the process.

Lord Kamall Portrait Lord Kamall (Con)
- Hansard - - - Excerpts

There are a lot of phrases here—best endeavours, proportionate, appropriate steps—that are rather subjective. The concern of a number of noble Lords is that we want to address this issue but it is a matter of how it is applied. That is one of the reasons why noble Lords were asking for some input from the legal profession, a judge or otherwise, to make those judgments.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

All the phrases used in the Bill are subject to the usual scrutiny through the judicial process—that is why we debate them now and think about their implications—but of course they can, and I am sure will, be tested in the usual legal ways. Once a company has developed a new technology that meets minimum standards of accuracy, Ofcom may require its use but not before considering matters including the impact on user privacy, as I have set out. The Bill does not specify which tools are likely to be required, as we cannot pre-empt Ofcom’s evidence-based and case-by-case assessment.

Amendment 285 intends to clarify that social media platforms will not be required to undertake general monitoring of the activity of their users. I agree that the protection of privacy is of utmost importance. I want to reassure noble Lords, in particular my noble friend Lady Stowell of Beeston, who asked about it, that the Bill does not require general monitoring of all content. The clear and strong safeguards for privacy will ensure that users’ rights are protected.

Setting out clear and specific safeguards will be more effective in protecting users’ privacy than adopting the approach set out in Amendment 285. Ofcom must consider a number of matters, including privacy, before it can require the use of proactive technology. The government amendments in this group, Amendments 290A to 290G, further clarify that technology which identifies words, phrases or images that indicate harm is subject to all of these restrictions. General monitoring is not a clearly defined concept—a point made just now by my noble friend Lord Kamall. It is used in EU law but is not clearly defined there, and it is not a concept in UK law. This lack of clarity could create uncertainty that some technology companies might attempt to exploit in order to avoid taking necessary and proportionate steps to protect their users. That is why we resist Amendment 285.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

I understand the point the Minister is making, but it is absolutely crystal clear that, whatever phrase is used, the sensibility is quite clear that the Government are saying on record, at the Dispatch Box, that the Bill can in no way be read as requiring anybody to provide a view into private messaging or encrypted messaging unless there is good legal cause to suspect criminality. That is a point that the noble Baroness, Lady Stowell, made very clearly. One may not like the phrasing used in other legislatures, but could we find a form of words that will make it clear that those who are operating in this legal territory are absolutely certain about where they stand on that?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, I want to give clear reassurance that the Bill does not require general monitoring of all content. We have clear and strong safeguards for privacy in the Bill to ensure that users’ rights are protected. I set out the concerns about use of the phrase “general monitoring”. I hope that provides clarity, but I may have missed the noble Lord’s point. The brief answer to the question I think he was asking is yes.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

Let the record stand clear: yes. It was the slight equivocation around how the Minister approached and left that point that I was worried about, and that people might seek to use that later. Words from the Dispatch Box are never absolute and they are never meant to be, but the fact that they have been said is important. I am sure that everybody understands that point, and the Minister did say “yes” to my question.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I did, and I am happy to say it again: yes.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

Perhaps I might go back to an earlier point. When the Minister said the Government want to make sure, I think he was implying that certain companies would try to avoid obligations to keep their users safe by threatening to leave or whatever. I want it to be clear that the obligations to the users of the service are, in the instance of encrypted services, to protect their privacy, and they see that as keeping them safe. It would be wrong to make that a polar opposite. I think that companies that run unencrypted services believe that to be what their duties are—so that in a way is a clash.

Secondly, I am delighted by the clarity in the Minister’s “yes” answer, but I think that maybe there needs to be clearer communication with people outside this Chamber. People are worried about whether duties placed on Ofcom to enact certain things would lead to some breach of encryption. No one thinks that the Government intend to do this or want to spy on anyone, but that the unintended consequences of the duty on Ofcom might have that effect. If that is not going to be the case, and that can be guaranteed by the Government, and they made that clear, it would reassure not just the companies but the users of messaging services, which would be helpful.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

The points the noble Baroness has just made bring me neatly to what I was about to say in relation to the question raised earlier by the noble Lord, Lord Knight of Weymouth. But first, I would say that Ofcom as a public body is subject to public law principles already, so those apply in this case.

The noble Lord, Lord Knight, asked about virtual private networks and the risk of displacing people on to VPNs or other similar alternatives. That is a point worth noting, not just in this group but as we consider all these amendments, particularly when we talk later on about age verification, pornography and so on. Services will need to think about how safety measures could be circumvented and take steps to prevent that, because they need to mitigate risk effectively. There may also be a role in enforcement action, too; Ofcom will be able to apply to the courts to require these services where appropriate to apply business disruption measures. We should certainly be mindful of the incentives for people to do that, and the example the noble Lord, Lord Knight, gave earlier is a useful lesson in the old adage “Caveat emptor” when looking at some of these providers.

I want to say a little bit about Amendments 205A and 290H in my name. Given the scale of child sexual abuse and exploitation that takes place online, and the reprehensible nature of these crimes, it is important that Ofcom has effective powers to require companies to tackle it. This brings me to these government amendments, which make small changes to the powers in Clause 110 to ensure that they are effective. I will focus particularly, in the first instance, on Amendment 290H, which ensures that Ofcom considers whether a service has features that allow content to be shared widely via another service when deciding whether content has been communicated publicly or privately, including for the purposes of issuing a notice. This addresses an issue highlighted by the Independent Reviewer of Terrorism Legislation, Jonathan Hall, and Professor Stuart Macdonald in a recent paper. The separate, technical amendment, Amendment 205A, clarifies that Clause 110(7) refers only to a notice on a user-to-user service.

Amendment 190 in the name of the noble Lord, Lord Clement-Jones, seeks to introduce a new privacy duty on Ofcom when considering whether to use any of its powers. The extensive privacy safeguards that I have already set out, along with Ofcom’s human rights obligations, would make this amendment unnecessary. Ofcom must also explicitly consult persons whom it considers to have expertise in the enforcement of the criminal law and the protection of national security, which is relevant to online safety matters in the course of preparing its draft codes. This may include the integrity and security of internet services where relevant.

Amendments 202 and 206, in the name of the noble Lord, Lord Stevenson of Balmacara, and Amendments 207, 208, 244, 246, 247, 248, 249 and 250 in the name of the noble Lord, Lord Clement-Jones, all seek to deliver privacy safeguards to notices issued under Clause 110 through additional review and appeals processes. There are already strong safeguards concerning this power. As part of the warning notice process, companies will be able to make representations to Ofcom which it is bound to consider before issuing a notice. Ofcom must also review any notice before the end of the period for which it has effect.

Amendment 202 proposes mirroring the safeguards of the Investigatory Powers Act when issuing notices to encrypted messaging services under this power. First, this would be inappropriate, because the powers in the Investigatory Powers Act serve different purposes from those in this Bill. The different legal safeguards in the Investigatory Powers Act reflect the potential intrusion by the state into an individual’s private communications; that is not the case with this Bill, which does not grant investigatory powers to state bodies, such as the ability to intercept private communications. Secondly, making a reference to encryption would be—

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

Is that right? I do not need a yes or no answer. It was rhetorical; I am just trying to frame the right question. The Minister is making a very strong point about the difference between RIPA requirements and those that might be brought in under this Bill. But it does not really get to the bottom of the questions we were asking. In this situation, whatever the exact analogy between the two systems is, it is clear that Ofcom is marking its own homework—which is fair enough, as there are representations, but it is not getting external advice or seeking judicial approval.

The Minister’s point was that that was okay because it was private companies involved. But we are saying here that these would be criminal offences taking place and therefore there is bound to be interest from the police and other agencies, including anti-terrorism agencies. It is clearly similar to the RIPA arrangements, so could he just revisit that?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Yes, I think it is right. The Investigatory Powers Act is a tool for law enforcement and intelligence agencies, whereas the Bill is designed to regulate technology companies—an important high-level distinction. As such, the Bill does not grant investigatory powers to state bodies. It does not allow the Government or the regulator to access private messages. Instead, it requires companies to implement proportionate systems and processes to tackle illegal content on their platforms. I will come on to say a little about legal redress and the role of the courts in looking at Ofcom’s decisions so, if I may, I will respond to that in a moment.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - - - Excerpts

The Investigatory Powers Act includes a different form of technical notice, which is to put in place surveillance equipment. The noble Lord, Lord Stevenson, has a good point: we need to ensure that we do not have two regimes, both requiring companies to put in place technical equipment but with quite different standards applying.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I will certainly take that point away and I understand, of course, that different Acts require different duties of the same platforms. I will take that away and discuss it with colleagues in other departments who lead on investigatory powers.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- Hansard - - - Excerpts

Before my noble friend moves on, when he is reviewing that back in the office, could he also satisfy himself that the concerns coming from the journalism and news organisations in the context of RIPA are also understood and have been addressed? That is another angle which, from what my noble friend has said so far, I am not sure has really been acknowledged. That is not a criticism but it is worth him satisfying himself on it.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I am about to talk about the safeguards for journalists in the context of the Bill and the questions posed by the noble Baroness, Lady Bennett. However, I take my noble friend’s point about the implications of other Acts that are already on the statute book in that context as well.

Just to finish the train of thought of what I was saying on Amendment 202, making a reference to encryption, as it suggests, would be out of step with the wider approach of the Bill, which is to remain technology-neutral.

I come to the safeguards for journalistic protections, as touched on by the noble Baroness, Lady Bennett. The Government are fully committed to protecting the integrity of journalistic sources, and there is no intention or expectation that the tools required to be used under this power would result in a compromising of those sources. Any tools required on private communications must be accredited by Ofcom as highly accurate only in detecting child sexual abuse and exploitation content. These minimum standards of accuracy will be approved and published by the Secretary of State, following advice from Ofcom. We therefore expect it to be very unlikely that journalistic content will be falsely detected by the tools being required.

Under Clause 59, companies are obliged to report child sexual abuse material which is detected on their service to the National Crime Agency; this echoes a point made by the noble Lord, Lord Allan, in an earlier contribution. That would include child sexual abuse and exploitation material identified through tools required by a notice and, even in this event, the appropriate protections in relation to journalistic sources would be applied by the National Crime Agency if it were necessary to identify individuals involved in sharing illegal material.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I want to flag that in the context of terrorist content, this is quite high risk for journalists. It is quite common for them, for example, to be circulating a horrific ISIS video not because they support ISIS but because it is part of a news article they are putting together. We should flag that terrorist content in particular is commonly distributed by journalists and it could be picked up by any system that is not sufficiently sophisticated.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I see that my noble friend Lord Murray of Blidworth has joined the Front Bench in anticipation of the lunch-break business for the Home Office. That gives me the opportunity to say that I will discuss some of these points with him, my noble friend Lord Sharpe of Epsom and others at the Home Office.

Amendment 246 aims to ensure that there is no requirement for a provider to comply with a notice until the High Court has determined the appeal. The Government have ensured that, in addition to judicial review through the High Court, there is an accessible and relatively affordable alternative means of appealing Ofcom’s decisions via the Upper Tribunal. We cannot accept amendments such as this, which could unacceptably delay Ofcom’s ability to issue a notice, because that would leave children vulnerable.

To ensure that Ofcom’s use of its powers under Clause 110, and the technology that underpins it, are transparent, Ofcom will produce an annual report about the exercise of its functions using these powers. This must be submitted to the Secretary of State and laid before Parliament. The report must also provide the details of technology that has been assessed as meeting minimum standards of accuracy, and Ofcom may also consider other factors, including the impact of technologies on privacy. That will be separate to Ofcom’s annual report to allow for full scrutiny of this power.

The legislation also places a statutory requirement on Ofcom to publish guidance before its functions with regard to Clause 110 come into force. This will be after Royal Assent, given that the legislation is subject to change until that point. Before producing the guidance, Ofcom must consult the Information Commissioner. As I said, there are already strong safeguards regarding Ofcom’s use of these powers, so we think that this additional oversight is unnecessary.

Amendments 203 and 204, tabled by the noble Lord, Lord Clement-Jones, seek to probe the privacy implications of Ofcom’s powers to require technology under Clause 110. I reiterate that the Bill will not ban or weaken any design, including end-to-end encryption. But, given the scale of child sexual abuse and exploitation taking place on private communications, it is important that Ofcom has effective powers to require companies to tackle this abhorrent activity. Data from the Office for National Statistics show that in nearly three-quarters of cases where children are contacted online by someone they do not know, this takes place by private message. This highlights the scale of the threat and the importance of technology providers taking steps to safeguard children in private spaces online.

As already set out, there are already strong safeguards regarding the use of this power, and these will prevent Ofcom from requiring the use of any technology that would undermine a platform’s security and put users’ privacy at risk. These safeguards will also ensure that platforms will not be required to conduct mass scanning of private communications by default.

Until the regime comes into force, it is of course not possible to say with certainty which tools would be accredited. However, some illustrative examples of the kinds of current tools we might expect to be used—providing that they are highly accurate and compatible with a service’s design—are machine learning or artificial intelligence, which assess content to determine whether it is illegal, and hashing technology, which works by assigning a unique number to an image that has been identified as illegal.

Given the particularly abhorrent nature of the crimes we are discussing, it is important that services giving rise to a risk of child sexual abuse and exploitation in the UK are covered, wherever they are based. The Bill, including Ofcom’s ability to issue notices in relation to this or to terrorism, will therefore have extraterritorial effect. The Bill will apply to any relevant service that is linked to the UK. A service is linked to the UK if it has a significant number of UK users, if UK users form a target market or if the service is capable of being used in the UK and there is a material risk of significant harm to individuals in the UK arising from the service. I hope that that reassures the noble Lord, on behalf of his noble friend, about why that amendment is not needed.

Amendments 209 to 214 seek to place additional requirements on Ofcom to consider the effect on user privacy when using its powers under Clause 110. I agree that tackling online harm needs to take place while protecting privacy and security online, which is why Ofcom already has to consider user privacy before issuing notices under Clause 110, among the other stringent safeguards I have set out. Amendment 202A would impose a duty on Ofcom to issue a notice under Clause 110, where it is satisfied that it is necessary and proportionate to do so—this will have involved ensuring that the safeguards have been met.

Ofcom will have access to a wide range of information and must have the discretion to decide the most appropriate course of action in any particular scenario, including where this action lies outside the powers and procedures conferred by Clause 110; for instance, an initial period of voluntary engagement. This is an in extremis power. It is essential that we balance users’ rights with the need to enable a strong response, so Ofcom must be able to assess whether any alternative, less intrusive measures would effectively reduce the level of child sexual exploitation and abuse or terrorist content occurring on a service before issuing a notice.

I hope that that provides reassurance to noble Lords on the amendments in this group, and I invite the noble Lord to withdraw Amendment 14.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

My Lords, this has been a very useful debate and serves as a good appetite builder for lunch, which I understand we will be able to take shortly.

I am grateful to the Minister for his response and to all noble Lords who have taken part in the debate. As always, the noble Baroness, Lady Kidron, gave us a balanced view of digital rights—the right to privacy and to security—and the fact that we should be trying to advance these two things simultaneously. She was right again to remind us that this is a real problem and there is a lot we can do. I know she has worked on this through things such as metadata—understanding who is communicating with whom—which might strike that nice balance where we are not infringing on people’s privacy too grossly but are still able to identify those who wish harm on our society and in particular on our children.

The noble Baroness, Lady Bennett, was right to pick up this tension between everything, everywhere, all at once and targeted surveillance. Again, that is really interesting to tease out. I am personally quite comfortable with quite intrusive targeted surveillance. I do not know whether noble Lords have been reading the Pegasus spyware stories: I am not comfortable with some Governments placing such spyware on the phones of human rights defenders but I would be much more relaxed about the British authorities placing something similar on the phones of people who are going to plant bombs in Manchester. We need to be really honest about where we are drawing our red lines if we want to go in the direction of targeted surveillance.

The noble Lord, Lord Moylan, was right again to remind us about the importance of private conversations. I cited the example of police officers whose conversations have been exposed. Although it is hard, we should remember that if ordinary citizens want to exchange horrible racist jokes with each other and so on in private groups that is not a matter for the state, but it is when it is somebody in a position of public authority; we have a right to intervene there. Again, we have to remember that as long as it is not illegal people can say horrible things in private, and we should not encourage any situation where we suggest that the state would interfere unless there are legitimate grounds—for example, it is a police officer or somebody is doing something that crosses the line of legality.

The noble Baroness, Lady Fox, reminded us that it is either encrypted or it is not. That is really helpful, as things cannot be half encrypted. If a service provider makes a commitment, it is critical that it is truthful. That is what our privacy law tells us. If I say, “This service is encrypted between you and the person you send the message to”, and I know that there is somebody in between who could access it, I am lying. I cannot say it is a private service unless it is truly private. We have to bear that in mind. Historically, people might have been more comfortable with fudging it, but not in 2023, when we have this raft of privacy legislation.

The noble Baroness is also right to remind us that privacy can be safety. There is almost nothing more devastating than the leaking of intimate images. When services such as iCloud move to encrypted storage that dramatically reduces the risk that somebody will get access to your intimate images if you store them there, which you are legally entitled to do. Privacy can be a critical part of an individual maintaining their own security and we should not lose that.

The noble Baroness, Lady Stowell, was right again to talk about general monitoring. I am pleased that she found the WhatsApp briefing useful. I was unable to attend but I know from previous contact that there are people doing good work and it is sad that that often does not come out. We end up with this very polarised debate, which my noble friend Lord McNally was right to remind us is unhelpful. The people south of the river are often working very closely in the public interest with people in tech companies. Public rhetoric tends to focus on why more is not being done; there are very few thanks for what is being done. I would like to see the debate move a little more in that direction.

The noble Lord, Lord Knight, opened up a whole new world of pain with VPNs, which I am sure we will come back to. I say simply that if we get the regulatory frameworks right, most people in Britain will continue to use mainstream services as long as they are allowed to be offered. If those services are regulated by the European Union under its Digital Services Act, and by the UK and the US in a similar way, they will in effect have global standards, so it will not matter where you VPN from. The scenario the noble Lord painted, which I worry about, is where those mainstream services are not available and we drive people into small, new services that are not regulated by anyone. We would then end up inadvertently driving people back to the wild west that we complain about, when most of them would prefer to use mainstream services that are properly regulated by Ofcom, the European Commission and the US authorities.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
So having different terms of service for different types of service is healthy, but I also think that Ofcom making sure that people do what they say they do is a reasonably healthy development, as long as we recognise and accept the consequences of that.
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - -

My Lords, I am grateful for this short and focused debate, which has been helpful, and for the points made by the noble Lords, Lord Stevenson and Lord Allan, and the noble Baroness, Lady Kidron. I think we all share the same objective: ensuring that terms of service promote accountability and transparency, and empower users.

One of the Bill’s key objectives is to ensure that the terms of service of user-to-user platforms are suitable and effective. Under the Bill, companies will be required both to set out clearly how they will tackle illegal content and protect children and to ensure that their terms of service are properly enforced. The additional transparency and accountability duties on category 1 services will further ensure that users know what to expect on the largest platforms. This will put an end to these services arbitrarily removing content or, conversely, failing to remove content that they profess to prohibit.

The Bill will also ensure that search services are clear to their users about how they are complying with their adult and child safety duties under this new law. Given the very different way in which search services operate, however, this will be achieved through a publicly available statement rather than through terms of service. The two are intended to be distinct.

Noble Lords are right to point to the question of intelligibility. It struck me that, if it takes 10 days to read terms of service, perhaps we should have a race during the 10 days allotted to this Committee stage to see which is quicker—but I take the point. The noble Lord, Lord Allan, is also right that the further requirements imposed through this Bill will only add to that.

The noble Baroness, Lady Kidron, asked a fair question about what “accessibility” means. The Bill requires all platforms’ terms of service for illegal content and child safety duties to be clear and accessible. Ofcom will provide guidance on what that means, including ensuring that they are suitably prominent. The same applies to terms of service for category 1 services relating to content moderation.

I will focus first on Amendments 16, 21, 66DA, 75 and 197, which seek to ensure that both Ofcom and platforms consider the risks associated with platforms’ terms of service with regard to the illegal content and child safety duties in the Bill. We do not think that these amendments are needed. User-to-user services will already be required to assess the risks regarding their terms of service for illegal content. Clause 8 requires companies to assess the “design and operation” of a service in relation to illegal content. As terms of service are integral to how a service operates, they would be covered by this provision. Similarly, Clause 10 sets out that companies likely to be accessed by children will be required to assess the “design and operation” of a service as part of their child risk assessments, which would include the extent to which their terms of service may reduce or increase the risk of harm to children.

In addition to those risk assessment duties, the safety duties will require companies to take proportionate measures effectively to manage and mitigate the risk of harm to people whom they have identified through risk assessments. This will include making changes to their terms of service, if appropriate. The Bill does not impose duties on search services relating to terms of service, as search services’ terms of service play a less important role in determining how users can engage on a platform. I will explain this point further when responding to specific amendments relating to search services but I can assure the noble Lord, Lord Stevenson, that search services will have comprehensive duties to understand and mitigate how the design and operation of their service affects risk.

Amendment 197 would require Ofcom to assess how platforms’ terms of service affect the risk of harm that the sector presents to people. While I agree that this is an important risk factor which Ofcom must consider, it is already provided for in Clause 89, which requires Ofcom to undertake an assessment of risk across regulated services. That requires Ofcom to consider which characteristics of regulated services give rise to harm. Given how integral terms of service are to how many technology companies function, Ofcom will necessarily consider the risk associated with terms of service when undertaking that risk assessment.

However, elevating terms of service above other systems and processes, as mentioned in Clause 89, would imply that Ofcom needs to take account of the risk of harm on the regulated service, more than it needs to do so for other safety-by-design systems and processes or for content moderation processes, for instance. That may not be suitable, particularly as the service delivery methods will inevitably change over time. Instead, Clause 89 has been written to give Ofcom scope to organise its risk assessment, risk register and risk profiles as it thinks suitable. That is appropriate, given that it is best placed to develop detailed knowledge of the matters in question as they evolve over time.

Amendments 70, 71, 72, 79, 80, 81, 174 and 302 seek to replace the Bill’s references to publicly available statements, in relation to search services, with terms of service. This would mean that search services would have to publish how they are complying with their illegal content and child protection duties in terms of service rather than in publicly available statements. I appreciate the spirit in which the noble Lord has tabled and introduced these amendments. However, they do not consider the very different ways in which search services operate.

User-to-user services’ terms of service fulfil a very specific purpose. They govern a user’s behaviour on the service and set rules on what a user is allowed to post and how they can interact with others. If a user breaks these terms, a service can block his or her access or remove his or her content. Under the status quo, users have very few mechanisms by which to hold user-to-user platforms accountable to these terms, meaning that users can arbitrarily see their content removed with few or no avenues for redress. Equally, a user may choose to use a service because its terms and conditions lead them to believe that certain types of content are prohibited while in practice the company does not enforce the relevant terms.

The Bill’s duties relating to user-to-user services’ terms of service seek to redress this imbalance. They will ensure that people know what to expect on a platform and enable them to hold platforms accountable. In contrast, users of search services do not create content or interact with other users. Users can search for anything without restriction from the search service provider, although a search term may not always return results. It is therefore not necessary to provide detailed information on what a user can and cannot do on a search service. The existing duties on such services will ensure that search engines are clear to users about how they are complying with their safety duties. The Bill will require search services to set out how they are fulfilling them, in publicly available statements. Their actions must meet the standards set by Ofcom. Using these statements will ensure that search services are as transparent as user-to-user services about how they are complying with their safety duties.

The noble Lord’s Amendment 174 also seeks to expand the transparency reporting requirements to cover the scope and application of the terms of service set out by search service providers. This too is unnecessary because, via Schedule 8, the Bill already ensures transparency about the scope and application of the provisions that search services must make publicly available. I hope that gives the noble Lord some reassurance that the concerns he has raised are already covered. With that, I invite him to withdraw Amendment 16.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I am very grateful to the Minister for that very detailed response, which I will have to read very carefully because it was quite complicated. That is the answer to my question. Terms of service will not be very easy to identify because, to answer my questions, he has had to pray in aid issues that Ofcom will necessarily have to assess—terms of service—to get at whether the companies are performing the duties that the Bill requires of them.

I will not go further on that. We know that there will be enough there to answer the main questions I had about this. I take the point about search being distinctively different in this area, although a tidy mind like mine likes to see all these things in one place and understand all the words. Every time I see “publicly available statement”, I do not know why but I think about people being hanged in public rather than a term of service or a contract.

--- Later in debate ---
Moved by
16A: Clause 8, page 7, line 23, after “19(2)” insert “and (8A)”
Member’s explanatory statement
This amendment inserts a signpost to the new duty in clause 19 about supplying records of risk assessments to OFCOM.
--- Later in debate ---
Moved by
16B: Clause 9, page 7, line 27, leave out “all”
Member’s explanatory statement
This is a technical amendment needed because the new duty to summarise illegal content risk assessments in the terms of service (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 1 services.
--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, we seem to have done it again—a very long list of amendments in a rather ill-conceived group has generated a very interesting discussion. We are getting quite good at this, exchanging views across the table, across the Committee, even within the Benches—Members who perhaps have not often talked together are sharing ideas and thoughts, and that is a wonderful feeling.

I want to start with an apology. I think I may be the person who got the noble Baroness, Lady Kidron, shopped by the former leader—once a leader, always a leader. What I thought I was being asked was whether the Committee would be interested in hearing the views of the noble Viscount who could not be present, and I was very keen, because when he does speak it is from a point of view that we do not often hear. I did not know that it was a transgression of the rules—but of course it is not, really, because we got round it. Nevertheless, I apologise for anything that might have upset the noble Baroness’s blood pressure—it did not stop her making a very good contribution later.

We have covered so much ground that I do not want to try and summarise it in one piece, because you cannot do that. The problem with the group as it stands is that the right reverend Prelate the Bishop of Derby and myself must have some secret connection, because we managed to put down almost the same amendments. They were on issues that then got overtaken by the Minister, who finally got round to—I mean, who put down a nice series of amendments which exactly covered the points we made, so we can lose all those. But this did not stop the right reverend Prelate the Bishop of Guildford making some very good additional points which I think we all benefited from.

I welcome back the noble Baroness, Lady Buscombe, after her illness; she gave us a glimpse of what is to come from her and her colleagues, but I will leave the particular issue that she raised for the Minister to respond to. It raises an issue that I am not competent on, but it is a very important one—we need to get the right balance between what is causing the alarm and difficulty outside in relation to what is happening on the internet, and I think we all agree with her that we should not put any barrier in the way of dealing with that.

Indeed, that was the theme of a number of the points that have been raised on the question of what is or can constitute illegal content, and how we judge it. It is useful to hear again from the master about how you do it in practice. I cannot imagine being in a room of French lawyers and experts and retaining my sanity, let alone making decisions that affect the ability of people to carry on, but the noble Lord did it; he is still here and lives to tell the tale—bearded or otherwise.

The later amendments, particularly from the noble Lord, Lord Clement-Jones, are taking us round in a circle towards the process by which Ofcom will exercise the powers that it is going to get in this area. These are probably worth another debate on their own, and maybe it will come up in a different form, because—I think the noble Baroness, Lady Stowell, made this point as well—there is a problem in having an independent regulator that is also the go-to function for getting advice on how others have to make decisions that are theirs to rule on at the end if they go wrong. That is a complicated way of saying that we may be overloading Ofcom if we also expect it to provide a reservoir of advice on how you deal with the issues that the Bill puts firmly on the companies—I agree that this is a problem that we do not really have an answer to.

My amendments were largely overtaken by the Government’s amendments, but the main one I want to talk about was Amendment 272. I am sorry that the noble Baroness, Lady Morgan, is not here, because her expertise is in an area that I want to talk about, which is fraud—cyber fraud in particular—and how that is going to be brought into the Bill. The issue, which I think was raised by Which?, although a number of other people have also written to us about it, is that the Bill, in Clauses 170 and 171, is trying to establish how a platform should identify illegal content in relation to fraud—but it is quite prescriptive. In particular, it goes into some detail which I will leave for the Minister to respond to, but uniquely it sets out a specific way for gathering information to determine whether content is illegal in this area, although it may have applicability in other areas.

One of the points that have to be taken into account is whether the platform is using human moderators, automated systems or a combination of the two. I am not quite sure why that is there in the Bill; that is really the basis for the tabling of our amendments. Clearly, one would hope that the end result is whether or not illegality has taken place, not how that information has been gathered. If one must make concessions to the process of law because a judgment is made that, because it is automated, it is in some way not as valid as if it had been done by a human moderator, there seems to be a whole world there that we should not be going into. I certainly hope that that is not going to be the case if we are talking about illegality concerning children or other vulnerable people, but that is how the Bill reads at present; I wonder whether the Minister can comment on that.

There is a risk of consumers being harmed here. The figures on fraud in the United Kingdom are extraordinary; the fact that it is not the top priority for everybody, let alone the Government, is extraordinary. It is something like the equivalent of consumers being scammed at the rate of around £7.5 billion per year. A number of awful types of scamming have emerged only because of the internet and social media. They create huge problems of anxiety and emotional distress, with lots of medical care and other things tied in if you want to work out the total bill. So we have a real problem here that we need to settle. It is great that it is in the Bill, but it would be a pity if the movement towards trying to resolve it is in any way infringed on by there being imperfect instructions in the Bill. I wonder whether the Minister would be prepared to respond to that; I would be happy to discuss it with him later, if that is possible.

As a whole, this is an interesting question as we move away from what a crime is towards how people judge how to deal with what they think is a crime but may not be. The noble Lord, Lord Allan, commented on how to do it in practice but one hopes that any initial problems will be overcome as we move forward and people become more experienced with this.

When the Joint Committee considered this issue, we spent a long time talking about why we were concerned about having certainty on the legal prescription in the Bill; that is why we were very much against the idea of “legal but harmful” because it seemed too subjective and too subject to difficulties. Out of that came another thought, which answers the point made by the noble Baroness, Lady Stowell: so much of this is about fine judgments on certain things that are there in stone and that you can work to but you then have to interpret them.

There is a role for Parliament here, I think; we will come on to this in later amendments but, if there is a debate to be had on this, let us not forget the points that have been made here today. If we are going to think again about Ofcom’s activity in practice, that is the sort of thing that either a Joint Committee or Select Committees of the two Houses could easily take on board as an issue that needs to be reflected on, with advice given to Parliament about how it might be taken forward. This might be the answer in the medium term.

In the short term, let us work to the Bill and make sure that it works. Let us learn from the experience but let us then take time out to reflect on it; that would be my recommendation but, obviously, that will be subject to the situation after we finish the Bill. I look forward to hearing the Minister’s response.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, as well as throwing up some interesting questions of law, this debate has provoked some interesting tongue-twisters. The noble Lord, Lord Allan of Hallam, offered a prize to the first person to pronounce the Netzwerkdurchsetzungsgesetz; I shall claim my prize in our debate on a later group when inviting him to withdraw his amendment.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Yes, that would be welcome.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

Can I suggest one of mine?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I thank the noble Lord.

I was pleased to hear about Wicipedia Cymraeg—there being no “k” in Welsh. As the noble Lord, Lord Stevenson, said, there has been a very good conversational discussion in this debate, as befits Committee and a self-regulating House. My noble friend Lady Stowell is right to point out matters of procedure, although we were grateful to know why the noble Viscount, Lord Colville, supports the amendments in question.

--- Later in debate ---
Lord Moylan Portrait Lord Moylan (Con)
- View Speech - Hansard - - - Excerpts

Or indeed any evidence.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I take the noble Lord’s point and my noble friend’s further contribution. I will see whether I can give a clearer and more succinct description in writing to flesh that out, but that is the reason we have alighted on the words that we have.

The noble Lord, Lord Allan, also asked about jurisdiction. If an offence has been committed in the UK and viewed by a UK user, it can be treated as illegal content. That is set out in Clause 53(11), which says:

“For the purposes of determining whether content amounts to an offence, no account is to be taken of whether or not anything done in relation to the content takes place in any part of the United Kingdom”.


I hope that that bit, at least, is clearly set out to the noble Lord’s satisfaction. It looks like it may not be.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

Again, I think that that is clear. I understood from the Bill that, if an American says something that would be illegal were they to be in the United Kingdom, we would still want to exclude that content. But that still leaves it open, and I just ask the question again, for confirmation. If all of the activities are outside the United Kingdom—Americans talking to each other, as it were—and a British person objects, at what point would the platform be required to restrict the content of the Americans talking to each other? Is it pre-emptively or only as and when somebody in the United Kingdom objects to it? We should flesh out that kind of practical detail before this becomes law.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

If it has been committed in the UK and is viewed by a UK user, it can be treated as illegal. I will follow up on the noble Lord’s further points ahead of the next stage.

Amendment 272 explicitly provides that relevant information that is reasonably available to a provider includes information submitted by users in complaints. Providers will already need to do this when making judgments about content, as it will be both relevant and reasonably available.

My noble friend Lord Moylan returned to the question that arose on day 2 in Committee, querying the distinction between “protect” and “prevent”, and suggesting that a duty to protect would or could lead to the excessive removal of content. To be clear, the duty requires platforms to put in place proportionate systems and processes designed to prevent users encountering content. I draw my noble friend’s attention to the focus on systems and processes in that. This requires platforms to design their services to achieve the outcome of preventing users encountering such content. That could include upstream design measures, as well as content identification measures, once content appears on a service. By contrast, a duty to protect is a less stringent duty and would undermine the proactive nature of the illegal content duties for priority offences.

Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

Before he moves on, is my noble friend going to give any advice to, for example, Welsh Wikipedia, as to how it will be able to continue, or are the concerns about smaller sites simply being brushed aside, as my noble friend explicates what the Bill already says?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I will deal with all the points in the speech. If I have not done so by the end, and if my noble friend wants to intervene again, I would be more than happy to hear further questions, either to answer now or write to him about.

Amendments 128 to 133 and 143 to 153, in the names of the right reverend Prelate the Bishop of Derby and the noble Lord, Lord Stevenson of Balmacara, seek to ensure that priority offences relating to modern slavery and human trafficking, where they victimise children, are included in Schedule 6. These amendments also seek to require technology companies to report content which relates to modern slavery and the trafficking of children—including the criminal exploitation of children—irrespective of whether it is sexual exploitation or not. As noble Lords know, the strongest provisions in the Bill relate to children’s safety, and particularly to child sexual exploitation and abuse content. These offences are captured in Schedule 6. The Bill includes a power for Ofcom to issue notices to companies requiring them to use accredited technology or to develop new technology to identify, remove and prevent users encountering such illegal content, whether communicated publicly or privately.

These amendments would give Ofcom the ability to issue such notices for modern slavery content which affects children, even when there is no child sexual exploitation or abuse involved. That would not be appropriate for a number of reasons. The power to tackle illegal content on private communications has been restricted to the identification of content relating to child sexual exploitation and abuse because of the particular risk to children posed by content which is communicated privately. Private spaces online are commonly used by networks of criminals to share illegal images—as we have heard—videos, and tips on the commitment of these abhorrent offences. This is highly unlikely to be reported by other offenders, so it will go undetected if companies do not put in place measures to identify it. Earlier in Committee, the noble Lord, Lord Allan, suggested that those who receive it should report it, but of course, in a criminal context, a criminal recipient would not do that.

Extending this power to cover the identification of modern slavery in content which is communicated privately would be challenging to justify and could represent a disproportionate intrusion into someone’s privacy. Furthermore, modern slavery is usually identified through patterns of behaviour or by individual reporting, rather than through content alone. This reduces the impact that any proactive technology required under this power would have in tackling such content. Schedule 6 already sets out a comprehensive list of offences relating to child sexual exploitation and abuse which companies must tackle. If these offences are linked to modern slavery—for example, if a child victim of these offences has been trafficked—companies must take action. This includes reporting content which amounts to an offence under Schedule 6 to the National Crime Agency or another reporting body outside of the UK.

My noble friend Lord Moylan’s Amendment 135 seeks to remove the offence in Section 5 of the Public Order Act 1986 from the list of priority offences. His amendment would mean that platforms were not required to take proactive measures to reduce the risk of content which is threatening or abusive, and intended to cause a user harassment, alarm or distress, from appearing on their service. Instead, they would be obliged to respond only once they are made aware of the content, which would significantly reduce the impact of the Bill’s framework for tackling such threatening and abusive content. Given the severity of the harm which can be caused by that sort of content, it is right that companies tackle it. Ofcom will have to include the Public Order Act in its guidance about illegal content, as provided for in Clause 171.

Government Amendments 136A to 136C seek to strengthen the illegal content duties by adding further priority offences to Schedule 7. Amendments 136A and 136B will add human trafficking and illegal entry offences to the list of priority offences in the Bill. Crucially, this will mean that platforms will need to take proactive action against content which encourages or assists others to make dangerous, illegal crossings of the English Channel, as well as those who use social media to arrange or facilitate the travel of another person with a view to their exploitation.

The noble Lord, Lord Allan, asked whether these amendments would affect the victims of trafficking themselves. This is not about going after the victims. Amendment 136B addresses only content which seeks to help or encourage the commission of an existing immigration offence; it will have no impact on humanitarian communications. Indeed, to flesh out a bit more detail, Section 2 of the Modern Slavery Act makes it an offence to arrange or facilitate the travel of another person, including through recruitment, with a view to their exploitation. Facilitating a victim’s travel includes recruiting them. This offence largely appears online in the form of advertisements to recruit people into being exploited. Some of the steps that platforms could put in place include setting up trusted flagger programmes, signposting users to support and advice, and blocking known bad actors. Again, I point to some of the work which is already being done by social media companies to help tackle both illegal channel crossings and human trafficking.

--- Later in debate ---
Moved by
18A: Clause 9, page 8, line 23, at end insert—
“(8A) A duty to summarise in the terms of service the findings of the most recent illegal content risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to individuals).”
Member’s explanatory statement
This amendment requires providers of Category 1 services to summarise (in their terms of service) the findings of their latest risk assessment regarding illegal content and activity. The limitation to Category 1 services is achieved by an amendment in the name of the Minister to clause 6.
--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, over the last few hours I have praised us for having developed a style of discussion and debate that is certainly relatively new and not often seen in the House, where we have tried to reach out to each other and find common ground. That was not a problem in this last group of just over an hour; I think we are united around the themes that were so brilliantly introduced in a very concise and well-balanced speech by the noble Baroness, Lady Kidron, who has been a leading and inspirational force behind this activity for so long.

Although different voices have come in at different times and asked questions that still need to be answered, I sense that we have reached a point in our thinking, if not in our actual debates, where we need a plan. I too reached this point; that was exactly the motivation I had in tabling Amendment 1, which was discussed on the first day. Fine as the Bill is—it is a very impressive piece of work in every way—it lacks what we need as a Parliament to convince others that we have understood the issues and have the answers to their questions about what this Government, or this country as a whole, are going to do about this tsunami of difference, which has arrived in the wake of the social media companies and search engines, in the way we do our business and live our lives these days. There is consensus, but it is slightly different to the consensus we had in earlier debates, where we were reassuring ourselves about the issues we were talking about but were not reaching out to the Government to change anything so much as being happy that we were speaking the same language and that they were in the same place as we are gradually coming to as a group, in a way.

Just before we came back in after the lunch break, I happened to talk to the noble Lord, Lord Grade, who is the chair of Ofcom and is listening to most of our debates and discussions when his other duties allow. I asked him what he thought about it, and he said that it was fascinating for him to recognise the level of expertise and knowledge that was growing up in the House, and that it would be a useful resource for Ofcom in the future. He was very impressed by the way in which everyone was engaging and not getting stuck in the niceties of the legislation, which he admitted he was experiencing himself. I say that softly; I do not want to embarrass him in any way because he is an honourable man. However, the point he makes is really important.

I say to the Minister that I do not think we are very far apart on this. He knows that, because we have discussed it at some length over the last six to eight weeks. What I think he should take away from this debate is that this is a point where a decision has to be taken about whether the Government are going to go with the consensus view being expressed here and put deliberately into the Bill a repetitive statement, but one that is clear and unambiguous, about the intention behind the Government’s reason for bringing forward the Bill and for us, the Opposition and other Members of this House, supporting it, which is that we want a safe internet for our children. The way we are going to do that is by having in place, up front and clearly in one place, the things that matter when the regulatory structure sits in place and has to deal with the world as it is, of companies with business plans and business models that are at variance with what we think should be happening and that we know are destroying the lives of people we love and the future of our country—our children—in a way that is quite unacceptable when you analyse it down to its last detail.

It is not a question of saying back to us across the Dispatch Box—I know he wants to but I hope he will not—“Everything that you have said is in the Bill; we don’t need to go down this route, we don’t need another piece of writing that says it all”. I want him to forget that and say that actually it will be worth it, because we will have written something very special for the world to look at and admire. It is probably not in its perfect form yet, but that is what the Government can do: take a rough and ready potential diamond, polish it, chamfer it, and bring it back and set it in a diadem we would all be proud to wear—Coronations excepted—so that we can say, “Look, we have done the dirty work here. We’ve been right down to the bottom and thought about it. We’ve looked at stuff that we never thought in our lives we would ever want to see and survived”.

I shake at some of the material we were shown that Molly Russell was looking at. But I never want to be in a situation where I will have to say to my children and grandchildren, “We had the chance to get this right and we relied on a wonderful piece of work called the Online Safety Act 2023; you will find it in there, but it is going to take you several weeks and a lot of mental harm and difficulty to understand what it means”.

So, let us make it right. Let us not just say “It’ll be alright on the night”. Let us have it there. It is almost right but, as my noble friend Lord Knight said, it needs to be patched back into what is already in the Bill. Somebody needs to look at it and say, “What, out of that, will work as a statement to the world that we care about our kids in a way that will really make a difference?” I warn the Minister that, although I said at Second Reading that I wanted to see this Bill on the statute book as quickly as possible, I will not accept a situation where we do not have more on this issue.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

I am grateful to all noble Lords who have spoken on this group and for the clarity with which the noble Lord, Lord Stevenson, has concluded his remarks.

Amendments 20, 74, 93 and 123, tabled by the noble Baroness, Lady Kidron, would mean a significant revising of the Bill’s approach to content that is harmful to children. It would set a new schedule of harmful content and risk to children—the 4 Cs—on the face of the Bill and revise the criteria for user-to-user and search services carrying out child safety risk assessments.

I start by thanking the noble Baroness publicly—I have done so privately in our discussions—for her extensive engagement with the Government on these issues over recent weeks, along with my noble friends Lord Bethell and Lady Harding of Winscombe. I apologise that it has involved the noble Baroness, Lady Harding, missing her stop on the train. A previous discussion we had also very nearly delayed her mounting a horse, so I can tell your Lordships how she has devoted hours to this—as they all have over recent weeks. I would like to acknowledge their campaigning and the work of all organisations that the noble Baroness, Lady Kidron, listed at the start of her speech, as well as the families of people such as Olly Stephens and the many others that the right reverend Prelate the Bishop of Oxford mentioned.

I also reassure your Lordships that, in developing this legislation, the Government carried out extensive research and engagement with a wide range of interested parties. That included reviewing international best practice. We want this to be world-leading legislation, including the four Cs framework on the online risks of harm to children. The Government share the objectives that all noble Lords have echoed in making sure that children are protected from harm online. I was grateful to the noble Baroness, Lady Benjamin, for echoing the remarks I made earlier in Committee on this. I am glad we are on the same page, even if we are still looking at points of detail, as we should be.

As the noble Baroness, Lady Kidron, knows, it is the Government’s considered opinion that the Bill’s provisions already deliver these objectives. I know that she remains to be convinced, but I am grateful to her for our continuing discussions on that point, and for continuing to kick the tyres on this to make sure that this is indeed legislation of which we can be proud.

It is also clear that there is broad agreement across the House that the Bill should tackle harmful content to children such as content that promotes eating disorders, illegal behaviour such as grooming and risk factors for harm such as the method by which content is disseminated, and the frequency of alerts. I am pleased to be able to put on record that the Bill as drafted already does this in the Government’s opinion, and reflects the principles of the four Cs framework, covering each of those: content, conduct, contact and commercial or contract risks to children.

First, it is important to understand how the Bill defines content, because that question of definition has been a confusing factor in some of the discussions hitherto. When we talk in general terms about content, we mean the substance of a message. This has been the source of some confusion. The Bill defines “content”, for the purposes of this legislation, in Clause 207 extremely broadly as

“anything communicated by means of an internet service”.

Under this definition, in essence, all user communication and activity, including recommendations by an algorithm, interactions in the metaverse, live streams, and so on, is facilitated by “content”. So, for example, unwanted and inappropriate contact from an adult to a child would be treated by the Bill as a content harm. The distinctions that the four Cs make between content, conduct and contact risks are therefore not necessary. For the purposes of the Bill, they are all content risks.

Secondly, I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

Where are the commercial harms? I cannot totally get my head around my noble friend’s definition of content. I can sort of understand how it extends to conduct and contact, but it does not sound as though it could extend to the algorithm itself that is driving the addictive behaviour that most of us are most worried about.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

In that vein, will the noble Lord clarify whether that definition of content does not include paid-for content?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I was about to list the four Cs briefly in order, which will bring me on to commercial or contract risk. Perhaps I may do that and return to those points.

I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill. In terms of the four Cs category of content risks, there are specific duties for providers to protect children from illegal content, such as content that intentionally assists suicide, as well as content that is harmful to children, such as pornography. Regarding conduct risks, the child safety duties cover harmful conduct or activity such as online bullying or abuse and, under the illegal content safety duties, offences relating to harassment, stalking and inciting violence.

With regard to commercial or contract risks, providers specifically have to assess the risks to children from the design and operation of their service, including their business model and governance under the illegal content and child safety duties. In relation to contact risks, as part of the child safety risk assessment, providers will need specifically to assess contact risks of functionalities that enable adults to search for and contact other users, including children, in a way that was set out by my noble friend Lord Bethell. This will protect children from harms such as harassment and abuse, and, under the illegal content safety duties, all forms of child sexual exploitation and abuse, including grooming.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

I agree that content, although unfathomable to the outside world, is defined as the Minister says. However, does that mean that when we see that

“primary priority content harmful to children”

will be put in regulations by the Secretary of State under Clause 54(2)—ditto Clause 54(3) and (4)—we will see those contact risks, conduct risks and commercial risks listed as primary priority, priority and non-designated harms?

I do not want to make my speech twice, but in my final sentence I said that my challenge to the Government is to have a very simple way forward by other means, if those things were articulated, but my understanding is that they are to bring forward content harms that describe only content as we normally believe it.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I have tried to outline the Bill’s definition of content, which I think will give some reassurance that other concerns that noble Lords have raised are covered. I will turn in a moment to address priority and primary priority content, if the noble Baroness will allow me to do that, and then perhaps intervene again if I have not done so to her satisfaction. I want to set that out and try to keep track of all the questions which have been posed as I do so.

For now, I know there have been concerns from some noble Lords that if functionalities are not labelled as harm in the legislation they would not be addressed by providers, and I reassure your Lordships’ House that this is not the case. There is an important distinction between content and other risk factors such as, for instance, an algorithm, which without content cannot risk causing harm to a child. That is why functionalities are not covered by the categories of primary, priority and priority content which is harmful to children. The Bill sets out a comprehensive risk assessment process which will cover content or activity that poses a risk of harm to children and other factors, such as functionality, which may increase the risk of harm. As such, the existing children’s risk assessment criteria already cover many of the changes proposed in this amendment. For example, the duties already require service providers to assess the risk of harm to children from their business model and governance. They also require providers to consider how a comprehensive range of functionalities affect risk, how the service is used and how the use of algorithms could increase the risks to children.

Turning to the examples of harmful content set out in the proposed new schedule, I am happy to reassure the noble Baroness and other noble Lords that the Government’s proposed list of primary, priority and priority content covers a significant amount of this content. In her opening speech she asked about cumulative harm—that is, content sent many times or content which is harmful due to the manner of its dissemination. We will look at that in detail on the next group as well, but I will respond to the points she made earlier now. The definition of harm in the Bill under Clause 205 makes it clear that physical or psychological harm may arise from the fact or manner of dissemination of the content, not just the nature of the content—content which is not harmful per se, but which if sent to a child many times, for example by an algorithm, would meet the Bill’s threshold for content that is harmful to children. Companies will have to consider this as a fundamental part of their risk assessment, including, for example, how the dissemination of content via algorithmic recommendations may increase the risk of harm, and they will need to put in place proportionate and age-appropriate measures to manage and mitigate the risks they identify. I followed the exchanges between the noble Baronesses, Lady Kidron and Lady Fox, and I make it clear that the approach set out by the Bill will mean that companies cannot avoid tackling the kind of awful content which Molly Russell saw and the harmful algorithms which pushed that content relentlessly at her.

This point on cumulative harm was picked up by my noble friend Lord Bethell. The Bill will address cumulative risk where it is the result of a combination of high-risk functionality, such as live streaming, or rewards in service by way of payment or non-financial reward. This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risk in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children. The actions that companies will be required to take under their risk assessment duties in the Bill, and the safety measures they will be required to put in place to manage the service’s risks, will consider this bigger-picture risk profile.

The amendments of the noble Baroness, Lady Kidron, would remove references to primary priority and priority harmful content to children from the child risk assessment duties, which we fear would undermine the effectiveness of the child safety duties as currently drafted. That includes the duty for user-to-user providers to prevent children encountering primary priority harms, such as pornography and content that promotes self-harm or suicide, as well as the duty to put in place age-appropriate measures to protect children from other harmful content and activity. As a result, we fear these amendments could remove the requirement for an age-appropriate approach to protecting children online and make the requirement to prevent children accessing primary priority content less clear.

The noble Baroness, Lady Kidron, asked in her opening remarks about emerging harms, which she was right to do. As noble Lords know, the Bill has been designed to respond as rapidly as possible to new and emerging harms. First, the primary priority and priority list of content can be updated by the Secretary of State. Secondly, it is important to remember the function of non-designated content that is harmful to children in the Bill—that is content that meets the threshold of harmful content to children but is not on the lists designated by the Government. Companies are required to understand and identify this kind of content and, crucially, report it to Ofcom. Thirdly, this will inform the actions of Ofcom itself in its review and report duties under Clause 56, where it is required to review the incidence of harmful content and the severity of harm experienced by children as a result of it. This is not limited to content that the Government have listed as being harmful, as it is intended to capture new and emerging harms. Ofcom will be required to report back to the Government with recommendations on changes to the primary priority and priority content lists.

I turn to the points that the noble Lord, Lord Knight of Weymouth, helpfully raised earlier about things that are in the amendments but not explicitly mentioned in the Bill. As he knows, the Bill has been designed to be tech-neutral, so that it is future-proof. That is why there is no explicit reference to the metaverse or virtual or augmented reality. However, the Bill will apply to service providers that enable users to share content online or interact with each other, as well as search services. That includes a broad range of services such as websites, applications, social media sites, video games and virtual reality spaces such as the metaverse; those are all captured. Any service that allows users to interact, as the metaverse does, will need to conduct a children’s access assessment and comply with the child safety duties if it is likely to be accessed by children.

Amendment 123 from the noble Baroness, Lady Kidron, seeks to amend Clause 48 to require Ofcom to create guidance for Part 3 service providers on this new schedule. For the reasons I have just set out, we do not think it would be workable to require Ofcom to produce guidance on this proposed schedule. For example, the duty requires Ofcom to provide guidance on the content, whereas the proposed schedule includes examples of risky functionality, such as the frequency and volume of recommendations.

I stress again that we are sympathetic to the aim of all these amendments. As I have set out, though, our analysis leads us to believe that the four Cs framework is simply not compatible with the existing architecture of the Bill. Fundamental concepts such as risk, harm and content would need to be reconsidered in the light of it, and that would inevitably have a knock-on effect for a large number of clauses and timing. The Bill has benefited from considerable scrutiny—pre-legislative and in many discussions over many years. The noble Baroness, Lady Kidron, has been a key part of that and of improving the Bill. The task is simply unfeasible at this stage in the progress of the Bill through Parliament and risks delaying it, as well as significantly slowing down Ofcom’s implementation of the child safety duties. We do not think that this slowing down is a risk worth taking, because we believe the Bill already achieves what is sought by these amendments.

Even so, I say to the Committee that we have listened to the noble Baroness, Lady Kidron, and others and have worked to identify changes which would further address these concerns. My noble friend Lady Harding posed a clear question: if not this, what would the Government do instead? I am pleased to say that, as a result of the discussions we have had, the Government have decided to make a significant change to the Bill. We will now place the categories of primary priority and priority content which is harmful to children on the face of the Bill, rather than leaving them to be designated in secondary legislation, so Parliament will have its say on them.

We hope that this change will reassure your Lordships that protecting children from the most harmful content is indeed the priority for the Bill. That change will be made on Report. We will continue to work closely with the noble Baroness, Lady Kidron, my noble friends and others, but I am not able to accept the amendments in the group before us today. With that, I hope that she will be willing to withdraw.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I thank all the speakers. There were some magnificent speeches and I do not really want to pick out any particular ones, but I cannot help but say that the right reverend Prelate described the world without the four Cs. For me, that is what everybody in the Box and on the Front Bench should go and listen to.

I am grateful and pleased that the Minister has said that the Government are moving in this direction. I am very grateful for that but there are a couple of things that I have to come back on. First, I have swiftly read Amendment 205’s definition of harm and I do not think it says that you do not have to reach a barrier of harm; dissemination is quite enough. There is always the problem of what the end result of the harm is. The thing that the Government are not listening to is the relationship between the risk assessment and the harm. It is about making sure that we are clear that it is the functionality that can cause harm. I think we will come back to this at another point, but that is what I beg them to listen to. Secondly, I am not entirely sure that it is correct to say that the four Cs mean that you cannot have primary priority, priority and so on. That could be within the schedule of content, so those two things are not actually mutually exclusive. I would be very happy to have a think about that.

What was not addressed in the Minister’s answer was the point made by the noble Lord, Lord Allan of Hallam, in supporting the proposal that we should have in the schedule: “This is what you’ve got to do; this is what you’ve got to look at; this is what we’re expecting of you; and this is what Parliament has delivered”. That is immensely important, and I was so grateful to the noble Lord, Lord Stevenson, for putting his marker down on this set of amendments. I am absolutely committed to working alongside him and to finding ways around this, but we need to find a way of stating it.

Ironically, that is my answer to both the noble Baronesses, Lady Ritchie and Lady Fox: we should have our arguments here and now, in this Chamber. I do not wish to leave it to the Secretary of State, whom I have great regard for, as it happens, but who knows: I have seen a lot of Secretaries of State. I do not even want to leave it to the Minister, because I have seen a lot of Ministers too—ditto Ofcom, and definitely not the tech sector. So here is the place, and we are the people, to work out the edges of this thing.

Not for the first time, my friend, the noble Baroness, Lady Harding, read out what would have been my answer to the noble Baroness, Lady Ritchie. I have gone round and round, and it is like a Marx Brothers movie: in the end, harm is defined by subsection (4)(c), but that says that harm will be defined by the Secretary of State. It goes around like that through the Bill.

--- Later in debate ---
Moved by
21A: Clause 10, page 10, line 1, after “19(2)” insert “and (8A)”
Member’s explanatory statement
This amendment inserts a signpost to the new duty in clause 19 about supplying records of risk assessments to OFCOM.
--- Later in debate ---
Moved by
22A: Clause 11, page 10, line 6, at end insert “(as indicated by the headings).”
Member’s explanatory statement
This amendment provides clarification because the new duty to summarise children’s risk assessments in the terms of service (see the amendment in the Minister’s name inserting new subsection (10A) below) is imposed only on providers of Category 1 services.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

I am grateful, as ever, to the noble Baroness, and I hope that has assisted the noble Lord, Lord Vaizey.

Finally—just about—I will speak to Amendment 32A, tabled in my name, about VPNs. I was grateful to the noble Baroness for her comments. In many ways, I wanted to give the Minister the opportunity to put something on the record. I understand, and he can confirm whether my understanding is correct, that the duties on the platforms to be safe apply regardless of whether a VPN has been used to access the systems and the content. The platforms, the publishers of content that are user-to-user businesses, will have to detect whether a VPN is being used, one would suppose, in order to ensure that children are being protected and that a user is genuinely a child. Is that a correct interpretation of how the Bill works? If so, is it technically realistic for those platforms to be able to detect whether someone is landing on their site via a VPN or otherwise? In my mind, the anecdote that the noble Baroness, Lady Harding, related, about what the App Store algorithm on Apple had done in pushing VPNs when looking for porn, reinforces the need for app stores to come within scope, so that we can get some of that age filtering at that distribution point, rather than just relying on the platforms.

Substantially, this group is about platforms anticipating harms, not reviewing them and then fixing them despite their business model. If we can get the platforms themselves designing for children’s safety and then working out how to make the business models work, rather than the other way around, we will have a much better place for children.

Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - -

My Lords, I join in the chorus of good wishes to the bungee-jumping birthday Baroness, Lady Kidron. I know she will not have thought twice about joining us today in Committee for scrutiny of the Bill, which is testament to her dedication to the cause of the Bill and, more broadly, to protecting children online. The noble Lord, Lord Clement-Jones, is right to note that we have already had a few birthdays along the way; I hope that we get only one birthday each before the Bill is finished.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My birthday is in October, so I hope not.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Very good—only one each, and hopefully fewer. I thank noble Lords for the points they raised in the debate on these amendments. I understand the concerns raised about how the design and operation of services can contribute to risk and harm online.

The noble Lord, Lord Russell, was right, when opening this debate, that companies are very successful indeed at devising and designing products and services that people want to use repeatedly, and I hope to reassure all noble Lords that the illegal and child safety duties in the Bill extend to how regulated services design and operate their services. Providers with services that are likely to be accessed by children will need to provide age-appropriate protections for children using their service. That includes protecting children from harmful content and activity on their service. It also includes reviewing children’s use of higher-risk features, such as live streaming or private messaging. Service providers are also specifically required to consider the design of functionalities, algorithms and other features when delivering the child safety duties imposed by the Bill.

I turn first to Amendments 23 and 76 in the name of the noble Lord, Lord Russell. These would require providers to eliminate the risk of harm to children identified in the service’s most recent children’s risk assessment, in addition to mitigating and managing those risks. The Bill will deliver robust and effective protections for children, but requiring providers to eliminate the risk of harm to children would place an unworkable duty on providers. As the noble Baroness, Lady Fox, my noble friend Lord Moylan and others have noted, it is not possible to eliminate all risk of harm to children online, just as it is not possible entirely to eliminate risk from, say, car travel, bungee jumping or playing sports. Such a duty could lead to service providers taking disproportionate measures to comply; for instance, as noble Lords raised, restricting children’s access to content that is entirely appropriate for them to see.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

Does the Minister accept that that is not exactly what we were saying? We were not saying that they would have to eliminate all risk: they would have to design to eliminate risks, but we accept that other risks will apply.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

It is part of the philosophical ruminations that we have had, but the point here is that elimination is not possible through the design or any drafting of legislation or work that is there. I will come on to talk a bit more about how we seek to minimise, mitigate and manage risk, which is the focus.

Amendments 24, 31, 32, 77, 84, 85 and 295, from the noble Lord, Lord Russell, seek to ensure that providers do not focus just on content when fulfilling their duties to mitigate the impact of harm to children. The Bill already delivers on those objectives. As the noble Baroness, Lady Kidron, noted, it defines “content” very broadly in Clause 207 as

“anything communicated by means of an internet service”.

Under this definition, in essence, all communication and activity is facilitated by content.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I hope that the Minister has in his brief a response to the noble Baroness’s point about Clause 11(14), which, I must admit, comes across extraordinarily in this context. She quoted it, saying:

“The duties set out … are to be taken to extend only to content that is harmful to children where the risk of harm is presented by the nature of the content (rather than the fact of its dissemination)”.


Is not that exception absolutely at the core of what we are talking about today? It is surely therefore very difficult for the Minister to say that this applies in a very broad way, rather than purely to content.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I will come on to talk a bit about dissemination as well. If the noble Lord will allow me, he can intervene later on if I have not done that to his satisfaction.

I was about to talk about the child safety duties in Clause 11(5), which also specifies that they apply to the way that a service is designed, how it operates and how it is used, as well as to the content facilitated by it. The definition of content makes it clear that providers are responsible for mitigating harm in relation to all communications and activity on their service. Removing the reference to content would make service providers responsible for all risk of harm to children arising from the general operation of their service. That could, for instance, bring into scope external advertising campaigns, carried out by the service to promote its website, which could cause harm. This and other elements of a service’s operations are already regulated by other legislation.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I apologise for interrupting. Is that the case, and could that not be dealt with by defining harm in the way that it is intended, rather than as harm from any source whatever? It feels like a big leap that, if you take out “content”, instead of it meaning the scope of the service in its functionality and content and all the things that we have talked about for the last hour and a half, the suggestion is that it is unworkable because harm suddenly means everything. I am not sure that that is the case. Even if it is, one could find a definition of harm that would make it not the case.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Taking it out in the way that the amendment suggests throws up that risk. I am sure that it is not the intention of the noble Lord or the noble Baroness in putting it, but that is a risk of the drafting, which requires some further thought.

Clause 11(2), which is the focus of Amendments 32, 85 and 295, already means that platforms have to take robust action against content which is harmful because of the manner of its dissemination. However, it would not be feasible for providers to fulfil their duties in relation to content which is harmful only by the manner of its dissemination. This covers content which may not meet the definition of content which is harmful to children in isolation but may be harmful when targeted at children in a particular way. One example could be content discussing a mental health condition such as depression, where recommendations are made repeatedly or in an amplified manner through the use of algorithms. The nature of that content per se may not be inherently harmful to every child who encounters it, but, when aggregated, it may become harmful to a child who is sent it many times over. That, of course, must be addressed, and is covered by the Bill.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

Can the Minister assure us that he will take another look at this between Committee and Report? He has almost made the case for this wording to be taken out—he said that it is already covered by a whole number of different clauses in the Bill—but it is still here. There is still an exception which, if the Minister is correct, is highly misleading: it means that you have to go searching all over the Bill to find a way of attacking the algorithm, essentially, and the way that it amplifies, disseminates and so on. That is what we are trying to get to: how to address the very important issue not just of content but of the way that the algorithm operates in social media. This seems to be highly misleading, in the light of what the Minister said.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I do not think so, but I will certainly look at it again, and I am very happy to speak to the noble Lord as I do. My point is that it would not be workable or proportionate for a provider to prevent or protect all children from encountering every single instance of the sort of content that I have just outlined, which would be the effect of these amendments. I will happily discuss that with the noble Lord and others between now and Report.

Amendment 27, by the noble Lord, Lord Stevenson, seeks to add a duty to prevent children encountering targeted paid-for advertising. As he knows, the Bill has been designed to tackle harm facilitated through user-generated content. Some advertising, including paid-for posts by influencers, will therefore fall under the scope of the Bill. Companies will need to ensure that systems for targeting such advertising content to children, such as the use of algorithms, protect them from harmful material. Fully addressing the challenges of paid-for advertising is a wider task than is possible through the Bill alone. The Bill is designed to reduce harm on services which host user-generated content, whereas online advertising poses a different set of problems, with different actors. The Government are taking forward work in this area through the online advertising programme, which will consider the full range of actors and sector-appropriate solutions to those problems.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

I understand the Minister’s response, and I accept that there is a parallel stream of work that may well address this. However, we have been waiting for the report from the group that has been looking at that for some time. Rumours—which I never listen to—say that it has been ready for some time. Can the Minister give us a timescale?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I cannot give a firm timescale today but I will seek what further information I can provide in writing. I have not seen it yet, but I know that the work continues.

Amendments 28 and 82, in the name of the noble Lord, Lord Russell, seek to remove the size and capacity of a service provider as a relevant factor when determining what is proportionate for services in meeting their child safety duties. This provision is important to ensure that the requirements in the child safety duties are appropriately tailored to the size of the provider. The Bill regulates a large number of service providers, which range from some of the biggest companies in the world to small voluntary organisations. This provision recognises that what it is proportionate to require of providers at either end of that scale will be different.

Removing this provision would risk setting a lowest common denominator. For instance, a large multinational company could argue that it is required only to take the same steps to comply as a smaller provider.

Amendment 32A from the noble Lord, Lord Knight of Weymouth, would require services to have regard to the potential use of virtual private networks and similar tools to circumvent age-restriction measures. He raised the use of VPNs earlier in this Committee when we considered privacy and encryption. As outlined then, service providers are already required to think about how safety measures could be circumvented and take steps to prevent that. This is set out clearly in the children’s risk assessment and safety duties. Under the duty at Clause 10(6)(f), all services must consider the different ways in which the service is used and the impact of such use on the level of risk. The use of VPNs is one factor that could affect risk levels. Service providers must ensure that they are effectively mitigating and managing risks that they identify, as set out in Clause 11(2). The noble Lord is correct in his interpretation of the Bill vis-à-vis VPNs.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

Is this technically possible?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Technical possibility is a matter for the sector—

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

I am grateful to the noble Lord for engaging in dialogue while I am in a sedentary position, but I had better stand up. It is relevant to this Committee whether it is technically possible for providers to fulfil the duties we are setting out for them in statute in respect of people’s ability to use workarounds and evade the regulatory system. At some point, could he give us the department’s view on whether there are currently systems that could be used—we would not expect them to be prescribed—by platforms to fulfil the duties if people are using their services via a VPN?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

This is the trouble with looking at legislation that is technologically neutral and future-proofed and has to envisage risks and solutions changing in years to come. We want to impose duties that can technically be met, of course, but this is primarily a point for companies in the sector. We are happy to engage and provide further information, but it is inherently part of the challenge of identifying evolving risks.

The provision in Clause 11(16) addresses the noble Lord’s concerns about the use of VPNs in circumventing age-assurance or age-verification measures. For it to apply, providers would need to ensure that the measures they put in place are effective and that children cannot normally access their services. They would need to consider things such as how the use of VPNs affects the efficacy of age-assurance and age-verification measures. If children were routinely using VPNs to access their service, they would not be able to conclude that Clause 11(16) applies. I hope that sets out how this is covered in the Bill.

Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A, 122, 122ZA, 122ZB and 122ZC from the noble Lord, Lord Russell of Liverpool, seek to make the measures Ofcom sets out in codes of practice mandatory for all services. I should make it clear at the outset that companies must comply with the duties in the Bill. They are not optional and it is not a non-statutory regime; the duties are robust and binding. It is important that the binding legal duties on companies are decided by Parliament and set out in legislation, rather than delegated to a regulator.

Codes of practice provide clarity on how to comply with statutory duties, but should not supersede or replace them. This is true of codes in other areas, including the age-appropriate design code, which is not directly enforceable. Following up on the point from my noble friend Lady Harding of Winscombe, neither the age-appropriate design code nor the SEND code is directly enforceable. The Information Commissioner’s Office or bodies listed in the Children and Families Act must take the respective codes into account when considering whether a service has complied with its obligations as set out in law.

As with these codes, what will be directly enforceable in this Bill are the statutory duties by which all sites in scope of the legislation will need to abide. We have made it clear in the Bill that compliance with the codes will be taken as compliance with the duties. This will help small companies in particular. We must also recognise the diversity and innovative nature of this sector. Requiring compliance with prescriptive steps rather than outcomes may mean that companies do not use the most effective or efficient methods to protect children.

I reassure noble Lords that, if companies decide to take a different route to compliance, they will be required to document what their own measures are and how they amount to compliance. This will ensure that Ofcom has oversight of how companies comply with their duties. If the alternative steps that providers have taken are insufficient, they could face enforcement action. We expect Ofcom to take a particularly robust approach to companies which fail to protect their child users.

My noble friend Lord Vaizey touched on the age-appropriate design code in his remarks—

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

My noble friend the Minister did not address the concern I set out that the Bill’s approach will overburden Ofcom. If Ofcom has to review the suitability of each set of alternative measures, we will create an even bigger monster than we first thought.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I do not think that it will. We have provided further resource for Ofcom to take on the work that this Bill will give it; it has been very happy to engage with noble Lords to talk through how it intends to go about that work and, I am sure, would be happy to follow up on that point with my noble friend to offer her some reassurance.

Responding to the point from my noble friend Lord Vaizey, the Bill is part of the UK’s overall digital regulatory landscape, which will deliver protections for children alongside the data protection requirements for children set out in the Information Commissioner’s age-appropriate design code. Ofcom has strong existing relationships with other bodies in the regulatory sphere, including through the Digital Regulation Co-operation Forum. The Information Commissioner has been added to this Bill as a statutory consultee for Ofcom’s draft codes of practice and relevant pieces of guidance, formally to provide for the ICO’s input into its areas of expertise, especially relating to privacy.

Amendment 138 from the noble Lord, Lord Russell of Liverpool, would amend the criteria for non-designated content which is harmful to children to bring into scope content whose risk of harm derives from its potential financial impact. The Bill already requires platforms to take measures to protect all users, including children, from financial crime online. All companies in scope of the Bill will need to design and operate their services to reduce the risk of users encountering content amounting to a fraud offence, as set out in the list of priority offences in Schedule 7. This amendment would expand the scope of the Bill to include broader commercial harms. These are dealt with by a separate legal framework, including the Consumer Protection from Unfair Trading Regulations. This amendment therefore risks creating regulatory overlap, which would cause confusion for business while not providing additional protections to consumers and internet users.

Amendment 261 in the name of the right reverend Prelate the Bishop of Oxford seeks to modify the existing requirements for the Secretary of State’s review into the effectiveness of the regulatory framework. The purpose of the amendment is to ensure that all aspects of a regulated service are taken into account when considering the risk of harm to users and not just content.

As we have discussed already, the Bill defines “content” very broadly and companies must look at every aspect of how their service facilitates harm associated with the spread of content. Furthermore, the review clause makes explicit reference to the systems and processes which regulated services use, so the review can already cover harm associated with, for example, the design of services.

--- Later in debate ---
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, we too support the spirit of these amendments very much and pay tribute to the noble Lord, Lord Russell, for tabling them.

In many ways, I do not need to say very much. I think the noble Baroness, Lady Kidron, made a really powerful case, alongside the way the group was introduced in respect of the importance of these things. We do want the positivity that the noble Baroness, Lady Harding, talked about in respect of the potential and opportunity of technology for young people. We want them to have the right to freedom of expression, privacy and reliable information, and to be protected from exploitation by the media. Those happen to be direct quotes from the UN Convention on the Rights of the Child, as some of the rights they would enjoy. Amendments 30 and 105, which the noble Lord, Lord Clement-Jones, tabled—I attached my name to Amendment 30—are very much in that spirit of trying to promote well-being and trying to say that there is something positive that we want to see here.

In particular, I would like to see that in respect of Ofcom. Amendment 187 is, in some ways, the more significant amendment and the one I most want the Minister to reflect on. That is the one that applies to Ofcom: that it should have reference to the UN Convention on the Rights of the Child. I think even the noble Lord, Lord Weir, could possibly agree. I understand his thoughtful comments around whether or not it is right to encumber business with adherence to the UN convention, but Ofcom is a public body in how it carries out its duties as a regulator. There are choices for regulation. Regulation can just be about minimum standards, but it can also be about promoting something better. What we are seeking here in trying to have reference to the UN convention is for Ofcom to regulate for something more positive and better, as well as police minimum standards. On that basis, we support the amendments.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, I will start in the optimistic spirit of the debate we have just had. There are many benefits to young people from the internet: social, educational and many other ways that noble Lords have mentioned today. That is why the Government’s top priority for this legislation has always been to protect children and to ensure that they can enjoy those benefits by going online safely.

Once again, I find myself sympathetic to these amendments, but in a position of seeking to reassure your Lordships that the Bill already delivers on their objectives. Amendments 25, 78, 187 and 196 seek to add references to the United Nations Convention on the Rights of the Child and general comment 25 on children’s rights in relation to the digital environment to the duties on providers and Ofcom in the Bill.

As I have said many times before, children’s rights are at the heart of this legislation, even if the phrase itself is not mentioned in terms. The Bill already reflects the principles of the UN convention and the general comment. Clause 207, for instance, is clear that a “child” means a person under the age of 18, which is in line with the convention. All providers in scope of the Bill need to take robust steps to protect users, including children, from illegal content or activity on their services and to protect children from content which is harmful to them. They will need to ensure that children have a safe, age-appropriate experience on services designed for them.

Both Ofcom and service providers will also have duties in relation to users’ rights to freedom of expression and privacy. The safety objectives will require Ofcom to ensure that services protect children to a higher standard than adults, while also making sure that these services account for the different needs of children at different ages, among other things. Ofcom must also consult bodies with expertise in equality and human rights, including those representing the interests of children, for instance the Children’s Commissioner. While the Government fully support the UN convention and its continued implementation in the UK, it would not be appropriate to place obligations on regulated services to uphold an international treaty between state parties. We agree with the reservations that were expressed by the noble Lord, Lord Weir of Ballyholme, in his speech, and his noble friend Lady Foster.

The convention’s implementation is a matter for the Government, not for private businesses or voluntary organisations. Similarly, the general comment acts as guidance for state parties and it would not be appropriate to refer to that in relation to private entities. The general comment is not binding and it is for individual states to determine how to implement the convention. I hope that the noble Lord, Lord Russell, will feel reassured that children’s rights are baked into the Bill in more ways than a first glance may suggest, and that he will be content to withdraw his amendment.

The noble Lord, Lord Clement-Jones, in his Amendments 30 and 105, seeks to require platforms and Ofcom to consider a service’s benefits to children’s rights and well-being when considering what is proportionate to fulfil the child safety duties of the Bill. They also add children’s rights and well-being to the online safety objectives for user-to-user services. The Bill as drafted is focused on reducing the risk of harm to children precisely so that they can better enjoy the many benefits of being online. It already requires companies to take a risk-based and proportionate approach to delivering the child safety duties. Providers will need to address only content that poses a risk of harm to children, not that which is beneficial or neutral. The Bill does not require providers to exclude children or restrict access to content or services that may be beneficial for them.

Children’s rights and well-being are already a central feature of the existing safety objectives for user-to-user services in Schedule 4 to the Bill. These require Ofcom to ensure that services protect children to a higher standard than adults, while making sure that these services account for the different needs of children at different ages, among other things. On this basis, while I am sympathetic to the aims of the amendments the noble Lord has brought forward, I respectfully say that I do not think they are needed.

More pertinently, Amendment 30 could have unintended consequences. By introducing a broad balancing exercise between the harms and benefits that children may experience online, it would make it more difficult for Ofcom to follow up instances of non-compliance. For example, service providers could take less effective safety measures to protect children, arguing that, as their service is broadly beneficial to children’s well-being or rights, the extent to which they need to protect children from harm is reduced. This could mean that children are more exposed to more harmful content, which would reduce the benefits of going online. I hope that this reassures the noble Lord, Lord Russell, of the work the Bill does in the areas he has highlighted, and that it explains why I cannot accept his amendments. I invite him to withdraw Amendment 25.

Lord Russell of Liverpool Portrait Lord Russell of Liverpool (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I thank all noble Lords for taking part in this discussion. I thank the noble Lord, Lord Weir, although I would say to him that his third point—that, in his experience, the UNCRC is open to different interpretations by different departments—is my experience of normal government. Name me something that has not been interpreted differently by different departments, as it suits them.

--- Later in debate ---
Moved by
27A: Clause 11, page 11, line 19, at end insert—“(10A) A duty to summarise in the terms of service the findings of the most recent children’s risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to children).”
Member’s explanatory statement
This amendment requires providers of Category 1 services to summarise (in their terms of service) the findings of their latest children’s risk assessment. The limitation to Category 1 services is achieved by an amendment in the name of the Minister to clause 6.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

Yes; the noble Baroness is right. She has pointed out in other discussions I have been party to that, for example, gaming technology that looks at the movement of the player can quite accurately work out from their musculoskeletal behaviour, I assume, the age of the gamer. So there are alternative methods. Our challenge is to ensure that if they are to be used, we will get the equivalent of age verification or better. I now hand over to the Minister.

Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - -

My Lords, I think those last two comments were what are known in court as leading questions.

As the noble Baroness, Lady Ritchie of Downpatrick, said herself, some of the ground covered in this short debate was covered in previous groups, and I am conscious that we have a later grouping where we will cover it again, including some of the points that were made just now. I therefore hope that noble Lords will understand if I restrict myself at this point to Amendments 29, 83 and 103, tabled by the noble Baroness, Lady Ritchie.

These amendments seek to mandate age verification for pornographic content on a user-to-user or search service, regardless of the size and capacity of a service provider. The amendments also seek to remove the requirement on Ofcom to have regard to proportionality and technical feasibility when setting out measures for providers on pornographic content in codes of practice. While keeping children safe online is the top priority for the Online Safety Bill, the principle of proportionate, risk-based regulation is also fundamental to the Bill’s framework. It is the Government’s considered opinion that the Bill as drafted already strikes the correct balance between these two.

The provisions in the Bill on proportionality are important to ensure that the requirements in the child-safety duties are tailored to the size and capacity of providers. It is also essential that measures in codes of practice are technically feasible. This will ensure that the regulatory framework as a whole is workable for service providers and enforceable by Ofcom. I reassure your Lordships that the smaller providers or providers with less capacity are still required to meet the child safety duties where their services pose a risk to children. They will need to put in place sufficiently stringent systems and processes that reflect the level of risk on their services, and will need to make sure that these systems and processes achieve the required outcomes of the child safety duty. Wherever in the Bill they are regulated, companies will need to take steps to ensure that they cannot offer pornographic content online to those who should not see it. Ofcom will set out in its code of practice the steps that companies in the scope of Part 3 can take to comply with their duties under the Bill, and will take a robust approach to sites that pose the greatest risk of harm to children, including sites hosting online pornography.

The passage of the Bill should be taken as a clear message to providers that they need to begin preparing for regulation now—indeed, many are. Responsible providers should already be factoring in regulatory compliance as part of their business costs. Ofcom will continue to work with providers to ensure that the transition to the new regulatory framework will be as smooth as possible.

The Government expect companies to use age-verification technologies to prevent children accessing services that pose the highest risk of harm to children, such as online pornography. The Bill will not mandate that companies use specific technologies to comply with new duties because, as noble Lords have heard me say before, what is most effective in preventing children accessing pornography today might not be equally effective in future. In addition, age verification might not always be the most appropriate or effective approach for user-to-user companies to comply with their duties. For instance, if a user-to-user service, such as a particular social medium, does not allow pornography under its terms of service, measures such as strengthening content moderation and user reporting would be more appropriate and effective for protecting children than age verification. This would allow content to be better detected and taken down, instead of restricting children from seeing content which is not allowed on the service in the first place. Companies may also use another approach if it is proportionate to the findings of the child safety risk assessment and a provider’s size and capacity. This is an important element to ensure that the regulatory framework remains risk-based and proportionate.

In addition, the amendments in the name of the noble Baroness, Lady Ritchie, risk inadvertently shutting children out of large swathes of the internet that are entirely appropriate for them to access. This is because it is impossible totally to eliminate the risk that a single piece of pornography or pornographic material might momentarily appear on a site, even if that site prohibits it and has effective systems in place to prevent it appearing. Her amendments would have the effect of essentially requiring every service to block children through the use of age verification.

Those are the reasons why the amendments before us are not ones that we can accept. Mindful of the fact that we will return to these issues in a future group, I invite the noble Baroness to withdraw her amendment.

Baroness Ritchie of Downpatrick Portrait Baroness Ritchie of Downpatrick (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I thank all noble Lords who have participated in this wide-ranging debate, in which various issues have been raised.

The noble Baroness, Lady Benjamin, made the good point that there needs to be a level playing field between Parts 3 and 5, which I originally raised and which other noble Lords raised on Tuesday of last week. We keep coming back to this point, so I hope that the Minister will take note of it on further reflection before we reach Report. Pornography needs to be regulated on a consistent basis across the Bill.

--- Later in debate ---
Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I wonder whether I can make a brief intervention—I am sorry to do so after the noble Lord, Lord Clement-Jones, but I want to intervene before my noble friend the Minister stands up, unless the Labour Benches are about to speak.

I have been pondering this debate and have had a couple of thoughts. Listening to the noble Lord, Lord Clement-Jones, I am reminded of something which was always very much a guiding light for me when I chaired the Charity Commission, and therefore working in a regulatory space: regulation is never an end in itself; you regulate for a reason.

I was struck by the first debate we had on day one of Committee about the purpose of the Bill. If noble Lords recall, I said in that debate that, for me, the Bill at its heart was about enhancing the accountability of the platforms and the social media businesses. I felt that the contribution from my noble friend Lady Harding was incredibly important. What we are trying to do here is to use enforcement to drive culture change, and to force the organisations, not to never think about profit, but to move away from profit-making to focusing on child safety in the way in which they go about their work. That is really important when we start to consider the whole issue of enforcement.

It struck me at the start of this discussion that we have to be clear what our general approach and mindset is about this part of our economy that we are seeking to regulate. We have to be clear about the crimes we think are being committed or the offences that need to be dealt with. We need to make sure that Ofcom has the powers to tackle those offences and that it can do so in a way that meets Parliament’s and the public’s expectations of us having legislated to make things better.

I am really asking my noble friend the Minister, when he comes to respond on this, to give us a sense of clarity on the whole question of enforcement. At the moment, it is insufficiently clear. Even if we do not get that level of clarity today, when we come back later on and look at enforcement, it is really important that we know what we are trying to tackle here.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, I will endeavour to give that clarity, but it may be clearer still if I flesh some points out in writing in addition to what I say now.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
I hope that the Minister will accept that a number of these amendments are particularly helpful in strengthening the Bill, and that he will find a way to accept that form of strengthening.
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - -

I am very grateful to the noble Lords who have spoken on the amendments in this group, both this afternoon and last Tuesday evening. As this is a continuation of that debate, I think my noble friend Lord Moylan is technically correct still to wish the noble Baroness, Lady Kidron, a happy birthday, at least in procedural terms.

We have had a very valuable debate over both days on the Bill’s approach to holding platforms accountable to their users. Amendments 33B, 41A, 43ZA, 138A and 194A in the names of the noble Lords, Lord Lipsey and Lord McNally, and Amendment 154 in the name of the noble Lord, Lord Stevenson of Balmacara, seek to bring back the concept of legal but harmful content and related adult risk assessments. They reintroduce obligations for companies to consider the risk of harm associated with legal content accessed by adults. As noble Lords have noted, the provisions in the Bill to this effect were removed in another place, after careful consideration, to protect freedom of expression online. In particular, the Government listened to concerns that the previous legal but harmful provisions could create incentives for companies to remove legal content from their services.

In place of adult risk assessments, we introduced new duties on category 1 services to enable users themselves to understand how these platforms treat different types of content, as set out in Clauses 64 and 65. In particular, this will allow Ofcom to hold them to account when they do not follow through on their promises regarding content they say that they prohibit or to which they say that they restrict access. Major platforms already prohibit much of the content listed in Clause 12, but these terms of service are often opaque and not consistently enforced. The Bill will address and change that.

I would also like to respond to concerns raised through Amendments 41A and 43ZA, which seek to ensure that the user empowerment categories cover the most harmful categories of content to adults. I reassure noble Lords that the user empowerment list reflects input from a wide range of interested parties about the areas of greatest concern to users. Platforms already have strong commercial incentives to tackle harmful content. The major technology companies already prohibit most types of harmful and abusive content. It is clear that most users do not want to see that sort of content and most advertisers do not want their products advertised alongside it. Clause 12 sets out that providers must offer user empowerment tools with a specified list of content to the extent that it is proportionate to do so. This will be based on the size or capacity of the service as well as the likelihood that adult users will encounter the listed content. Providers will therefore need internally to assess the likelihood that users will encounter the content. If Ofcom disagrees with the assessment that a provider has made, it will have the ability to request information from providers for the purpose of assessing compliance.

Amendments 44 and 158, tabled by the right reverend Prelate the Bishop of Oxford, seek to place new duties on providers of category 1 services to produce an assessment of their compliance with the transparency, accountability, freedom of expression and user empowerment duties as set out in Clauses 12, 64 and 65 and to share their assessments with Ofcom. I am sympathetic to the aim of ensuring that Ofcom can effectively assess companies’ compliance with these duties. But these amendments would enable providers to mark their own homework when it comes to their compliance with the duties in question. The Bill has been designed to ensure that Ofcom has responsibility for assessing compliance and that it can obtain sufficient information from all regulated services to make judgments about compliance with their duties. The noble Baroness, Lady Kidron, asked about this—and I think the noble Lord, Lord Clement-Jones, is about to.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I hope the Minister will forgive me for interrupting, but would it not be much easier for Ofcom to assess compliance if a risk assessment had been carried out?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I will come on to say a bit more about how Ofcom goes about that work.

The Bill will ensure that providers have the information they need to understand whether they are in compliance with their duties under the Bill. Ofcom will set out how providers can comply in codes of practice and guidance that it publishes. That information will help providers to comply, although they can take alternative action if they wish to do so.

The right reverend Prelate’s amendments also seek to provide greater transparency to Ofcom. The Bill’s existing duties already account for this. Indeed, the transparency reporting duties set out in Schedule 8 already enable Ofcom to require category 1, 2A and 2B services to publish annual transparency reports with relevant information, including about the effectiveness of the user empowerment tools, as well as detailed information about any content that platforms prohibit or restrict, and the application of their terms of service.

Amendments 159, 160 and 218, tabled by the noble Lord, Lord Stevenson, seek to require user-to-user services to create and abide by minimum terms of service recommended by Ofcom. The Bill already sets detailed and binding requirements on companies to achieve certain outcomes. Ofcom will set out more detail in codes of practice about the steps providers can take to comply with their safety duties. Platforms’ terms of service will need to provide information to users about how they are protecting users from illegal content, and children from harmful content.

These duties, and Ofcom’s codes of practice, ensure that providers take action to protect users from illegal content and content that is harmful to children. As such, an additional duty to have adequate and appropriate terms of service, as envisaged in the amendments, is not necessary and may undermine the illegal and child safety duties.

I have previously set out why we do not agree with requiring platforms to set terms of service for legal content. In addition, it would be inappropriate to delegate this much power to Ofcom, which would in effect be able to decide what legal content adult users can and cannot see.

Amendment 155, tabled by my noble friend Lord Moylan, seeks to clarify whether and how the Bill makes the terms of service of foreign-run platforms enforceable by Ofcom. Platforms’ duties under Clause 65 apply only to the design, operation and use of the service in the United Kingdom and to UK users, as set out in Clause 65(11). Parts or versions of the service which are used in foreign jurisdictions—

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

On that, in an earlier reply the Minister explained that platforms already remove harmful content because it is harmful and because advertisers and users do not like it, but could he tell me what definition of “harmful” he thinks he is using? Different companies will presumably have a different interpretation of “harmful”. How will that work? It would mean that UK law will require the removal of legal speech based on a definition of harmful speech designed by who—will it be Silicon Valley executives? This is the problem: UK law is being used to implement the removal of content based on decisions that are not part of UK law but with implications for UK citizens who are doing nothing unlawful.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

The noble Baroness’s point gets to the heart of the debate that we have had. I talked earlier about the commercial incentive that there is for companies to take action against harmful content that is legal which users do not want to see or advertisers do not want their products to be advertised alongside, but there is also a commercial incentive to ensure that they are upholding free speech and that there are platforms on which people can interact in a less popular manner, where advertisers that want to advertise products legally alongside that are able to do so. As with anything that involves the market, the majority has a louder voice, but there is room for innovation for companies to provide products that cater to minority tastes within the law.

Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

My Lords, my noble friend has explained clearly how terms of service would normally work, which is that, as I said myself, a business might write its own terms of service to its own advantage but it cannot do so too egregiously or it will lose customers, and businesses may aim themselves at different customers. All this is part of normal commercial life, and that is understood. What my noble friend has not really addressed is the question of why uniquely and specifically in this case, especially given the egregious history of censorship by Silicon Valley, he has chosen to put that into statute rather than leave it as a commercial arrangement, and to make it enforceable by Ofcom. For example, when my right honourable friend David Davis was removed from YouTube for his remarks about Covid passes, it would have been Ofcom’s obligation not to vindicate his right to free speech but to cheer on YouTube and say how well it had done for its terms of service.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Our right honourable friend’s content was reuploaded. This makes the point that the problem at the moment is the opacity of these terms and conditions; what platforms say they do and what they do does not always align. The Bill makes sure that users can hold them to account for the terms of service that they publish, so that people can know what to expect on platforms and have some form of redress when their experience does not match their expectations.

I was coming on to say a bit more about that after making some points about foreign jurisdictions and my noble friend’s Amendment 155. As I say, parts or versions of the service that are used in foreign jurisdictions but not in the UK are not covered by the duties in Clause 65. As such, the Bill does not require a provider to have systems and processes designed to enforce any terms of service not applicable in the UK.

In addition, the duties do not give powers to Ofcom to enforce a provider’s terms of service directly. Ofcom’s role will be focused on ensuring that platforms have systems and processes in place to enforce their own terms of service consistently rather than assessing individual pieces of content.

Requiring providers to set terms of service for specific types of content suggests that the Government view that type of content as harmful or risky. That would encourage providers to prohibit such content, which of course would have a negative impact on freedom of expression, which I am sure is not what my noble friend wants to see. Freedom of expression is essential to a democratic society. Throughout the passage of the Bill, the Government have always committed to ensuring that people can speak freely online. We are not in the business of indirectly telling companies what legal content they can and cannot allow online. Instead, the approach that we have taken will ensure that platforms are transparent and accountable to their users about what they will and will not allow on their services.

Clause 65 recognises that companies, as private entities, have the right to remove content that is legal from their services if they choose to do so. To prevent them doing so, by requiring them to balance this against other priorities, would have perverse consequences for their freedom of action and expression. It is right that people should know what to expect on platforms and that they are able to hold platforms to account when that does not happen. On that basis, I invite the noble Lords who have amendments in this group not to press them.

Lord McNally Portrait Lord McNally (LD)
- View Speech - Hansard - - - Excerpts

My Lords, in his opening remarks, the Minister referred to the fact that this debate began last Tuesday. Well, it did, in that I made a 10-minute opening speech and the noble Baroness, Lady Stowell, rather elegantly hopped out of this group of amendments; perhaps she saw what was coming.

How that made me feel is perhaps best summed up by what the noble Earl, Lord Howe, said earlier when he was justifying the business for tomorrow. He said that adjournments were never satisfactory. In that spirit, I wrote to the Leader of the House, expressing the grumbles I made in my opening remarks. He has written back in a very constructive and thoughtful way. I will not delay the Committee any longer, other than to say that I hope the Leader of the House would agree to make his reply available for other Members to read. It says some interesting things about how we manage business. It sounds like a small matter but if what happened on Tuesday had happened in other circumstances in the other place, business would probably have been delayed for at least an hour while the usual suspects picked holes in it. If the usual channels would look at this, we could avoid some car crashes in future.

I am pleased that this group of amendments has elicited such an interesting debate, with fire coming from all sides. In introducing the debate, I said that probably the only real advice I could give the Committee came from my experience of being on the pre-legislative scrutiny committee in 2003. That showed just how little we were prepared for the tsunami of new technology that was about to engulf us. My one pleasure was that we were part of forming Ofcom. I am pleased that the chairman of Ofcom, the noble Lord, Lord Grade, has assiduously sat through our debates. I suspect he is thinking that he had better hire some more lawyers.

We are trying to get this right. I have no doubt that all sides of the House want to get this legislation through in good shape and for it to play an important role. I am sure that the noble Lord, Lord Grade, never imagined that he would become a state regulator in the kind of ominous way in which the noble Baroness, Lady Fox, said it. Ofcom has done a good job and will do so in future.

There is a problem of getting definitions right. When I was at the Ministry of Justice, I once had to entertain a very distinguished American lawyer. As I usually did, I explained that I was not a lawyer. He looked at me and said, “Then I will speak very slowly”. There is a danger, particularly in this part of the Bill, of wandering into a kind of lawyer-fest. It is important that we are precise about what powers we are giving to whom. Just to chill the Minister’s soul, I remember being warned as well about Pepper v Hart. What he says at the Dispatch Box will be used to interpret what Parliament meant when it gave this or that power.

The debate we have had thus far has been fully justified in sending a few warning signals to the Minister that it is perhaps not quite right yet. It needs further work. There is a lot of good will on all sides of the House to get it right. For the moment, I beg leave to withdraw my amendment.

--- Later in debate ---
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

As ever, the noble Baroness is an important voice in bursting our bubble in the Chamber. I continue to respect her for that. It will not be perfect; there is no perfect answer to all this. I am siding with safety and caution rather than a bit of a free-for-all. Sometimes there might be overcaution and aspects of debate where the platforms, the regulator, the media, and discussion and debate in this Chamber would say, “The toggles have got it wrong”, but we just have to make a judgment about which side we are on. That is what I am looking forward to hearing from the Minister.

These amendments are supported on all sides and by a long list of organisations, as listed by the noble Baroness, Lady Morgan, and the noble Lord, Lord Clement-Jones. The Minister has not conceded very much at all so far to this Committee. We have heard compelling speeches, such as those from the noble Baroness, Lady Parminter, that have reinforced my sense that he needs to give in on this when we come to Report.

I will also speak to my Amendment 38A. I pay tribute to John Penrose MP, who was mentioned by the noble Baroness, Lady Harding, and his work in raising concerns about misinformation and in stimulating discussion outside the Chambers among parliamentarians and others. Following discussions with him and others in the other place, I propose that users of social media should have the option to filter out content the provenance of which cannot be authenticated.

As we know, social media platforms are often awash with content that is unverified, misleading or downright false. This can be particularly problematic when it comes to sensitive or controversial topics such as elections, health or public safety. In these instances, it can be difficult for users to know whether the information presented to them is accurate. Many noble Lords will be familiar with the deep-fake photograph of the Pope in a white puffa jacket that recently went viral, or the use of imagery for propaganda purposes following the Russian invasion of Ukraine.

The Content Authenticity Initiative has created an open industry standard for content authenticity and provenance. Right now, tools such as Adobe Photoshop allow users to turn on content credentials to securely attach provenance data to images and any edits then made to those images. That technology has now been adopted by camera manufacturers such as Leica and Nikon, so the technology is there to do some of this to help give us some reassurance.

Amendment 38A would allow users to filter out unverified content and is designed to flag posts or articles that do not come from a reliable source or have not been independently verified by a reputable third party. Users could then choose to ignore or filter out such content, ensuring that they are exposed only to information that has been vetted and verified. This would not only help users to make more informed decisions but help to combat the spread of false information on social media platforms. By giving users the power to filter out unverified content, we can help to ensure that social media platforms are not used to spread harmful disinformation or misinformation.

Amendments 42 and 45, in the name of my noble friend Lord Stevenson, are good and straightforward improvements to the Bill, suggesting that Clause 12(11) should include all the protected characteristics in the Equality Act 2010. I listened closely to the argument of the noble Baroness, Lady Buscombe, about listing the various characteristics, but I think it makes much more sense to do what my noble friend is suggesting and just reference the Equality Act. If we chose separately to amend the Equality Act to change protected characteristics, that change would then flow through into this Bill. That seems to be a much more sensible and pragmatic way to proceed. Similarly, in Amendment 45 my noble friend is suggesting that consideration should be given by the Secretary of State as to whether any of these tools should also be available to child users, and we have heard good arguments in favour of that when it comes to dealing with the cliff-edge effect, as described by the noble Baroness, Lady Kidron.

Lastly—and, perhaps the Committee will forgive me, slightly bizarrely—Amendment 283ZA, in the name of the noble Baroness, Lady Fox, to Clause 170(7) in Part 11 invites us to consider, among other things, whether a bot can act reasonably. Given that self-coding bots are now technically possible, I would be fascinated to hear the Minister’s view on whether an autonomous bot can have reason. I asked ChatGPT this question. For the help of the Committee, it replied:

“As an artificial intelligence language model, I do not have consciousness or emotions, and therefore, I do not have a capacity for ‘reason’ in the way that humans understand it. However, I am programmed to use advanced algorithms that allow me to process and analyze large amounts of data, recognize patterns, and provide reasoned responses based on the information available to me. This allows me to simulate a form of reasoning, but it is important to note that it is not the same as human reasoning, as I do not have subjective experiences or personal biases. Ultimately, my abilities are limited to the algorithms and data that have been programmed into my system, and I cannot generate my own subjective experiences or judgments.”


That is the view of the algorithm as to whether or not bots can have reason. I look forward to the Minister’s response.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, the Government recognise the objectives of the amendments in this group: to strengthen protections for adults online. I hope noble Lords will agree that the Bill will indeed significantly improve the safety of all adult users, particularly those who are more vulnerable.

The user empowerment content features will not be the only measures in the Bill that will protect adults. They will act as a final layer of protection, coming after the duties on illegal content and the requirement on category 1 providers to uphold their terms of service. However, as the Clause 12 duties apply to legal content, we need to tread carefully and not inadvertently restrict free expression.

Amendments 34 and 35 in the name of my noble friend Lady Morgan of Cotes and Amendments 36 and 37 in the name of the noble Lord, Lord Clement-Jones, seek to require category 1 services to have their user empowerment content features in operation by default for adult users. The Government share concerns about users who experience disproportionate levels of abuse online or those who are more susceptible to suicide, self-harm or eating disorder content, but these amendments encroach on users’ rights in two ways.

First, the amendments intend to make the decision on behalf of users about whether to have these features turned on. That is aimed especially at those who might not otherwise choose to use those features. The Government do not consider it appropriate to take that choice away from adults, who must be allowed to decide for themselves what legal content they see online. That debate was distilled in the exchange just now between the noble Lord, Lord Knight, and the noble Baroness, Lady Fox, when the noble Lord said he would err on the side of caution, even overcaution, while he characterised the other side as a free-for-all. I might say that it was erring on the side of freedom. That is the debate that we are having, and should have, when looking at these parts of the Bill.

Secondly, the amendments would amount to a government requirement to limit adults’ access to legal content. That presents real concerns about freedom of expression, which the Government cannot accept.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

Does the Minister therefore think that the Government condone the current system, where we are inundated algorithmically with material that we do not want? Are the Government condoning that behaviour, in the way that he is saying they would condone a safety measure?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

We will come to talk about algorithms and their risks later on. There is an important balance to strike here that we have debated, rightly, in this group. I remind noble Lords that there are a range of measures that providers can put in place—

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

Because of the importance of that point in relation to what the Minister is about to say, we should be clear about this point: is he ruling out the ability to prioritise the needs and requirements of those who are effectively unable to take the decisions themselves in favour of a broader consideration of freedom of expression? It would be helpful for the future of this debate to be clear on that point.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

We will come in a moment to the provisions that are in the Bill to make sure that decisions can be taken by adults, including vulnerable adults, easily and clearly. If the noble Lord will allow, I will cover that point.

I was in the middle of reminding noble Lords that there are a range of measures that providers can put in place under these duties, some of which might have an impact on a user’s experience if they were required to be switched on by default. That may include, for example, restricting a user’s news feed to content from connected users, adding to the echo chamber and silos of social media, which I know many noble Lords would join me in decrying. We think it is right that that decision is for individual users to make.

The Bill sets out that the user empowerment content tools must be offered to all adult users and must be easy to access—to go to the point raised just now, as well as by my noble friend Lady Harding and the noble Baroness, Lady Burt, and, as noble Lords were right to remind us, pushed by the noble Baroness, Lady Campbell of Surbiton, with whom I am pleased to say I have been able to have discussions separately from this Committee.

Providers will also be required to have clear and accessible terms of service about what tools are offered on their service and how users might take advantage of them. Ofcom will be able to require category 1 services to report on user empowerment tools in use through transparency reports. Ofcom is also bound by the Communications Act 2003 and the public sector equality duty, so it will need to take into account the ways that people with certain characteristics, including people with disabilities, may be affected when performing its duties, such as writing the codes of practice for the user empowerment duties.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

I think the Minister is trying to answer the point raised by my noble friend about vulnerable adults. I am interested in the extent to which he is relying on the Equality Act duty on Ofcom then to impact the behaviour of the platforms that it is regulating in respect of how they are protecting vulnerable adults. My understanding is that the Equality Act duty will apply not to the platforms but only to Ofcom in the way that it regulates them. I am unclear how that is going to provide the protection that we want.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

That is right. Platforms are not in the public sector, so the public sector equality duty does not apply to them. However, that duty applies to Ofcom, taking into account the ways in which people with certain characteristics can be affected through the codes of practice and the user empowerment duties that it is enforcing. So it suffuses the thinking there, but the duty is on Ofcom as a public sector body.

We talk later in Clause 12(11) of some of the characteristics that are similar in approach to the protected characteristics in the Equality Act 2010. I will come to that again shortly in response to points made by noble Lords.

I want to say a bit about the idea of there being a cliff edge at the age of 18. This was raised by a number of noble Lords, including the noble Lord, Lord Griffiths, my noble friends Lady Morgan and Lady Harding and the noble Baroness, Lady Kidron. The Bill’s protections recognise that, in law, people become adults when they turn 18—but it is not right to say that there are no protections for young adults. As noble Lords know, the Bill will provide a triple shield of protection, of which the user empowerment duties are the final element.

The Bill already protects young adults from illegal content and content that is prohibited in terms and conditions. As we discussed in the last group, platforms have strong commercial incentives to prohibit content that the majority of their users do not want to see. Our terms of service duties will make sure that they are transparent about and accountable for how they treat this type of content.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, what distinguishes young adults from older adults in what the Minister is saying?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

In law, there is nothing. I am engaging with the point that there is no cliff edge. There are protections for people once they turn 18. People’s tastes and risk appetites may change over time, but there are protections in the Bill for people of all ages.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

Surely, this is precisely the point that the noble Baroness, Lady Kidron, was making. As soon as you reach 18, there is no graduation at all. There is no accounting for vulnerable adults.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

There is not this cliff edge which noble Lords have feared—that there are protections for children and then, at 18, a free for all. There are protections for adult users—young adults, older adults, adults of any age—through the means which I have just set out: namely, the triple shield and the illegal content provisions. I may have confused the noble Lord in my attempt to address the point. The protections are there.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

There is an element of circularity to what the Minister is saying. This is precisely why we are arguing for the default option. It allows this vulnerability to be taken account of.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

Perhaps it would help if the Minister wanted to just set out the difference for us. Clearly, this Committee has spent some time debating the protection for children, which has a higher bar than protection for adults. It is not possible to argue that there will be no difference at the age of 18, however effective the first two elements of the triple shield are. Maybe the Minister needs to think about coming at it from the point of view of a child becoming an adult, and talk us through what the difference will be.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Once somebody becomes an adult in law at the age of 18, they are protected through the triple shield in the Bill. The user empowerment duties are one element of this, along with the illegal content duties and the protection against content prohibited in terms and conditions and the redress through Ofcom.

The legislation delivers protection for adults in a way that preserves their choice. That is important. At the age of 18, you can choose to go into a bookshop and to encounter this content online if you want. It is not right for the Government to make decisions on behalf of adults about the legal content that they see. The Bill does not set a definition of a vulnerable adult because this would risk treating particular adults differently, or unfairly restricting their access to legal content or their ability to express themselves. There is no established basis on which to do that in relation to vulnerability.

Finally, we remain committed to introducing a new criminal offence to capture communications that intentionally encourage or assist serious self-harm, including eating disorders. This will provide another layer of protection on top of the regulatory framework for both adults and children.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

I understand all of that—I think—but that is not the regime being applied to children. It is really clear that children have a safer, better experience. The difference between those experiences suddenly happening on an 18th birthday is what we are concerned about.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

Before the Minister stands up—a new phrase—can he confirm that it is perfectly valid to have a choice to lift the user empowerment tool, just as it is to impose it? Choice would still be there if our amendments were accepted.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

It would be, but we fear the chilling effect of having the choice imposed on people. As the noble Baroness, Lady Fox, rightly put it, one does not know what one has not encountered until one has engaged with the idea. At the age of 18, people are given the choice to decide what they encounter online. They are given the tools to ensure that they do not encounter it if they do not wish to do so. As the noble Lord has heard me say many times, the strongest protections in the Bill are for children. We have been very clear that the Bill has extra protections for people under the age of 18, and it preserves choice and freedom of expression online for adult users—young and old adults.

My noble friend Lady Buscombe asked about the list in Clause 12(11). We will keep it under constant review and may consider updating it should compelling evidence emerge. As the list covers content that is legal and designed for adults, it is right that it should be updated by primary legislation after a period of parliamentary scrutiny.

Amendments 42 and 38A, tabled by the noble Lords, Lord Stevenson of Balmacara and Lord Knight of Weymouth, respectively, seek to change the scope of user empowerment content features. Amendment 38A seeks to expand the user empowerment content features to include the restriction of content the provenance of which cannot be authenticated. Amendment 42 would apply features to content that is abusive on the basis of characteristics protected under the Equality Act 2010.

The user empowerment content list reflects areas where there is the greatest need for users to be offered choice about reducing their exposure to types of content. While I am sympathetic to the intention behind the amendments, I fear they risk unintended consequences for users’ rights online. The Government’s approach recognises the importance of having clear, enforceable and technically feasible duties that do not infringe users’ rights to free expression. These amendments risk undermining this. For instance, Amendment 38A would require the authentication of the provenance of every piece of content present on a service. This could have severe implications for freedom of expression, given its all-encompassing scope. Companies may choose not to have anything at all.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

I will try to help the Minister. If the amendment has been poorly drafted, I apologise. It does not seek to require a platform to check the provenance of every piece of content, but content that is certified as having good provenance would have priority for me to be able to see it. In the Bill, I can see or not see verified users. In the same way, I could choose to see or not see verified content.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Thank you. I may be reading the noble Lord’s Amendment 38A excessively critically. I will look at it again. To try to reassure the noble Lord, the Bill already ensures that all services take steps to remove illegal manufactured or manipulated content when they become aware of it. Harmful and illegal misinformation and disinformation is covered in that way.

Amendment 42 would require providers to try to establish on a large scale what is a genuinely held belief that is more than an opinion. In response, I fear that providers would excessively apply the user empowerment features to manage that burden.

A number of noble Lords referred to the discrepancy between the list—

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

Several times in the Bill—but this is a clear example—the drafters have chosen to impose a different sequence of words from that which exists in statute. The obvious one here is the Equality Act, which we have touched on before. The noble Baroness, Lady Buscombe, made a number of serious points about that. Why have the Government chosen to list, separately and distinctively, the characteristics which we have also heard, through a different route, the regulator will be required to uphold in respect of the statute, while the companies will be looking to the text of the Bill, when enacted? Is that not just going to cause chaos?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

The discrepancy comes from the point we touched on earlier. Ofcom, as a public body, is subject to the public sector equality duty and therefore the list set out in the Equality Act 2010. The list at Clause 12(11) relates to content which is abusive, and is therefore for providers to look at. While the Equality Act has established an understanding of characteristics which should be given special protection in law, it is not necessarily desirable to transpose those across. They too are susceptible to the point made by my noble friend Lady Buscombe about lists set out in statute. If I remember rightly, the Equality Act was part of a wash-up at the end of that Parliament, and whether Parliament debated that Bill as thoroughly as it is debating this one is a moot point.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

The noble Lord made that point before, and I was going to pick him up on it. It really is not right to classify our legislation by whether it came through in a short or long period. We are spending an awfully long time on this but that is not going to make it any better. I was involved in the Equality Act, and I have the scars on my back to prove it. It is jolly good legislation and has stood the test of time. I do not think the point is answered properly by simply saying that this is a better way of doing it. The Minister said that Clause 12(11) was about abuse targets, but Clause 12(12) is about “hatred against people” and Clause 12(13) is a series of explanatory points. These provisions are all grist to the lawyers. They are not trying to clarify the way we operate this legislation, in my view, to the best benefit of those affected by it.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

The content which we have added to Clause 12 is a targeted approach. It reflects input from a wide range of interested parties, with whom we have discussed this, on the areas of content that users are most concerned about. The other protected characteristics that do not appear are, for instance, somebody’s marriage or civil partnership status or whether they are pregnant. We have focused on the areas where there is the greatest need for users to be offered the choice about reducing their exposure to types of content because of the abuse they may get from it. This recognises the importance of clear, enforceable and technically feasible duties. As I said a moment ago in relation to the point made by my noble friend Lady Buscombe, we will keep it under review but it is right that these provisions be debated at length—greater length than I think the Equality Bill was, but that was long before my time in your Lordships’ House, so I defer to the noble Lord’s experience and I am grateful that we are debating them thoroughly today.

I will move now, if I may, to discuss Amendments 43 and 283ZA, tabled by the noble Baroness, Lady Fox of Buckley. Amendment 43 aims to ensure that the user empowerment content features do not capture legitimate debate and discussion, specifically relating to the characteristics set out in subsections (11) and (12). Similarly, her Amendment 283ZA aims to ensure that category 1 services apply the features to content only when they have reasonable grounds to infer that it is user empowerment content.

With regard to both amendments, I can reassure the noble Baroness that upholding users’ rights to free expression is an integral principle of the Bill and it has been accounted for in drafting these duties. We have taken steps to ensure that legitimate online discussion or criticism will not be affected, and that companies make an appropriate judgment on the nature of the content in question. We have done this by setting high thresholds for inclusion in the content categories and through further clarification in the Bill’s Explanatory Notes, which I know she has consulted as well. However, the definition here deliberately sets a high threshold. By targeting only abuse and incitement to hatred, it will avoid capturing content which is merely challenging or robust discussion on controversial topics. Further clarity on definitions will be provided by Ofcom through regulatory guidance, on which it will be required to consult. That will sit alongside Ofcom’s code of practice, which will set out the steps companies can take to fulfil their duties.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

I appreciate the Minister’s comments but, as I have tried to indicate, incitement to hatred and abuse, despite people thinking they know what those words mean, is causing huge difficulty legally and in institutions throughout the land. Ofcom will have its work cut out, but it was entirely for that reason that I tabled this amendment. There needs to be an even higher threshold, and this needs to be carefully thought through.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

But as I think the noble Baroness understands from that reference, this is a definition already in statute, and with which Parliament and the courts are already engaged.

The Bill’s overarching freedom of expression duties also apply to Clause 12. Subsections (4) to (7) of Clause 18 stipulate that category 1 service providers are required to assess the impact on free expression from their safety policies, including the user empowerment features. This is in addition to the duties in Clause 18(2), which requires all user-to-user services to have particular regard to the importance of protecting freedom of expression when complying with their duties. The noble Baroness’s Amendment 283ZA would require category 1 providers to make judgments on user empowerment content to a similar standard required for illegal content. That would be disproportionate. Clause 170 already specifies how providers must make judgments about whether content is of a particular kind, and therefore in scope of the user empowerment duties. This includes making their judgment based on “all relevant information”. As such, the Bill already ensures that the user empowerment content features will be applied in a proportionate way that will not undermine free speech or hinder legitimate debate online.

Amendment 45, tabled by the noble Lord, Lord Stevenson of Balmacara, would require the Secretary of State to lay a Statement before Parliament outlining whether any of the user empowerment duties should be applied to children. I recognise the significant interest that noble Lords have in applying the Clause 12 duties to children. The Bill already places comprehensive requirements on Part 3 services which children are likely to access. This includes undertaking regular risk assessments of such services, protecting children from harmful content and activity, and putting in place age-appropriate protections. If there is a risk that children will encounter harm, such as self-harm content or through unknown or unverified users contacting them, service providers will need to put in place age-appropriate safety measures. Applying the user empowerment duties for child users runs counter to the Bill’s child safety objectives and may weaken the protections for children—for instance, by giving children an option to see content which is harmful to them or to engage with unknown, unverified users. While we recognise the concerns in this area, for the reasons I have set out, the Government do not agree with the need for this amendment.

I will resist the challenge of the noble Lord, Lord Knight, to talk about bots because I look forward to returning to that in discussing the amendments on future-proofing. With that, I invite noble Lords—

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I noted the points made about the way information is pushed and, in particular, the speech of the right reverend Prelate. Nothing in the Government’s response has really dealt with that concern. Can the Minister say a few words about not the content but the way in which users are enveloped? On the idea that companies always act because they have a commercial imperative not to expose users to harmful material, actually, they have a commercial imperative to spread material and engage users. It is well recorded that a lot of that is in fact harmful material. Can the Minister speak a little more about the features rather than the content?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

We will discuss this when it comes to the definition of content in the Bill, which covers features. I was struck by the speech by the right reverend Prelate about the difference between what people encounter online, and the analogy used by the noble Baroness, Lady Fox, about a bookshop. Social media is of a different scale and has different features which make that analogy not a clean or easy one. We will debate in other groups the accumulated threat of features such as algorithms, if the noble Baroness, Lady Kidron, will allow me to go into greater detail then, but I certainly take the points made by both the right reverend Prelate and the noble Baroness, Lady Fox, in their contributions.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I thank my noble friend very much indeed, and thank all noble Lords who have taken part. As the noble Lord, Lord Knight, said, this has been an important debate—they are all important, of course—but I think this has really got to the heart of parts of the Bill, parts of why it has been proposed in the first place, and some choices the Government made in their drafting and the changes they have made to the Bill. The right reverend Prelate reminded us, as Bishops always do, of the bigger picture, and he was quite right to do so. There is no equality of arms, as he put it, between most of us as internet users and these enormous companies that are changing, and have changed, our society. My noble friend was right—and I was going to pick up on it too—that the bookshop example given by the noble Baroness, Lady Fox, is, I am afraid, totally misguided. I love bookshops; the point is that I can choose to walk into one or not. If I do not walk into a bookshop, I do not see the books promoting some of the content we have discussed today. If they spill out on to the street where I trip over them, I cannot ignore them. This would be even harder if I were a vulnerable person, as we are going to discuss.

Noble Lords said that this is not a debate about content or freedom of expression, but that it is about features; I think that is right. However, it is a debate about choice, as the noble Lord, Lord Clement-Jones, said. I am grateful to each of those noble Lords who supported my amendments; we have had a good debate on both sets of amendments, which are similar. But as the noble Lord, Lord Griffiths, said, some of the content we are discussing, particularly in subsection (10), relating to suicide, pro-self-harm and pro-anorexia content, has literal life or death repercussions. To those noble Lords, and those outside this House, who seem to think we should not worry and should allow a total free-for-all, I say that we are doing so, in that the Government, in choosing not to adopt such amendments, are making an active choice. I am afraid the Government are condoning the serving up of insidious, deliberately harmful and deliberately dangerous content to our society, to younger people and vulnerable adults. The Minister and the Government would be better off if they said, “That is the choice that we have made”. I find it a really troubling choice because, as many noble Lords will know, I was involved in this Bill a number of years ago—there has been a certain turnover of Culture Secretaries in the last couple of years, and I was one of them. I find the Government’s choice troubling, but it has been made. As the noble Lord, Lord Knight, said, we are treating children differently from how we are treating adults. As drafted, there is a cliff edge at the age of 18. As a society, we should say that there are vulnerabilities among adults, as we do in many walks of life; and exactly as the noble Baroness, Lady Parminter, so powerfully said, there are times when we as a House, as a Parliament, as a society and as a state, should say we want to protect people. There is an offer here in both sets of amendments—I am not precious about which ones we choose—to have that protection.

I will of course withdraw the amendment today, because that is the convention of the House, but I ask my noble friend to reflect on the strength of feeling expressed by the House on this today; I think the Whip on the Bench will report as well. I am certain we will return to this on Report, probably with a unified set of amendments. In the algorithmic debate we will return to, the Government will have to explain, in words of one syllable, to those outside this House who worry about the vulnerable they work with or look after, about the choice that the Government have made in not offering protections when they could have done, in relation to these enormously powerful platforms and the insidious content they serve up repeatedly.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Baroness Buscombe Portrait Baroness Buscombe (Con)
- View Speech - Hansard - - - Excerpts

My Lords, before we continue this debate, I want to understand why we have changed the system so that we break part way through a group of amendments. I am sorry, but I think this is very poor. It is definitely a retrograde step. Why are we doing it? I have never experienced this before. I have sat here and waited for the amendment I have just spoken to. We have now had a break; it has broken the momentum of that group. It was even worse last week, because we broke for several days half way through the debate on an amendment. This is unheard of in my memory of 25 years in this House. Can my noble friend the Minister explain who made this decision, and how this has changed?

Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - -

I have not had as long in your Lordships’ House, but this is not unprecedented, in my experience. These decisions are taken by the usual channels; I will certainly feed that back through my noble friend. One of the difficulties, of course, is that because there are no speaking limits on legislation and we do not know how many people want to speak on each amendment, the length of each group can be variable, so I think this is for the easier arrangement of dinner-break business. Also, for the dietary planning of those of us who speak on every group, it is useful to have some certainty, but I do appreciate my noble friend’s point.

Baroness Buscombe Portrait Baroness Buscombe (Con)
- Hansard - - - Excerpts

Okay; I thank my noble friend for his response. However, I would just say that we never would have broken like that, before 7.30 pm. I will leave it at that, but I will have a word with the usual channels.

--- Later in debate ---
I have a question for the Minister in concluding my comments on this group. Could he confirm whether, under the current provisions, somebody’s full name would have to be publicly displayed for the verification duty to have been met, or could they use a pseudonym or a generic username publicly, with verification having taken place in a private and secure manner? I look forward to hearing from the Minister.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, the range of the amendments in this group indicates the importance of the Government’s approach to user verification and non-verified user duties. The way these duties have been designed seeks to strike a careful balance between empowering adults while safeguarding privacy and anonymity.

Amendments 38, 39, 139 and 140 have been tabled by my noble friend Lord Moylan. Amendments 38 and 39 seek to remove subsections (6) and (7) of the non-verified users’ duties. These place a duty on category 1 platforms to give adult users the option of preventing non-verified users interacting with their content, reducing the likelihood that a user sees content from non-verified users. I want to be clear that these duties do not require the removal of legal content from a service and do not impinge on free speech.

In addition, there are already existing duties in the Bill to safeguard legitimate online debate. For example, category 1 services will be required to assess the impact on free expression of their safety policies, including the impact of their user empowerment tools. Removing subsections (6) and (7) of Clause 12 would undermine the Bill’s protection for adult users of category 1 services, especially the most vulnerable. It would be entirely at the service provider’s discretion to offer users the ability to minimise their exposure to anonymous and abusive users, sometimes known as trolls. In addition, instead of mandating that users verify their identity, the Bill gives adults the choice. On that basis, I am confident that the Bill already achieves the effect of Amendment 139.

Amendment 140 seeks to reduce the amount of personal data transacted as part of the verification process. Under subsection (3) of Clause 57, however, providers will be required to explain in their terms of service how the verification process works, empowering users to make an informed choice about whether they wish to verify their identity. In addition, the Bill does not alter the UK’s existing data protection laws, which provide people with specific rights and protections in relation to the processing of their personal data. Ofcom’s guidance in this area will reflect existing laws, ensuring that users’ data is protected where personal data is processed. I hope my noble friend will therefore be reassured that these duties reaffirm the concept of choice and uphold the importance of protecting personal data.

While I am speaking to the questions raised by my noble friend, I turn to those he asked about Wikipedia. I have nothing further to add to the comments I made previously, not least that it is impossible to pre-empt the assessments that will be made of which services fall into which category. Of course, assessments will be made at the time, based on what the services do at the time of the assessment, so if he will forgive me, I will not be drawn on particular services.

To speak in more general terms, category 1 services are those with the largest reach and the greatest influence over public discourse. The Bill sets out a clear process for determining category 1 providers, based on thresholds set by the Secretary of State in secondary legislation following advice from Ofcom. That is to ensure that the process is objective and evidence based. To deliver this advice, Ofcom will undertake research into the relationship between how quickly, easily and widely user-generated content is disseminated by that service, the number of users and functionalities it has and other relevant characteristics and factors.

Lord Moylan Portrait Lord Moylan (Con)
- View Speech - Hansard - - - Excerpts

Will my noble friend at least confirm what he said previously: namely, that it is the Government’s view—or at least his view—that Wikipedia will not qualify as a category 1 service? Those were the words I heard him use at the Dispatch Box.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

That is my view, on the current state of play, but I cannot pre-empt an assessment made at a point in the future, particularly if services change. I stand by what I said previously, but I hope my noble friend will understand if I do not elaborate further on this, at the risk of undermining the reassurance I might have given him previously.

Amendments 40, 41, 141 and 303 have been tabled by the noble Lord, Lord Stevenson of Balmacara, and, as noble Lords have noted, I have added my name to Amendment 40. I am pleased to say that the Government are content to accept it. The noble Baroness, Lady Merron, should not minimise this, because it involves splitting an infinitive, which I am loath to do. If this is a statement of intent, I have let that one go, in the spirit of consensus. Amendment 40 amends Clause 12(7) to ensure that the tools which will allow adult users to filter out content from non-verified users are effective, and I am pleased to add my name to it.

Amendment 41 seeks to make it so that users can see whether another user is verified or not. I am afraid we are not minded to accept it. While I appreciate the intent, forcing users to show whether they are verified or not may have unintended consequences for those who are unable to verify themselves for perfectly legitimate reasons. This risks creating a two-tier system online. Users will still be able to set a preference to reduce their interaction with non-verified users without making this change.

Amendment 141 seeks to prescribe a set of principles and standards in Ofcom’s guidance on user verification. It is, however, important that Ofcom has discretion to determine, in consultation with relevant persons, which principles will have the best outcomes for users, while ensuring compliance with the duties. Further areas of the Bill also address several issues raised in this amendment. For example, all companies in scope will have a specific legal duty to have effective user reporting and redress mechanisms.

Existing laws also ensure that Ofcom’s guidance will reflect high standards. For example, it is a general duty of Ofcom under Section 3 of the Communications Act 2003 to further the interests of consumers, including by promoting competition. This amendment would, in parts, duplicate existing duties and undermine Ofcom’s independence to set standards on areas it deems relevant after consultation with expert groups.

Amendment 303 would add a definition of user identity verification. The definition it proposes would result in users having to display their real name online if they decide to verify themselves. In answer to the noble Baroness’s question, the current requirements do not specify that users must display their real name. The amendment would have potential safety implications for vulnerable users, for example victims and survivors of domestic abuse, whistleblowers and others of whom noble Lords have given examples in their contributions. The proposed definition would also create reliance on official forms of identification. That would be contrary to the existing approach in Clause 57 which specifically sets out that verification need not require such forms of documentation.

The noble Baroness, Lady Kidron, talked about paid-for verification schemes. The user identity verification provisions were brought in to ensure that adult users of the largest services can verify their identity if they so wish. These provisions are different from the blue tick schemes and others currently in place, which focus on a user’s status rather than verifying their identity. Clause 57 specifically sets out that providers of category 1 services will be required to offer all adult users the option to verify their identity. Ofcom will provide guidance for user identity verification to assist providers in complying with these duties. In doing so, it will consult groups that represent the interests of vulnerable adult users. In setting out recommendations about user verification, Ofcom must have particular regard to ensuring that providers of category 1 services offer users a form of identity verification that is likely to be available to vulnerable adult users. Ofcom will also be subject to the public sector equality duty, so it will need to take into account the ways in which people with certain characteristics may be affected when it performs this and all its duties under the Bill.

A narrow definition of identity verification could limit the range of measures that service providers might offer their users in the future. Under the current approach, Ofcom will produce and publish guidance on identity verification after consulting those with technical expertise and groups which represent the interests of vulnerable adult users.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I am sorry to interrupt the noble Lord. Is the answer to my question that the blue tick and the current Meta system will not be considered as verification under the terms of the Bill? Is that the implication of what he said?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Yes. The blue tick is certainly not identity verification. I will write to confirm on Meta, but they are separate and, as the example of blue ticks and Twitter shows, a changing feast. That is why I am talking in general terms about the approach, so as not to rely too much on examples that are changing even in the course of this Committee.

Government Amendment 43A stands in my name. This clarifies that “non-verified user” refers to users whether they are based in the UK or elsewhere. This ensures that, if a UK user decides he or she no longer wishes to interact with non-verified users, this will apply regardless of where they are based.

Finally, Amendment 106 in the name of my noble friend Lady Buscombe would make an addition to the online safety objectives for regulated user-to-user services. It would amend them to make it clear that one of the Bill’s objectives is to protect people from communications offences committed by anonymous users.

The Bill already imposes duties on services to tackle illegal content. Those duties apply across all areas of a service, including the way it is designed and operated. Platforms will be required to take measures—for instance, changing the design of functionalities, algorithms, and other features such as anonymity—to tackle illegal content.

Ofcom is also required to ensure that user-to-user services are designed and operated to protect people from harm, including with regard to functionalities and other features relating to the operation of their service. This will likely include the use of anonymous accounts to commit offences in the scope of the Bill. My noble friend’s amendment is therefore not needed. I hope she will be satisfied not to press it, along with the other noble Lords who have amendments in this group.

Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

My Lords, I would like to say that that was a rewarding and fulfilling debate in which everyone heard very much what they wanted to hear from my noble friend the Minister. I am afraid I cannot say that. I think it has been one of the most frustrating debates I have been involved in since I came into your Lordships’ House. However, it gave us an opportunity to admire the loftiness of manner that the noble Lord, Lord Clement-Jones, brought to dismissing my concerns about Wikipedia—that I was really just overreading the whole thing and that I should not be too bothered with words as they appear in the Bill because the noble Lord thinks that Wikipedia is rather a good thing and why is it not happy with that as a level of assurance?

I would like to think that the Minister had dealt with the matter in the way that I hoped he would, but I do think, if I may say so, that it is vaguely irresponsible to come to the Dispatch Box and say, “I don’t think Wikipedia will qualify as a category 1 service”, and then refuse to say whether it will or will not and take refuge in the process the Bill sets up, when at least one Member of the House of Lords, and possibly a second in the shape of the noble Lord, Lord Clement-Jones, would like to know the answer to the question. I see a Minister from the business department sitting on the Front Bench with my noble friend. This is a bit like throwing a hand grenade into a business headquarters, walking away and saying, “It was nothing to do with me”. You have to imagine what the position is like for the business.

We had a very important amendment from my noble friend Lady Buscombe. I think we all sympathise with the type of abuse that she is talking about—not only its personal effects but its deliberate business effects, the deliberate attempt to destroy businesses. I say only that my reading of her Amendment 106 is that it seeks to impose on Ofcom an objective to prevent harm, essentially, arising from offences under Clauses 160 and 162 of the Bill committed by unverified or anonymous users. Surely what she would want to say is that, irrespective of verification and anonymity, one would want action taken against this sort of deliberate attempt to undermine and destroy businesses. While I have every sympathy with her amendment, I am not entirely sure that it relates to the question of anonymity and verification.

Apart from that, there were in a sense two debates going on in parallel in our deliberations. One was to do with anonymity. On that question, I think the noble Lord, Lord Clement-Jones, put the matter very well: in the end, you have to come down on one side or the other. My personal view, with some reluctance, is that I have come down on the same side as the Government, the noble Lord and others. I think we should not ban anonymity because there are costs and risks to doing so, however satisfying it would be to be able to expose and sue some of the people who say terrible and untrue things about one another on social media.

The more important debate was not about anonymity as such but about verification. We had the following questions, which I am afraid I do not think were satisfactorily answered. What is verification? What does it mean? Can we define what verification is? Is it too expensive? Implicitly, should it be available for free? Is there an obligation for it to be free or do the paid-for services count, and what happens if they are so expensive that one cannot reasonably afford them? Is it real, in the sense that the verification processes devised by the various platforms genuinely provide verification? Various other questions like that came up but I do not think that any of them was answered.

I hate to say this as it sounds a little harsh about a Government whom I so ardently support, but the truth is that the triple shield, also referred to as a three-legged stool in our debate, was hastily cobbled together to make up for the absence of legal but harmful, but it is wonky; it is not working, it is full of holes and it is not fit for purpose. Whatever the Minister says today, there has to be a rethink before he comes back to discuss these matters at the next stage of the Bill. In the meantime, I beg leave to withdraw my amendment.

--- Later in debate ---
Moved by
43A: Clause 12, page 13, line 20, leave out from “who” to end of line 21 and insert “—
(a) is an individual, whether in the United Kingdom or outside it, and
(b) has not verified their identity to the provider of a service;”
Member’s explanatory statement
This amendment makes it clear that the term “non-verified user” in clause 12 (user empowerment duties) refers to individuals and includes users outside the United Kingdom.
--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

Lawyers—don’t you love them? How on earth are we supposed to unscramble that at this time of night? It was good to have my kinsman, the noble and learned Lord, Lord Hope, back in our debates. We were remarking only a few days ago that we had not seen enough lawyers in the House in these debates. One appears, and light appears. It is a marvellous experience.

I thank the Committee for listening to my earlier introductory remarks; I hope they helped to untangle some of the issues. The noble Lord, Lord Black, made it clear that the press are happy with what is in the current draft. There could be some changes, and we have heard a number of examples of ways in which one might either top or tail what there is.

There was one question that perhaps he could have come back on, and maybe he will, as I have raised it separately with the department before. I agree with a lot of what he said, but it applies to a lot more than just news publishers. Quality journalism more generally enhances and restores our faith in public services in so many ways. Why is it only the news? Is there a way in which we could broaden that? If there is not this time round, perhaps that is something we need to pick up later.

As the noble Lord, Lord Clement-Jones, has said, the noble Viscount, Lord Colville, made a very strong and clear case for trying to think again about what journalism does in the public realm and making sure that the Bill at least carries that forward, even if it does not deal with some of the issues that he raised.

We have had a number of other good contributions about how to capture some of the good ideas that were flying around in this debate and keep them in the foreground so that the Bill is enhanced. But I think it is time that the Minister gave us his answers.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

I join noble Lords who have sent good wishes for a speedy recovery to the noble Baroness, Lady Featherstone.

Amendments 46, 47 and 64, in the name of my noble friend Lady Stowell of Beeston, seek to require platforms to assess the risk of, and set terms for, content currently set out in Clause 12. Additionally, the amendments seek to place duties on services to assess risks to freedom of expression resulting from user empowerment tools. Category 1 platforms are already required to assess the impact on free expression of their safety policies, including user empowerment tools; to keep that assessment up to date; to publish it; and to demonstrate the positive steps they have taken in response to the impact assessment in a publicly available statement.

Amendments 48 and 100, in the name of the noble Lord, Lord Stevenson, seek to introduce a stand-alone duty on category 1 services to protect freedom of expression, with an accompanying code of practice. Amendments 49, 50, 53A, 61 and 156, in the name of the noble Baroness, Lady Fox, seek to amend the Bill’s Clause 17 and Clause 18 duties and clarify duties on content of democratic importance.

All in-scope services must already consider and implement safeguards for freedom of expression when fulfilling their duties. Category 1 services will need to be clear what content is acceptable on their services and how they will treat it, including when removing or restricting access to it, and that they will enforce the rules consistently. In setting these terms of service, they must adopt clear policies designed to protect journalistic and democratic content. That will ensure that the most important types of content benefit from additional protections while guarding against the arbitrary removal of any content. Users will be able to access effective appeal mechanisms if content is unfairly removed. That marks a considerable improvement on the status quo.

Requiring all user-to-user services to justify why they are removing or restricting each individual piece of content, as Amendment 53A would do, would be disproportionately burdensome on companies, particularly small and medium-sized ones. It would also duplicate some of the provisions I have previously outlined. Separately, as private entities, service providers have their own freedom of expression rights. This means that platforms are free to decide what content should or should not be on their website, within the bounds of the law. The Bill should not mandate providers to carry or to remove certain types of speech or content. Accordingly, we do not think it would be appropriate to require providers to ensure that free speech is not infringed, as suggested in Amendment 48.

--- Later in debate ---
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - - - Excerpts

Why would it not be possible for us to try to define what the public interest might be, and not leave it to the platforms to do so?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I ask the noble Viscount to bear with me. I will come on to this a bit later. I do not think it is for category 1 platforms to do so.

We have introduced Clause 15 to reduce the powers that the major technology companies have over what journalism is made available to UK users. Accordingly, Clause 15 requires category 1 providers to set clear terms of service which explain how they take the importance of journalistic content into account when making their moderation decisions. These duties will not stop platforms removing journalistic content. Platforms have the flexibility to set their own journalism policies, but they must enforce them consistently. They will not be able to remove journalistic content arbitrarily. This will ensure that platforms give all users of journalism due process when making content moderation decisions. Amendment 51 would mean that, where platforms subjectively reached a decision that journalism was not conducive to the public good, they would not have to give it due process. Platforms could continue to treat important journalistic content arbitrarily where they decided that this content was not in the public interest of the UK.

In his first remarks on this group the noble Lord, Lord Stevenson, engaged with the question of how companies will identify content of democratic importance, which is content that seeks to contribute to democratic political debate in the UK at a national and local level. It will be broad enough to cover all political debates, including grass-roots campaigns and smaller parties. While platforms will have some discretion about what their policies in this area are, the policies will need to ensure that platforms are balancing the importance of protecting democratic content with their safety duties. For example, platforms will need to consider whether the public interest in seeing some types of content outweighs the potential harm it could cause. This will require companies to set out in their terms of service how they will treat different types of content and the systems and processes they have in place to protect such content.

Amendments 57 and 62, in the name of my noble friend Lord Kamall, seek to impose new duties on companies to protect a broader range of users’ rights, as well as to pay particular attention to the freedom of expression of users with protected characteristics. As previously set out, services will have duties to safeguard the freedom of expression of all users, regardless of their characteristics. Moreover, UK providers have existing duties under the Equality Act 2010 not to discriminate against people with characteristics which are protected in that Act. Given the range of rights included in Amendment 57, it is not clear what this would require from service providers in practice, and their relevance to service providers would likely vary between different rights.

Amendment 60, in the name of the noble Lord, Lord Clement-Jones, and Amendment 88, in the name of the noble Lord, Lord Stevenson, probe whether references to privacy law in Clauses 18 and 28 include Article 8 of the European Convention on Human Rights. That convention applies to member states which are signatories. Article 8(1) requires signatories to ensure the right to respect for private and family life, home and correspondence, subject to limited derogations that must be in accordance with the law and necessary in a democratic society. The obligations flowing from Article 8 do not apply to individuals or to private companies and it would not make sense for these obligations to be applied in this way, given that states which are signatories will need to decide under Article 8(2) which restrictions on the Article 8(1) right they need to impose. It would not be appropriate or possible for private companies to make decisions on such restrictions.

Providers will, however, need to comply with all UK statutory and common-law provisions relating to privacy, and must therefore implement safeguards for user privacy when meeting their safety duties. More broadly, Ofcom is bound by the Human Rights Act 1998 and must therefore uphold Article 8 of the European Convention on Human Rights when implementing the Bill’s regime.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

It is so complicated that the Minister is almost enticing me to stand up and ask about it. Let us just get that right: the reference to the Article 8 powers exists and applies to those bodies in the UK to which such equivalent legislation applies, so that ties us into Ofcom. Companies cannot be affected by it because it is a public duty, not a private duty, but am I then allowed to walk all the way around the circle? At the end, can Ofcom look back at the companies to establish whether, in Ofcom’s eyes, its requirements in relation to its obligations under Article 8 have or have not taken place? It is a sort of transparent, backward-reflecting view rather than a proactive proposition. That seems a complicated way of saying, “Why don’t you behave in accordance with Article 8?”

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Yes, Ofcom, which is bound by it through the Human Rights Act 1998, can ask those questions and make that assessment of the companies, but it would not be right for private companies to be bound by something to which it is not appropriate for companies to be signatories. Ofcom will be looking at these questions but the duty rests on it, as bound by the Human Rights Act.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

It is late at night and this is slightly tedious, but in the worst of all possible circumstances, Ofcom would be looking at what happened over the last year in relation to its codes of practice and assertions about a particular company. Ofcom is then in trouble because it has not discharged its Article 8 obligations, so who gets to exercise a whip on whom? Sorry, whips are probably the wrong things to use, but you see where I am coming from. All that is left is for the Secretary of State, but probably it would effectively be Parliament, to say to Ofcom, “You’ve failed”. That does not seem a very satisfactory solution.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Platforms will be guided by Ofcom in taking measures to comply with their duties which are recommended in Ofcom’s codes, and which contain safeguards for privacy, including ones based on the European Convention on Human Rights and the rights therein. Paragraph 10(2)(b) of Schedule 4 requires Ofcom to ensure that measures, which it describes in the code of practice, are designed in light of the importance of protecting the privacy of users. Clause 42(2) and (3) provides that platforms will be treated as complying with the privacy duties set out at Clause 18(2) and Clause 28(2), if they take the recommended measures that Ofcom sets out in the codes.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

That is the point I was making.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

It worked. In seriousness, we will both consult the record and, if the noble Lord wants more, I am very happy to set it out in writing.

Amendment 63 in the name of the noble and learned Lord, Lord Hope of Craighead, seeks to clarify that “freedom of expression” in Clause 18 refers to the

“freedom to impart ideas, opinions or information”,

as referred to in Article 10 of the European Convention on Human Rights. I think I too have been guilty of using the phrases “freedom of speech” and “freedom of expression” as though they were interchangeable. Freedom of expression, within the law, is intended to encompass all the freedom of expression rights arising from UK law, including under common law. The rights to freedom of expression under Article 10 of the European Convention on Human Rights include not only the right to impart ideas, opinions and information but also the right to receive such ideas, opinions and information. Any revised definition of freedom of expression to be included in the Bill should refer to both aspects of the Article 10 definition, given the importance for both children and adults of receiving information via the internet. We recognise the importance of clarity in relation to the duties set out in Clauses 18 and 28, and we are very grateful to the noble and learned Lord for proposing this amendment, and for the experience he brings to bear on behalf of the Constitution Committee of your Lordships’ House. The Higher Education (Freedom of Speech) Bill and the Online Safety Bill serve very different purposes, but I am happy to say that the Bill team and I will consider this amendment closely between now and Report.

Amendments 101, 102, 109, 112, 116, 121, 191 and 220, in the name of my noble friend Lord Moylan, seek to require Ofcom to have special regard to the importance of protecting freedom of expression when exercising its enforcement duties, and when drafting or amending codes of practice or guidance. Ofcom must already ensure that it protects freedom of expression when overseeing the Bill, because it is bound by the Human Rights Act, as I say. It also has specific duties to ensure that it is clear about how it is protecting freedom of expression when exercising its duties, including when developing codes of practice.

My noble friend’s Amendment 294 seeks to remove “psychological” from the definition of harm in the Bill. It is worth being clear that the definition of harm is used in the Bill as part of the illegal and child safety duties. There is no definition of harm, psychological or otherwise, with regard to adults, given that the definition of content which is harmful to adults was removed from the Bill in another place. With regard to children, I agree with the points made by the noble Baroness, Lady Kidron. It is important that psychological harm is captured in the Bill’s child safety duties, given the significant impact that such content can have on young minds.

I invite my noble friend and others not to press their amendments in this group.

--- Later in debate ---
Moved by
50A: Clause 13, page 14, line 8, at end insert—
“(5A) In determining what is proportionate for the purposes of subsection (2), the size and capacity of the provider of a service, in particular, is relevant.”
Member’s explanatory statement
This amendment indicates that the size and capacity of a provider is important in construing the reference to “proportionate systems and processes” in clause 13 (duties to protect content of democratic importance).

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Moved by
50B: Clause 14, page 15, line 30, leave out “subsection (2)(a)” and insert “this section”
Member’s explanatory statement
This is a technical amendment to make it clear that clause 14(9), which sets out circumstances which do not count as a provider “taking action” in relation to news publisher content, applies for the purposes of the whole clause.
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- Hansard - -

My Lords, His Majesty’s Government are committed to defending the invaluable role of our free media. We are clear that our online safety legislation must protect the vital role of the press in providing people with reliable and accurate information. That is why this Bill includes strong protections for recognised news publishers. The Bill does not impose new duties on news publishers’ content, which is exempt from the Bill’s safety duties. In addition, the Bill includes strong safeguards for news publisher content, set out in Clause 14. In order to benefit from these protections, publishers will have to meet a set of stringent criteria, set out in Clause 50.

I am aware of concerns in your Lordships’ House and another place that the definition of news publishers is too broad and that these protections could therefore create a loophole to be exploited. That is why the Government are bringing forward amendments to the definition of “recognised news publisher” to ensure that sanctioned entities cannot benefit from these protections. I will shortly explain these protections in detail but I would like to be clear that narrowing the definition any further would pose a critical risk to our commitment to self-regulation of the press. We do not want to create requirements which would in effect put Ofcom in the position of a press regulator. We believe that the criteria set out in Clause 50 are already strong, and we have taken significant care to ensure that established news publishers are captured, while limiting the opportunity for bad actors to benefit. 

Government Amendments 126A and 127A propose changes to the criteria for recognised news publishers. These criteria already exclude any entity that is a proscribed organisation under the Terrorism Act 2000 or the purpose of which is to support a proscribed organisation under that Act. We are clear that sanctioned news outlets such as RT, formerly Russia Today, must not benefit from these protections either. The amendments we are tabling today will therefore tighten the recognised news publisher criteria further by excluding entities that have been designated for sanctions imposed by both His Majesty’s Government and the United Nations Security Council. I hope noble Lords will accept these amendments, in order to ensure that content from publishers which pose a security threat to this country cannot benefit from protections designed to defend a free press.

In addition, the Government have tabled Amendments 50B, 50C, 50D, 127B, 127C and 283A, which are aimed at ensuring that the protections for news publishers in Clause 14 are workable and do not have unforeseen consequences for the operation of category 1 services. Clause 14 gives category 1 platforms a duty to notify recognised news publishers and offer a right of appeal before taking action against any of their content or accounts.

Clause 14 sets out the circumstances in which companies must offer news publishers an appeal. As drafted, it states that platforms must offer this before they take down news publisher content, before they restrict users’ access to such content or where they propose to “take any other action” in relation to publisher content. Platforms must also offer an appeal if they propose to take action against a recognised news publisher’s account by giving them a warning, suspending or banning them from using a service or in any way restricting their ability to use a service.

These amendments provide greater clarity about what constitutes “taking action” in relation to news publisher content, and therefore when category 1 services must offer an appeal. They make it clear that a platform must offer this before they take down such content, add a warning label or take any other action against content in line with any terms of service that allow or prohibit content. This will ensure that platforms are not required to offer publishers a right of appeal every time they propose to carry out routine content curation and similar routine actions. That would be unworkable for platforms and would be likely to inhibit the effectiveness of the appeal process.

As noble Lords know, the Bill has a strong focus on user empowerment and enabling users to take control of their online experience. The Government have therefore tabled amendments to Clause 52 to ensure that providers are required only to offer publishers a right of appeal in relation to their own moderation decisions, not where a user has voluntarily chosen not to view certain types of content. For example, if a user has epilepsy and has opted not to view photo-sensitive content, platforms will not be required to offer publishers a right of appeal before restricting that content for the user in question.

In addition, to ensure that the Bill maintains strong protections for children, the amendments make it clear that platforms are not required to offer news publishers an appeal before applying warning labels to content viewed by children. The amendments also make it clear that platforms would be in breach of the legislation if they applied warning labels to content encountered by adults without first offering news publishers an appeal, but in order to ensure that the Bill maintains strong protections for children, that does not apply to warning labels on content encountered by children. I beg to move.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I welcome the amendments the Government have tabled, but I ask the Minister to clarify the effect of Amendment 50E. I declare an interest as chair of the Communications and Digital Select Committee, which has discussed Amendment 50E and the labelling of content for children with the news media organisations. This is a very technical issue, but from what my noble friend was just saying, it seems that content that would qualify for labelling for child protection purposes, and which therefore does not qualify for a right of appeal before the content is so labelled, is not content that would normally be encountered by adults but might happen to appeal to children. I would like to be clear that we are not giving the platforms scope for adding labels to content that they ought not to be adding labels to. That aside, as I say, I am grateful to my noble friend for these amendments.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, I am sorry; in my enthusiasm to get this day of Committee off to a swift start, I perhaps rattled through that rather quickly. On Amendment 50E, which my noble friend Lady Stowell asked about, I make clear that platforms will be in breach of their duties if, without applying the protection, they add warning labels to news publishers’ content that they know will be seen by adult users, regardless of whether that content particularly appeals to children.

As the noble Lord, Lord Clement-Jones, and others noted, we will return to some of the underlying principles later on, but the Government have laid these amendments to clarify category 1 platforms’ duties to protect recognised news publishers’ content. They take some publishers out of scope of the protections and make it clearer that category 1 platforms will have only to offer news publishers an appeal before taking punitive actions against their content.

The noble Baroness, Lady Fox, asked about how we define “recognised news publisher”. I am conscious that we will debate this more in later groups, but Clause 50 sets out a range of criteria that an organisation must meet to qualify as a recognised news publisher. These include the organisation’s “principal purpose” being the publication of news, it being subject to a “standards code” and its content being “created by different persons”. The protections for organisations are focused on publishers whose primary purpose is reporting on news and current affairs, recognising the importance of that in a democratic society. I am grateful to noble Lords for their support.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- Hansard - - - Excerpts

What my noble friend said is absolutely fine with me, and I thank him very much for it. It might be worth letting the noble Baroness, Lady Fox, know that Amendment 127 has now been moved to the group that the noble Lord, Lord Clement-Jones, referred to. I thought it was worth offering that comfort to the noble Baroness.

Amendment 50B agreed.
Moved by
50C: Clause 14, page 15, line 44, leave out subsection (11)
Member’s explanatory statement
This amendment omits a provision about OFCOM’s guidance under clause 171, as that provision is now to be made in clause 171 itself.
--- Later in debate ---
Moved by
50F: Clause 15, page 17, line 14, at end insert—
“(8A) In determining what is proportionate for the purposes of subsection (2), the size and capacity of the provider of a service, in particular, is relevant.”
Member’s explanatory statement
This amendment indicates that the size and capacity of a provider is important in construing the reference to “proportionate systems and processes” in clause 15 (duties to protect journalistic content).
--- Later in debate ---
These amendments are a means of at least groping towards a better way of tackling misinformation and disinformation, which, as we have heard, can have a huge impact, particularly in health.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, this debate has demonstrated the diversity of opinion regarding misinformation and disinformation—as the noble Lord said, the Joint Committee gave a lot of thought to this issue—as well as the difficulty of finding the truth of very complex issues while not shutting down legitimate debate. It is therefore important that we legislate in a way that takes a balanced approach to tackling this, keeping people safe online while protecting freedom of expression.

The Government take misinformation and disinformation very seriously. From Covid-19 to Russia’s use of disinformation as a tool in its illegal invasion of Ukraine, it is a pervasive threat, and I pay tribute to the work of my noble friend Lord Bethell and his colleagues in the Department of Health and Social Care during the pandemic to counter the cynical and exploitative forces that sought to undermine the heroic effort to get people vaccinated and to escape from the clutches of Covid-19.

We recognise that misinformation and disinformation come in many forms, and the Bill reflects this. Its focus is rightly on tackling the most egregious, illegal forms of misinformation and disinformation, such as content which amounts to the foreign interference offence or which is harmful to children—for instance, that which intersects with named categories of primary priority or priority content.

That is not the only way in which the Bill seeks to tackle it, however. The new terms of service duties for category 1 services will hold companies to account over how they say they treat misinformation and disinformation on their services. However, the Government are not in the business of telling companies what legal content they can and cannot allow online, and the Bill should not and will not prevent adults accessing legal content. In addition, the Bill will establish an advisory committee on misinformation and disinformation to provide advice to Ofcom on how they should be tackled online. Ofcom will be given the tools to understand how effectively misinformation and disinformation are being addressed by platforms through transparency reports and information-gathering powers.

Amendment 52 from the noble Baroness, Lady Merron, seeks to introduce a new duty on platforms in relation to health misinformation and disinformation for adult users, while Amendments 59 and 107 from my noble friend Lord Moylan aim to introduce new proportionality duties for platforms tackling misinformation and disinformation. The Bill already addresses the most egregious types of misinformation and disinformation in a proportionate way that respects freedom of expression by focusing on misinformation and disinformation that are illegal or harmful to children.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I am curious as to what the Bill says about misinformation and disinformation in relation to children. My understanding of primary priority and priority harms is that they concern issues such as self-harm and pornography, but do they say anything specific about misinformation of the kind we have been discussing and whether children will be protected from it?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I am sorry—I am not sure I follow the noble Baroness’s question.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

Twice so far in his reply, the Minister has said that this measure will protect children from misinformation and disinformation. I was just curious because I have not seen any sight of that, either in discussions or in the Bill. I was making a distinction regarding harmful content that we know the shape of—for example, pornography and self-harm, which are not, in themselves, misinformation or disinformation of the kind we are discussing now. It is news to me that children are going to be protected from this, and I am delighted, but I was just checking.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Yes, that is what the measure does—for instance, where it intersects with the named categories of primary priority or priority content in the Bill, although that is not the only way the Bill does it. This will be covered by non-designated content that is harmful to children. As we have said, we will bring forward amendments on Report—which is perhaps why the noble Baroness has not seen them in the material in front of us—regarding material harms to children, and they will provide further detail and clarity.

Returning to the advisory committee that the Bill sets up and the amendments from the noble Baroness, Lady Merron, and my noble friend Lord Moylan, all regulated service providers will be forced to take action against illegal misinformation and disinformation in scope of the Bill. That includes the new false communication offences in the Bill that will capture communications where the sender knows the information to be false but sends it intending to cause harm—for example, hoax cures for a virus such as Covid-19. The noble Baroness is right to say that that is a slightly different approach from the one taken in her amendment, but we think it an appropriate and proportionate response to tackling damaging and illegal misinformation and disinformation. If a platform is likely to be accessed by children, it will have to protect them from encountering misinformation and disinformation content that meets the Bill’s threshold for content that is harmful to children. Again, that is an appropriate and proportionate response.

Turning to the points made by my noble friend Lord Moylan and the noble Baroness, Lady Fox, services will also need to have particular regard to freedom of expression when complying with their safety duties. Ofcom will be required to set out steps that providers can take when complying with their safety duties in the codes of practice, including what is proportionate for different providers and how freedom of expression can be protected.

--- Later in debate ---
Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

This might be an appropriate moment for me to say—on the back of that—that, although my noble friend explained current government practice, he has not addressed my point on why there should not be an annual report to Parliament that describes what government has done on these various fronts. If the Government regularly meet newspaper publishers to discuss the quality of information in their newspapers, I for one would have entire confidence that the Government were doing so in the public interest, but I would still quite like—I think the Government would agree on this—a report on what was happening, making an exception for national security. That would still be a good thing to do. Will my noble friend explain why we cannot be told?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

While I am happy to elaborate on the work of the counter-disinformation unit in the way I just have, the Government cannot share operational details about its work, as that would give malign actors insight into the scope and scale of our capabilities. As my noble friend notes, this is not in the public interest. Moreover, reporting representations made to platforms by the unit would also be unnecessary as this would overlook both the existing processes that govern engagements with external parties and the new protections that are introduced through the Bill.

In the first intervention, the noble Baroness, Lady Fox, gave a number of examples, some of which are debatable, contestable facts. Companies may well choose to keep them on their platforms within their terms of service. We have also seen deliberate misinformation and disinformation during the pandemic, including from foreign actors promoting more harmful disinformation. It is right that we take action against this.

I hope that I have given noble Lords some reassurance on the points raised about the amendments in this group. I invite them not to press the amendments.

Baroness Merron Portrait Baroness Merron (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I am most grateful to noble Lords across the Committee for their consideration and for their contributions in this important area. As the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, both said, this was an area of struggle for the Joint Committee. The debate today shows exactly why that is so, but it is a struggle worth having.

The noble Lord, Lord Bethell, talked about there being a gap in the Bill as it stands. The amendments include the introduction of risk assessments and transparency and, fundamentally, explaining things in a way that people can actually understand. These are all tried and tested methods and can serve only to improve the Bill.

I am grateful to the Minister for his response and consideration of the amendments. I want to take us back to the words of the noble Baroness, Lady Kidron. She explained it beautifully—partly in response to the comments from the noble Baroness, Lady Fox. This is about tackling a system of amplification of misinformation and disinformation that moves the most marginal of views into the mainstream. It deals with restricting the damage that, as I said earlier, can produce the most dire circumstances. Amplification is the consideration that these amendments seek to tackle.

I am grateful to the noble Lord, Lord Moylan, for his comments, as well as for his amendments. I am sure the noble Lord has reflected that some of the previous amendments he brought before the House somewhat put the proverbial cat among the Committee pigeons. On this occasion, I think the noble Lord has nicely aligned the cats and the pigeons. He has managed to rally us all—with the exception of the Minister—behind these amendments.

--- Later in debate ---
The All-Party Parliamentary Group on Media Literacy has done some really good work. Just saying, “This is cross-government”, “We need a holistic approach to this” and so on does not obviate the fact that our schools need to be much more vigorous in what they do in this area. Indeed, the group is advocating a media literacy education Bill, talking about upskilling teachers and talking, as does one of the amendments here, about Ofcom having a duty in this area. We need to take a much broader view of this and be much more vigorous in what we do on media literacy, as has been clear from all the contributions from around the House today.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, this has been a good debate. I am glad that a number of noble Lords mentioned Lord Puttnam and the committee that he chaired for your Lordships’ House on democracy and digital technologies. I responded to the debate that we had on that; sadly, it was after he had already retired from your Lordships’ House, but he participated from the steps of the Throne. I am mindful of that report and the lessons learned in it in the context of the debate that we have had today.

We recognise the intent behind the amendments in this group to strengthen the UK’s approach to media literacy in so far as it relates to services that will be regulated by the Bill. Ofcom has a broad duty to promote media literacy under the Communications Act 2003. That is an important responsibility for Ofcom, and it is right that the regulator is able to adapt its approach to support people in meeting the evolving challenges of the digital age.

Amendments 52A and 91 from the noble Lord, Lord Knight, and Amendment 91A from the noble Lord, Lord Holmes of Richmond, seek to introduce duties on in-scope services, requiring them to put in place measures that promote users’ media literacy, while Amendment 98 tabled by the noble Lord, Lord Knight, would require Ofcom to issue a code of practice in relation to the new duty proposed in his Amendment 91. While we agree that the industry has a role to play in promoting media literacy, the Government believe that these amendments could lead to unintended, negative consequences.

I shall address the role of the industry and media literacy, which the noble Baroness, Lady Kidron, dwelt on in her remarks. We welcome the programmes that it runs in partnership with online safety experts such as Parent Zone and Internet Matters and hope they continue to thrive, with the added benefit of Ofcom’s recently published evaluation toolkit. However, we believe that platforms can go further to empower and educate their users. That is why media literacy has been included in the Bill’s risk assessment duties, meaning that regulated services will have to consider measures to promote media literacy to their users as part of the risk assessment process. Additionally, through work delivered under its existing media literacy duty, Ofcom is developing a set of best-practice design principles for platform-based media literacy measures. That work will build an evidence base of the most effective measures that platforms can take to build their users’ media literacy.

In response to the noble Baroness’s question, I say: no, platforms will not be able to avoid putting in place protections for children by using media literacy campaigns. Ofcom would be able to use its enforcement powers if a platform was not achieving appropriate safety outcomes. There are a range of ways in which platforms can mitigate risks, of which media literacy is but one, and Ofcom would expect platforms to consider them all in their risk assessments.

Let me say a bit about the unintended consequences we fear might arise from these amendments. First, the resource demands to create a code of practice and then to regulate firms’ compliance with this type of broad duty will place an undue burden on the regulator. It is also unclear how the proposed duties in Amendments 52A, 91 and 91A would interact with Ofcom’s existing media literacy duty. There is a risk, we fear, that these parallel duties could be discharged in conflicting ways. Amendment 91A is open to broad interpretation by platforms and could enable them to fulfil the duty in a way that lacked real impact on users’ media literacy.

The amendment in the name of my noble friend Lord Holmes proposes a duty to promote awareness of financial deception and fraud. The Government are already taking significant action to protect people from online fraud, including through their new fraud strategy and other provisions in this Bill. I know that my noble friends Lord Camrose, Lord Sharpe of Epsom and Lady Penn met noble Lords to talk about that earlier this week. We believe that measures such as prompts for users before they complete financial transactions sit more logically with financial service providers than with services in scope of this Bill.

Amendment 52A proposes a duty on carriers of journalistic content to promote media literacy to their users. We do not want to risk requiring platforms to act as de facto press regulators, assessing the quality of news publishers’ content. That would not be compatible with our commitment to press freedom. Under its existing media literacy duty, Ofcom is delivering positive work to support people to discern high-quality information online. It is also collaborating with the biggest platforms to design best practice principles for platform-based media literacy measures. It intends to publish these principles this year and will encourage platforms to adopt them.

It is right that Ofcom is given time to understand the benefits of these approaches. The Secretary of State’s post-implementation review will allow the Government and Parliament to establish the effectiveness of Ofcom’s current approach and to reconsider the role of platforms in enhancing users’ media literacy, if appropriate. In the meantime, the Bill introduces new transparency-reporting and information-gathering powers to enhance Ofcom’s visibility of platforms’ delivery and evaluation of media literacy activities. We would not want to see amendments that would inadvertently dissuade platforms from delivering these activities in favour of less costly and less effective measures.

My noble friend Lord Holmes asked about the Online Media Literacy Strategy, published in July 2021, which set out the Government’s vision for improving media literacy in the country. Alongside the strategy, we have committed to publishing annual action plans each financial year until 2024-25, setting out how we meet the ambition of the strategy. In April 2022 we published the Year 2 Action Plan, which included extending the reach of media literacy education to those who are currently disengaged, in consultation with the media literacy task force—a body of 17 cross-sector experts—expanding our grant funding programme to provide nearly £2.5 million across two years for organisations delivering innovative media literacy activities, and commissioning research to improve our understanding of the challenges faced by the sector. We intend to publish the research later this year, for the benefit of civil society organisations, technology platforms and policymakers.

The noble Lord, Lord Knight, in his Amendment 186, would stipulate that Ofcom must levy fees on regulated firms sufficient to fund the work of third parties involved in supporting it to meet its existing media literacy duties. The Bill already allows Ofcom to levy fees sufficient to fund the annual costs of exercising its online safety functions. This includes its existing media literacy duty as far as it relates to services regulated by this Bill. As such, the Bill already ensures that these media literacy activities, including those that Ofcom chooses to deliver through third parties, can be funded through fees levied on industry.

I turn to Amendments 188, 235, 236, 237 and 238. The Government recognise the intent behind these amendments, which is to help improve the media literacy of the general public. Ofcom already has a statutory duty to promote media literacy with regard to the publication of anything by means of electronic media, including services in scope of the Bill. These amendments propose rather prescriptive objectives, either as part of a new duty for Ofcom or through updating its existing duty. They reflect current challenges in the sector but run the risk of becoming obsolete over time, preventing Ofcom from adapting its work in response to emerging issues.

Ofcom has demonstrated flexibility in its existing duty through its renewed Approach to Online Media Literacy, launched in 2021. This presented an expanded media literacy programme, enabling it to achieve almost all the objectives specified in this group. The Government note the progress that Ofcom has already achieved under its renewed approach in the annual plan it produced last month. The Online Safety Bill strengthens Ofcom’s functions relating to media literacy through its new transparency-reporting and information-gathering powers, which will give it enhanced oversight of industry activity by enabling it to require regulated services to share or publish information about the work that they are doing on media literacy.

The noble Baroness, Lady Prashar, asked about the view expressed by the Joint Committee on minimum standards for media literacy training. We agree with the intention behind that, but, because of the broad and varied nature of media literacy, we do not believe that introducing minimum standards is the most effective way of achieving that outcome. Instead, we are focusing efforts on improving the evaluation practices of media literacy initiatives to identify which ones are most effective and to encourage their delivery. Ofcom has undertaken extensive work to produce a comprehensive toolkit to support practitioners to deliver robust evaluations of their programmes. This was published in February this year and has been met with praise from practitioners, including those who received grant funding from the Government’s non-legislative media literacy work programme. The post-implementation review of Ofcom’s online safety regime, which covers its existing media literacy duty in so far as it relates to regulated services, will provide a reasonable point at which to establish the effectiveness of Ofcom’s new work programme, after giving it time to take effect.

Noble Lords talked about the national curriculum and media literacy in schools. Media literacy is indeed a crucial skill for everyone in the digital age. Key media literacy skills are already taught through a number of compulsory subjects in the national curriculum. Digital literacy is included in the computing national curriculum in England, which equips pupils with the knowledge, understanding and skills to use information and communication technology creatively and purposefully. I can reassure noble Lords that people such as Monica are being taught not about historic things like floppy disks but about emerging and present challenges; the computing curriculum ensures that pupils are taught how to design and program systems to accomplish goals such as collecting, analysing, evaluating and presenting data.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

Does the Minister know how many children are on computing courses?

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I do not know, but I shall find out from the Department for Education and write. But those who are on them benefit from a curriculum that includes topics such as programming and algorithms, the responsible and safe use of technology, and other foundational knowledge that may support future study in fields such as artificial intelligence and data science.

This is not the only subject in which media literacy and critical thinking are taught. In citizenship education, pupils are taught about critical thinking and the proper functioning of a democracy. They learn to distinguish fact from opinion, as well as exploring freedom of speech and the role and responsibility of the media in informing and shaping public opinion. As Minister for Arts and Heritage, I will say a bit about subjects such as history, English and other arts subjects, in which pupils learn to ask questions about information, think critically and weigh up arguments, all of which are important skills for media literacy, as well as more broadly.

--- Later in debate ---
It may or may not be correct, in terms of what we are doing, to restrict what the Bill does to those aspects of user-to-user content and other areas. If something is illegal, surely the Bill should be quite clear that it should not be happening and Ofcom should have the necessary powers, however we frame them, to make sure we follow this through to the logical conclusion. The most-needed powers are the ability for Ofcom to take the lead, if required, in relation to the other regulators who have an impact on this world—can we be sure that is in the Bill and can be exercised?—and to make sure that the transparency, the user reporting and the complaints issues that are so vital to cracking this in the medium term get sorted. I leave that with the Minister to take forward.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

I am grateful to my noble friends for their amendments in this group, and for the useful debate that we have had. I am grateful also to my noble friend Lady Morgan of Cotes and the members of her committee who have looked at fraud, and to the Joint Committee which scrutinised the Bill in its earlier form, for its recommendations on strengthening the way it tackles fraud online. As the noble Lord, Lord Clement-Jones, said, following those recommendations, the Government have brought in new measures to strengthen the Bill’s provisions to tackle fraudulent activity on in-scope services. I am glad he was somewhat satisfied by that.

All in-scope services will be required to take proactive action to tackle fraud facilitated through user-generated content. In addition, the largest and most popular platforms have a stand-alone duty to prevent fraudulent paid-for advertising appearing on their services. This represents a major step forward in ensuring that internet users are protected from scams, which have serious financial and psychological impacts, as noble Lords noted in our debate. Fully addressing the challenges of paid-for advertising is a wider task than is possible through the Bill alone. Advertising involves a broad range of actors not covered by the current legislative framework, such as advertising intermediaries. I am sympathetic to these concerns and the Government are taking action in this area. Through the online advertising programme, we will deliver a holistic review of the regulatory framework in relation to online advertising. The Government consulted on this work last year and aim to publish a response before long. As the noble Lord, Lord Stevenson, and others noted, there are a number of Bills which look at this work. Earlier this week, there was a meeting hosted by my noble friends Lord Camrose, Lord Sharpe of Epsom and Lady Penn to try to avoid the cracks opening up between the Bills. I am grateful to my noble friend Lady Morgan for attending; I hope it was a useful discussion.

I turn to the amendments tabled by my noble friend. The existing duties on user reporting and user complaints have been designed for user-generated content and search content and are not easily applicable to paid-for advertising. The duties on reporting and complaints mechanisms require platforms to take action in relation to individual complaints, but many in-scope services do not have control over the paid-for advertising on their services. These amendments would therefore be difficult for many in-scope services to operate and would create a substantial burden for small businesses. I assure her and other noble Lords that the larger services, which have strong levers over paid-for advertising, will have to ensure that they have processes in place to enable users to report fraudulent advertising.

In reference to transparency reporting, let me assure my noble friend and others that Ofcom can already require information about how companies comply with their fraudulent advertising duties through transparency reports. In addition, Ofcom will also have the power to gather any information it requires for the purpose of exercising its online safety functions. These powers are extensive and will allow Ofcom to assess compliance with the fraudulent advertising duties.

The noble Viscount, Lord Colville of Culross, asked about the difficulty of identifying fraudulent advertising. Clauses 170 and 171 set out how providers are to make judgments about content, including fraudulent advertising, and place a duty on Ofcom to produce guidance on this. There will also be a code of practice on fraudulent advertising to provide further guidance on mechanisms to deal with this important issue.

My noble friend Lord Lucas’s Amendments 94 and 95 aim to require services to report information relating to fraudulent advertising to UK authorities. I am confident that the Bill’s duties will reduce the prevalence of online fraud, reducing the need for post hoc reporting in this way. If fraud does appear online, there are adequate systems in place for internet users to report this to the police.

People can report a scam to Action Fraud, the national reporting service for fraud and cybercrime. Reports submitted to Action Fraud are considered by the National Fraud Intelligence Bureau and can assist a police investigation. Additionally, the Advertising Standards Authority has a reporting service for reporting online scam adverts, and those reports are automatically shared with the National Cyber Security Centre.

The online advertising programme, which I mentioned earlier, builds on the Bill’s fraudulent advertising duty and looks at the wider online advertising system. That programme is considering measures to increase accountability and transparency across the supply chain, including proposals for all parties to enhance record keeping and information sharing.

My noble friend Lord Lucas was keen to meet to speak further. I will pass that request to my noble friend Lord Sharpe of Epsom, who I think would be the better person to talk to in relation to this on behalf of the Home Office—but I am sure that one of us will be very happy to talk with him.

I look forward to discussing this issue in more detail with my noble friend Lady Morgan and others between now and Report, but I hope that this provides sufficient reassurance on the work that the Government are doing in this Bill and in other ways. I invite my noble friends not to press their amendments.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, it is a pity that we have not had the benefit of hearing from the Minister, because a lot of his amendments in this group seem to bear on some of the more generic points made in the very good speech by the noble Baroness, Lady Fraser. I assume he will cover them, but I wonder whether he would at least be prepared to answer any questions people might come back with—not in any aggressive sense; we are not trying to scare the pants off him before he starts. For example, the points made by the noble Lord, Lord Clement-Jones, intrigue me.

I used to have responsibility for devolved issues when I worked at No. 10 for a short period. It was a bit of a joke, really. Whenever anything Welsh happened, I was immediately summoned down to Cardiff and hauled over the coals. You knew when you were in trouble when they all stopped speaking English and started speaking Welsh; then, you knew there really was an issue, whereas before I just had to listen, go back and report. In Scotland, nobody came to me anyway, because they knew that the then Prime Minister was a much more interesting person to talk to about these things. They just went to him instead, so I did not really learn very much.

I noticed some issues in the Marshalled List that I had not picked up on when I worked on this before. I do not know whether the Minister wishes to address this—I do not want to delay the Committee too much—but are we saying that to apply a provision in the Bill to the Bailiwick of Guernsey or the Isle of Man, an Order in Council is required to bypass Parliament? Is that a common way of proceeding in these places? I suspect that the noble and learned Lord, Lord Hope, knows much more about this than I do—he shakes his head—but this is a new one on me. Does it mean that this Parliament has no responsibility for how its laws are applied in those territories, or are there other procedures of which we are unaware?

My second point again picks up what the noble Lord, Lord Clement-Jones, was saying. Could the Minister go through in some detail the process by which a devolved authority would apply to the Secretary of State—presumably for DSIT—to seek consent for a devolved offence to be included in the Online Safety Bill regime? If this is correct, who grants to what? Does this come to the House as a statutory instrument? Is just the Secretary of State involved, or does it go to the Privy Council? Are there other ways that we are yet to know about? It would be interesting to know.

To echo the noble Lord, Lord Clement-Jones, we probably do need a letter from the Minister, if he ever gets this cleared, setting out exactly how the variation in powers would operate across the four territories. If there are variations, we would like to know about them.

Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- Hansard - -

My Lords, I am very grateful to my noble friend Lady Fraser of Craigmaddie for her vigilance in this area and for the discussion she had with the Bill team, which they and I found useful. Given the tenor of this short but important debate, I think it may be helpful if we have a meeting for other noble Lords who also want to benefit from discussing some of these things in detail, and particularly to talk about some of the issues the noble Lord, Lord Stevenson of Balmacara, just raised. It would be useful for us to talk in detail about general questions on the operation of the law before we look at this again on Report.

In a moment, I will say a bit about the government amendments which stand in my name. I am sure that noble Lords will not be shy in taking the opportunity to interject if questions arise, as they have not been shy on previous groups.

I will start with the amendments tabled by my noble friend Lady Fraser. Her Amendment 58 seeks to add reference to the Human Rights Act 1998 to Clause 18. That Act places obligations on public authorities to act compatibly with the European Convention on Human Rights. It does not place obligations on private individuals and companies, so it would not make sense for such a duty on internet services to refer to the Human Rights Act.

Under that Act, Ofcom has obligations to act in accordance with the right to freedom of expression under Article 10 of the European Convention on Human Rights. As a result, the codes that Ofcom draws up will need to comply with the Article 10 right to freedom of expression. Schedule 4 to the Bill requires Ofcom to ensure that measures which it describes in a code of practice are designed in light of the importance of protecting the right of users’

“freedom of expression within the law”.

Clauses 44(2) and (3) provide that platforms will be treated as complying with their freedom of expression duty if they take the recommended measures that Ofcom sets out in the codes. Platforms will therefore be guided by Ofcom in taking measures to comply with its duties, including safeguards for freedom of expression through codes of practice.

My noble friend’s Amendment 136 seeks to add offences under the Hate Crime and Public Order (Scotland) Act 2021 to Schedule 7. Public order offences are already listed in Schedule 7 to the Bill, which will apply across the whole United Kingdom. This means that all services in scope will need proactively to tackle content that amounts to an offence under the Public Order Act 1986, regardless of where the content originates or where in the UK it can be accessed.

The priority offences list has been developed with the devolved Administrations, and Clause 194 outlines the parliamentary procedures for updating it. The requirements for consent will be set out in the specific subordinate legislation that may apply to the particular offence being made by the devolved authorities—that is to say, they will be laid down by the enabling statutes that Parliament will have approved.

Amendment 228 seeks to require the inclusion of separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in Ofcom’s transparency reports. These transparency reports are based on the information requested from category 1, 2A and 2B service providers through transparency reporting. I assure my noble friend that Ofcom is already able to request country-specific information from providers in its transparency reports. The legislation sets out high-level categories of information that category 1, 2A and 2B services may be required to include in their transparency reports. The regulator will set out in a notice the information to be requested from the provider, the format of that information and the manner in which it should be published. If appropriate, Ofcom may request specific information in relation to each country in the UK, such as the number of users encountering illegal content and the incidence of such content.

Ofcom is also required to undertake consultation before producing guidance about transparency reporting. In order to ensure that the framework is proportionate and future-proofed, however, it is vital to allow the regulator sufficient flexibility to request the types of information that it sees as relevant, and for that information to be presented by providers in a manner that Ofcom has deemed to be appropriate.

Similarly, Amendment 225A would require separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in Ofcom’s research about users’ experiences of regulated services. Clause 141 requires that Ofcom make arrangements to undertake consumer research to ascertain public opinion and the experiences of UK users of regulated services. Ofcom will already be able to undertake this research on a country-specific basis. Indeed, in undertaking its research and reporting duties, as my noble friend alluded to, Ofcom has previously adopted such an approach. For instance, it is required by the Communications Act 2003 to undertake consumer research. While the legislation does not mandate that Ofcom conduct and publish nation-specific research, Ofcom has done so, for instance through its publications Media Nations and Connected Nations. I hope that gives noble Lords some reassurance of its approach in this regard. Ensuring that Ofcom has flexibility in carrying out its research functions will enable us to future-proof the regulatory framework, and will mean that its research activity is efficient, relevant and appropriate.

I will now say a bit about the government amendments standing in my name. I should, in doing so, highlight that I have withdrawn Amendments 304C and 304D, previously in the Marshalled List, which will be replaced with new amendments to ensure that all the communications offences, including the new self-harm offence, have the appropriate territorial extent when they are brought forward. They will be brought forward as soon as possible once the self-harm offence has been tabled.

Amendments 267A, 267B, 267C, 268A, 268B to 268G, 271A to 271D, 304A, 304B and 304E are amendments to Clauses 160, 162, 164 to 166, 168 and 210 and Schedule 14, relating to the extension of the false and threatening communications offences and the associated liability of corporate officers in Clause 166 to Northern Ireland.

This group also includes some technical and consequential amendments to the false and threatening communications offences and technical changes to the Malicious Communications (Northern Ireland) Order 1988 and Section 127 of the Communications Act 2003. This will minimise overlap between these existing laws and the new false and threatening communications offences in this Bill. Importantly, they mirror the approach taken for England and Wales, providing consistency in the criminal law.

This group also contains technical amendments to update the extent of the epilepsy trolling offence to reflect that it applies to England, Wales and Northern Ireland.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Yes, that would be a sensible way to view it. We will work on that and allow noble Lords to see it before they come to talk to us about it.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

I put on record that the withdrawal of Part 3 of the Digital Economy Act 2017 will be greeted with happiness only should the full schedule of AV and harms be put into the Bill. I must say that because the noble Baroness, Lady Benjamin, is not in her place. She worked very hard for that piece of legislation.

--- Later in debate ---
Moved by
64A: Clause 19, page 21, line 36, leave out “all”
Member’s explanatory statement
This is a technical amendment needed because the new duty to supply records of risk assessments to OFCOM (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 1 services.
--- Later in debate ---
Moved by
65A: Clause 19, page 22, line 26, at end insert—
“(8A) As soon as reasonably practicable after making a record of a risk assessment as required by subsection (2), or revising such a record, a duty to supply OFCOM with a copy of the record (in full).”
Member’s explanatory statement
This amendment requires providers of Category 1 services to supply copies of their records of risk assessments to OFCOM. The limitation to Category 1 services is achieved by an amendment in the name of the Minister to clause 6.
--- Later in debate ---
Moved by
65B: Clause 20, page 23, line 5, leave out “and (3)” and insert “to (3A)”
Member’s explanatory statement
This technical amendment is consequential on the other changes to clause 20 (arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only - see the amendments in the Minister’s name to those clauses below).
--- Later in debate ---
Moved by
66A: Clause 20, page 23, line 16, leave out “In addition,”
Member’s explanatory statement
This technical amendment is consequential on the other changes to clause 20 (arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only - see the amendments in the Minister’s name to those clauses below).
--- Later in debate ---
Moved by
66E: Clause 22, page 24, line 38, after “29(2)” insert “and (8A)”
Member’s explanatory statement
This amendment inserts a signpost to the new duty in clause 29 about supplying records of risk assessments to OFCOM.
--- Later in debate ---
Moved by
66F: Clause 23, page 24, line 42, leave out “all”
Member’s explanatory statement
This is a technical amendment needed because the new duty to summarise illegal content risk assessments in a publicly available statement (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.
--- Later in debate ---
Moved by
72A: Clause 23, page 25, line 31, at end insert—
“(8A) A duty to summarise in a publicly available statement the findings of the most recent illegal content risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to individuals).”
Member’s explanatory statement
This amendment requires providers of Category 2A services to summarise (in a publicly available statement) the findings of their latest risk assessment regarding illegal content. The limitation to Category 2A services is achieved by an amendment in the name of the Minister to clause 20.
--- Later in debate ---
Moved by
75A: Clause 24, page 26, line 45, after “29(2)” insert “and (8A)”
Member’s explanatory statement
This amendment inserts a signpost to the new duty in clause 29 about supplying records of risk assessments to OFCOM.
--- Later in debate ---
Moved by
75B: Clause 25, page 27, line 4, at end insert “(as indicated by the headings).”
Member’s explanatory statement
This amendment provides clarification because the new duty to summarise children’s risk assessments in a publicly available statement (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.
--- Later in debate ---
Moved by
81A: Clause 25, page 27, line 46, at end insert—
“(8A) A duty to summarise in a publicly available statement the findings of the most recent children’s risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to children).”
Member’s explanatory statement
This amendment requires providers of Category 2A services to summarise (in a publicly available statement) the findings of their latest children’s risk assessment. The limitation to Category 2A services is achieved by an amendment in the name of the Minister to clause 20.
--- Later in debate ---
Moved by
88A: Clause 29, page 31, line 4, leave out “all”
Member’s explanatory statement
This is a technical amendment needed because the new duty to supply records of risk assessments to OFCOM (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.
--- Later in debate ---
Moved by
90A: Clause 29, page 31, line 37, at end insert—
“(8A) As soon as reasonably practicable after making a record of a risk assessment as required by subsection (2), or revising such a record, a duty to supply OFCOM with a copy of the record (in full).”
Member’s explanatory statement
This amendment requires providers of Category 2A services to supply copies of their records of risk assessments to OFCOM. The limitation to Category 2A services is achieved by an amendment in the name of the Minister to clause 20.
--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

This has been a very good debate indeed. I have good days and bad days in Committee. Good days are when I feel that the Bill is going to make a difference and things are going to improve and the sun will shine. Bad days are a bit like today, where we have had a couple of groups, and this is one of them, where I am a bit worried about where we are and whether we have enough—I was going to use that terrible word “ammunition” but I do not mean that—of the powers that are necessary in the right place and with the right focus to get us through some of the very difficult questions that come in. I know that bad cases make bad law, but they can also illustrate why the law is not good enough. As the noble Baroness, Lady Kidron, was saying, this is possibly one of the areas we are in.

The speeches in the debate have made the case well and I do not need to go back over it. We have got ourselves into a situation where we want to reduce the harm that we see around us but do not want to impact freedom of expression. Both of those are so important and we have to hold on to them, but we find ourselves struggling. What do we do about that? We think through what we will end up with once this Bill is on the statute book and the codes of practice are in place under it. This looks as though it is heading towards the question of whether the terms of service that will be in place will be sufficient and able to restrict the harms we will see affecting people who should not be affected by them. But I recognise that the freedom of expression arguments have won the day and we have to live with that.

The noble Baroness, Lady Kidron, mentioned the riskiness of the smaller sites—categories 2A and 2B and the ones that are not even going to be categorised as high as that. Why are we leaving those to cause the damage that they are? There is something not working here in the structure of the Bill and I hope the Minister will be able to provide some information on that when he comes to speak.

Obviously, if we could find a way of expressing the issues that are raised by the measures in these amendments as being illegal in the real world, they would be illegal online as well. That would at least be a solution that we could rely on. Whether it could be policed and serviced is another matter, but it certainly would be there. But we are probably not going to get there, are we? I am not looking at the Minister in any hope but he has a slight downward turn to his lips. I am not sure about this.

How can we approach a legal but harmful issue with the sort of sensitivity that does not make us feel that we have reduced people’s ability to cope with these issues and to engage with them in an adult way? I do not have an answer to that.

Is this another amplification issue or is it deeper and worse than that? Is this just the internet because of its ability to focus on things to keep people engaged, to make people stay online when they should not, to make them reach out and receive material that they ought not to get in a properly regulated world? Is it something that we can deal with because we have a sense of what is moral and appropriate and want to act because society wants us to do it? I do not have a solution to that, and I am interested to hear what the Minister will say, but I think it is something we will need to come back to.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, like everyone who spoke, I and the Government recognise the tragic consequences of suicide and self-harm, and how so many lives and families have been devastated by them. I am grateful to the noble Baroness and all noble Lords, as well as the bereaved families who campaigned so bravely and for so long to spare others that heartache and to create a safer online environment for everyone. I am grateful to the noble Baroness, Lady Finlay of Llandaff, who raised these issues in her Private Member’s Bill, on which we had exchanges. My noble friend Lady Morgan is right to raise the case of Frankie Thomas and her parents, and to call that to mind as we debate these issues.

Amendments 96 and 296, tabled by the noble Baroness, Lady Finlay, would, in effect, reintroduce the former adult safety duties whereby category 1 companies were required to assess the risk of harm associated with legal content accessed by adults, and to set and enforce terms of service in relation to it. As noble Lords will know, those duties were removed in another place after extensive consideration. Those provisions risked creating incentives for the excessive removal of legal content, which would unduly interfere with adults’ free expression.

However, the new transparency, accountability and freedom of expression duties in Part 4, combined with the illegal and child safety duties in Part 3, will provide a robust approach that will hold companies to account for the way they deal with this content. Under the Part 4 duties, category 1 services will need to have appropriate systems and processes in place to deal with content or activity that is banned or restricted by their terms of service.

Many platforms—such as Twitter, Facebook and TikTok, which the noble Baroness raised—say in their terms of service that they restrict suicide and self-harm content, but they do not always enforce these policies effectively. The Bill will require category 1 companies—the largest platforms—fully to enforce their terms of service for this content, which will be a significant improvement for users’ safety. Where companies allow this content, the user-empowerment duties will give adults tools to limit their exposure to it, if they wish to do so.

The noble Baroness is right to raise the issue of algorithms. As the noble Lord, Lord Stevenson, said, amplification lies at the heart of many cases. The Bill will require providers specifically to consider as part of their risk assessments how algorithms could affect children’s and adults’ exposure to illegal content, and content that is harmful to children, on their services. Providers will need to take steps to mitigate and effectively manage any risks, and to consider the design of functionalities, algorithms and other features to meet the illegal content and child safety duties in the Bill.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

Following our earlier discussion, we were going to have a response on super-complaints. I am curious to understand whether we had a pattern of complaints—such as those the noble Baroness, Lady Kidron, and others received—about a platform saying, under its terms of service, that it would remove suicide and self-harm content but failing to do so. Does the Minister think that is precisely the kind of thing that could be substantive material for an organisation to bring as a super-complaint to Ofcom?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My initial response is, yes, I think so, but it is the role of Ofcom to look at whether those terms of service are enforced and to act on behalf of internet users. The noble Lord is right to point to the complexity of some marginal cases with which companies have to deal, but the whole framework of the Bill is to make sure that terms of service are being enforced. If they are not, people can turn to Ofcom.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I am sorry to enter the fray again on complaints, but how will anyone know that they have failed in this way if there is no complaints system?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I refer to the meeting my noble friend Lord Camrose offered; we will be able to go through and unpick the issues raised in that group of amendments, rather than looping back to that debate now.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

The Minister is going through the structure of the Bill and saying that what is in it is adequate to prevent the kinds of harms to vulnerable adults that we talked about during this debate. Essentially, it is a combination of adherence to terms of service and user-empowerment tools. Is he saying that those two aspects are adequate to prevent the kinds of harms we have talked about?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Yes, they are—with the addition of what I am coming to. In addition to the duty for companies to consider the role of algorithms, which I talked about, Ofcom will have a range of powers at its disposal to help it assess whether providers are fulfilling their duties, including the power to require information from providers about the operation of their algorithms. The regulator will be able to hold senior executives criminally liable if they fail to ensure that their company is providing Ofcom with the information it requests.

However, we must not restrict users’ right to see legal content and speech. These amendments would prescribe specific approaches for companies’ treatment of legal content accessed by adults, which would give the Government undue influence in choosing, on adult users’ behalf, what content they see—

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

I wanted to give the Minister time to get on to this. Can we now drill down a little on the terms of service issue? If the noble Baroness, Lady Kidron, is right, are we talking about terms of service having the sort of power the Government suggest in cases where they are category 1 and category 2A but not search? There will be a limit, but an awful lot of other bodies about which we are concerned will not fall into that situation.

Also, I thought we had established, much to our regret, that the terms of service were what they were, and that Ofcom’s powers—I paraphrase to make the point—were those of exposure and transparency, not setting minimum standards. But even if we are talking only about the very large and far-reaching companies, should there not be a power somewhere to engage with that, with a view to getting that redress, if the terms of service do not specify it?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

The Bill will ensure that companies adhere to their terms of service. If they choose to allow content that is legal but harmful on their services and they tell people that beforehand—and adults are able and empowered to decide what they see online, with the protections of the triple shield—we think that that strikes the right balance. This is at the heart of the whole “legal but harmful” debate in another place, and it is clearly reflected throughout the approach in the Bill and in my responses to all of these groups of amendments. But there are duties to tackle illegal content and to make sure that people know the terms of service for the sites they choose to interact with. If they feel that they are not being adhered to—as they currently are not in relation to suicide and self-harm content on many of the services—users will have the recourse of the regulator to turn to.

--- Later in debate ---
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I will plant a flag in reference to the new offences, which I know we will come back to again. It is always helpful to look at real-world examples. There is a lot of meme-based self-harm content. Two examples are the Tide Pods challenge—the eating of detergent capsules—and choking games, both of which have been very common and widespread. It would be helpful, ahead of our debate on the new offences, to understand whether they are below or above the threshold of serious self-harm and what the Government’s intention is. There are arguments both ways: obviously, criminalising children for being foolish carries certain consequences, but we also want to stop the spread of the content. So, when we come to that offence, it would be helpful if the Minister could use specific examples, such as the meme-based self-harm content, which is quite common.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I thank the noble Lord for the advance notice to think about that; it is helpful. It is difficult to talk in general terms about this issue, so, if I can, I will give examples that do, and do not, meet the threshold.

The Bill goes even further for children than it does for adults. In addition to the protections from illegal material, the Government have indicated, as I said, that we plan to designate content promoting suicide, self-harm or eating disorders as categories of primary priority content. That means that providers will need to put in place systems designed to prevent children of any age encountering this type of content. Providers will also need specifically to assess the risk of children encountering it. Platforms will no longer be able to recommend such material to children through harmful algorithms. If they do, Ofcom will hold them accountable and will take enforcement action if they break their promises.

It is right that the Bill takes a different approach for children than for adults, but it does not mean that the Bill does not recognise that young adults are at risk or that it does not have protections for them. My noble friend Lady Morgan was right to raise the issue of young adults once they turn 18. The triple shield of protection in the Bill will significantly improve the status quo by protecting adults, including young adults, from illegal suicide content and legal suicide or self-harm content that is prohibited in major platforms’ terms and conditions. Platforms also have strong commercial incentives, as we discussed in previous groups, to address harmful content that the majority of their users do not want to see, such as legal suicide, eating disorder or self-harm content. That is why they currently claim to prohibit it in their terms and conditions, and why we want to make sure that those terms and conditions are transparently and accountably enforced. So, while I sympathise with the intention from the noble Baroness, Lady Finlay, her amendments raise some wider concerns about mandating how providers should deal with legal material, which would interfere with the careful balance the Bill seeks to strike in ensuring that users are safer online without compromising their right to free expression.

The noble Baroness’s Amendment 240, alongside Amendment 225 in the name of the noble Lord, Lord Stevenson, would place new duties on Ofcom in relation to suicide and self-harm content. The Bill already has provisions to provide Ofcom with broad and effective information-gathering powers to understand how this content affects users and how providers are dealing with it. For example, under Clause 147, Ofcom can already publish reports about suicide and self-harm content, and Clauses 68 and 69 empower Ofcom to require the largest providers to publish annual transparency reports.

Ofcom may require those reports to include information on the systems and processes that providers use to deal with illegal suicide or self-harm content, with content that is harmful to children, or with content which providers’ own terms of service prohibit. Those measures sit alongside Ofcom’s extensive information-gathering powers. It will have the ability to access the information it needs to understand how companies are fulfilling their duties, particularly in taking action against this type of content. Furthermore, the Bill is designed to provide Ofcom with the flexibility it needs to respond to harms—including in the areas of suicide, self-harm and eating disorders—as they develop over time, in the way that the noble Baroness envisaged in her remarks about the metaverse and new emerging threats. So we are confident that these provisions will enable Ofcom to assess this type of content and ensure that platforms deal with it appropriately. I hope that this has provided sufficient reassurance to the noble Baroness for her not to move her amendment.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I asked a number of questions on specific scenarios. If the Minister cannot answer them straight away, perhaps he could write to me. They all rather called for “yes/no” answers.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

The noble Baroness threw me off with her subsequent question. She was broadly right, but I will write to her after I refresh my memory about what she said when I look at the Official Report.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, protecting women and girls is a priority for His Majesty’s Government, at home, on our streets and online. This Bill will provide vital protections for women and girls, ensuring that companies take action to improve their safety online and protect their freedom of expression so that they can continue to play their part online, as well as offline, in our society.

On Amendments 94 and 304, tabled by my noble friend Lady Morgan of Cotes, I want to be unequivocal: all service providers must understand the systemic risks facing women and girls through their illegal content and child safety risk assessments. They must then put in place measures that manage and mitigate these risks. Ofcom’s codes of practice will set out how companies can comply with their duties in the Bill.

I assure noble Lords that the codes will cover protections against violence against women and girls. In accordance with the safety duties, the codes will set out how companies should tackle illegal content and activity confronting women and girls online. This includes the several crimes that we have listed as priority offences, which we know are predominantly perpetrated against women and girls. The codes will also cover how companies should tackle harmful online behaviour and content towards girls.

Companies will be required to implement systems and processes designed to prevent people encountering priority illegal content and minimise the length of time for which any such content is present. In addition, Ofcom will be required to carry out broad consultation when drafting codes of practice to harness expert opinions on how companies can address the most serious online risks, including those facing women and girls. Many of the examples that noble Lords gave in their speeches are indeed reprehensible. The noble Baroness, Lady Kidron, talked about rape threats and threats of violence. These, of course, are examples of priority illegal content and companies will have to remove and prevent them.

My noble friend Lady Morgan suggested that the Bill misses out the specific course of conduct that offences in this area can have. Clause 9 contains provisions to ensure that services

“mitigate and manage the risk of the service being used for the commission or facilitation of”

an offence. This would capture patterns of behaviour. In addition, Schedule 7 contains several course of conduct offences, including controlling and coercive behaviour, and harassment. The codes will set out how companies must tackle these offences where this content contributes to a course of conduct that might lead to these offences.

To ensure that women’s and girls’ voices are heard in all this, the Bill will, as the right reverend Prelate noted, make it a statutory requirement for Ofcom to consult the Victims’ Commissioner and the domestic abuse commissioner about the formation of the codes of practice. As outlined, the existing illegal content, child safety and child sexual abuse and exploitation codes will already cover protections for women and girls. Creating a separate code dealing specifically with violence against women and girls would mean transposing or duplicating measures from these in a separate code.

In its recent communication to your Lordships, Ofcom stated that it will be consulting quickly on the draft illegal content and child sexual abuse and exploitation codes, and has been clear that it has already started the preparatory work for these. If Ofcom were required to create a separate code on violence against women and girls this preparatory work would need to be revised, with the inevitable consequence of slowing down the implementation of these vital protections.

An additional stand-alone code would also be duplicative and could cause problems with interpretation and uncertainty for Ofcom and providers. Linked to this, the simpler the approach to the codes, the higher the rates of compliance are likely to be. The more codes there are covering specific single duties, the more complicated it will be for providers, which will have to refer to multiple different codes, and the harder for businesses to put in place the right protections for users. Noble Lords have said repeatedly that this is a complex Bill, and this is an area where I suggest we should not make it more complex still.

As the Bill is currently drafted, Ofcom is able to draft codes in a way that addresses a range of interrelated risks affecting different groups of users, such as people affected in more than one way; a number of noble Lords dealt with that in their contributions. For example, combining the measures that companies can take to tackle illegal content targeting women and girls with the measures they can take to tackle racist abuse online could ensure a more comprehensive and effective approach that recognises the point, which a number of noble Lords made, that people with more than one protected characteristic under the Equality Act may be at compound risk of harm. If the Bill stipulated that Ofcom separate the offences that disproportionately affect women and girls from other offences in Schedule 7, this comprehensive approach to tackling violence against women and girls online could be lost.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- View Speech - Hansard - - - Excerpts

Could my noble friend the Minister confirm something? I am getting rather confused by what he is saying. Is it the case that there will be just one mega code of practice to deal with every single problem, or will there be lots of different codes of practice to deal with the problems? I am sure the tech platforms will have sufficient people to be able to deal with them. My understanding is that Ofcom said that, while the Bill might not mandate a code of practice on violence against women and girls, it would in due course be happy to look at it. Is that right, or is my noble friend the Minister saying that Ofcom will never produce a code of practice on violence against women and girls?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

It is up to Ofcom to decide how to set the codes out. What I am saying is that the codes deal with specific categories of threat or problem—illegal content, child safety content, child sexual abuse and exploitation—rather than with specific audiences who are affected by these sorts of problems. There is a circularity here in some of the criticism that we are not reflecting the fact that there are compound harms to people affected in more than one way and then saying that we should have a separate code dealing with one particular group of people because of one particular characteristic. We are trying to deal with categories of harm that we know disproportionately affect women and girls but which of course could affect others, as the noble Baroness rightly noted. Amendment 304—

Baroness Merron Portrait Baroness Merron (Lab)
- Hansard - - - Excerpts

I thank the Minister for giving way. There is a bit of a problem that I would like to raise. I think the Minister is saying that there should not be a code of practice in respect of violence against women and girls. That sounds to me like there will be no code of practice in this one particular area, which seems rather harsh. It also does not tackle the issue on which I thought we were all agreed, even if we do not agree the way forward: namely, that women and girls are disproportionately affected. If it is indeed the case that the Minister feels that way, how does he suggest this is dealt with?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

There are no codes designed for Jewish people, Muslim people or people of colour, even though we know that they are disproportionately affected by some of these harms as well. The approach taken is to tackle the problems, which we know disproportionately affect all of those groups of people and many more, by focusing on the harms rather than the recipients of the harm.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Hansard - - - Excerpts

Can I check something with my noble friend? This is where the illogicality is. The Government have mandated in the Strategic Policing Requirement that violence against women and girls is a national threat. I do not disagree with him that other groups of people will absolutely suffer abuse and online violence, but the Government themselves have said that violence against women and girls is a national threat. I understand that my noble friend has the speaking notes, the brief and everything else, so I am not sure how far we will get on this tonight, but, given the Home Office stance on it, I think that to say that this is not a specific threat would be a mistake.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

With respect, I do not think that that is a perfect comparison. The Strategic Policing Requirement is an operational policing document intended for chief constables and police and crime commissioners in the important work that they do, to make sure they have due regard for national threats as identified by the Home Secretary. It is not something designed for commercial technology companies. The approach we are taking in the Bill is to address harms that can affect all people and which we know disproportionately affect women and girls, and harms that we know disproportionately affect other groups of people as well.

We have made changes to the Bill: the consultation with the Victims’ Commissioner and the domestic abuse commissioner, the introduction of specific offences to deal with cyber-flashing and other sorts of particular harms, which we know disproportionately affect women and girls. We are taking an approach throughout the work of the Bill to reflect those harms and to deal with them. Because of that, respectfully, I do not think we need a specific code of practice for any particular group of people, however large and however disproportionately they are affected. I will say a bit more about our approach. I have said throughout, including at Second Reading, and my right honourable friend the Secretary of State has been very clear in another place as well, that the voices of women and girls have been heard very strongly and have influenced the approach that we have taken in the Bill. I am very happy to keep talking to noble Lords about it, but I do not think that the code my noble friend sets out is the right way to go about solving this issue.

Amendment 304 seeks to adopt the Istanbul convention definition of violence against women and girls. The Government are already compliant with the Convention on Preventing and Combating Violence Against Women and Domestic Violence, which was ratified last year. However, we are unable to include the convention’s definition of violence against women and girls in the Bill, as it extends to legal content and activity that is not in scope of the Bill as drafted. Using that definition would therefore cause legal uncertainty for companies. It would not be appropriate for the Government to require companies to remove legal content accessed by adults who choose to access it. Instead, as noble Lords know, the Government have brought in new duties to improve services’ transparency and accountability.

Amendment 104 in the name of the noble Lord, Lord Stevenson, seeks to require user-to-user services to provide a higher standard of protection for women, girls and vulnerable adults than for other adults. The Bill already places duties on service providers and Ofcom to prioritise responding to content and activity that presents the highest risk of harm to users. This includes users who are particularly affected by online abuse, such as women, girls and vulnerable adults. In overseeing the framework, Ofcom must ensure that there are adequate protections for those who are most vulnerable to harm online. In doing so, Ofcom will be guided by its existing duties under the Communications Act, which requires it to have regard when performing its duties to the

“vulnerability of children and of others whose circumstances appear to OFCOM to put them in need of special protection”.

The Bill also amends Ofcom’s general duties under the Communications Act to require that Ofcom, when carrying out its functions, considers the risks that all members of the public face online, and ensures that they are adequately protected from harm. This will form part of Ofcom’s principal duty and will apply to the way that Ofcom performs all its functions, including when producing codes of practice.

In addition, providers’ illegal content and child safety risk assessment duties, as well as Ofcom’s sectoral risk assessment duties, require them to understand the risk of harm to users on their services. In doing so, they must consider the user base. This will ensure that services identify any specific risks facing women, girls or other vulnerable groups of people.

As I have mentioned, the Bill will require companies to prioritise responding to online activity that poses the greatest risk of harm, including where this is linked to vulnerability. Vulnerability is very broad. The threshold at which somebody may arguably become vulnerable is subjective, context-dependent and may be temporary. The majority of UK adult users could be defined as vulnerable in particular circumstances. In practice, this would be very challenging for Ofcom to interpret if it were added to the safety objectives in this way. The existing approach allows greater flexibility so that companies and Ofcom can focus on the greatest threats to different groups of people at any given time. This allows the Bill to adapt to and keep pace with changing risk patterns that may affect different groups of people.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Moved by
98A: Clause 36, page 37, line 29, at end insert—
“(ga) the Children’s Commissioner,
(gb) the Commissioner for Victims and Witnesses,
(gc) the Domestic Abuse Commissioner,”
Member’s explanatory statement
This amendment provides that in preparing a draft code of practice or amendments of a code of practice under clause 36, OFCOM must also consult the Children’s Commissioner, the Commissioner for Victims and Witnesses and the Domestic Abuse Commissioner.
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- Hansard - -

My Lords, the amendments in this group consider the role of collaboration and consultation in Ofcom’s approach. The proposals range in their intent, and include mandating additional roles for young people in the framework, adding new formal consultation requirements, and creating powers for Ofcom to work with other organisations.

I reassure noble Lords that the Government take these concerns extremely seriously. That is why the Bill already places the voices of experts, users and victims at the heart of the regime it establishes. In fact, the intent of many of the amendments in this group will already be delivered. That includes Ofcom working with others effectively to deliver the legislation, consulting on draft codes of practice, and having the ability to designate specific regulatory functions to other bodies where appropriate. Where we can strengthen the voices of users, victims or experts—without undermining existing processes, reducing the regulator’s independence or causing unacceptable delays—the Government are open to this. That is why I am moving the amendment today. However, as we have heard in previous debates, this is already a complex regulatory framework, and there is a widespread desire for it to be implemented quickly. Therefore, it is right that we guard against creating additional or redundant requirements which could complicate the regime or unduly delay implementation.

I turn to the amendment in my name. As noble Lords know, Ofcom will develop codes of practice setting out recommended measures for companies to fulfil their duties under the Bill. When developing those codes, Ofcom must consult various persons and organisations who have specific knowledge or expertise related to online harms. This process will ensure that the voices of users, experts and others are reflected in the codes, and, in turn, that the codes contain appropriate and effective measures.

One of the most important goals of the Bill, as noble Lords have heard me say many times, is the protection of children. It is also critical that the codes reflect the views of victims of online abuse, as well as the expertise of those who have experience in managing them. Therefore, the government amendment seeks to name the Commissioner for Victims and Witnesses, the domestic abuse commissioner and the Children’s Commissioner as statutory consultees under Clause 36(6). Ofcom will be required to consult those commissioners when preparing or amending a code of practice.

Listing these commissioners as statutory consultees will guarantee that the voices of victims and those who are disproportionately affected by online abuse are represented when developing codes of practice. This includes, in particular, women and girls—following on from our debate on the previous group—as well as children and vulnerable adults. This will ensure that Ofcom’s codes propose specific and targeted measures, such as on illegal content and content that is harmful to children, that platforms can take to address abuse effectively. I therefore hope that noble Lords will accept it.

I will say a little about some of the other amendments in this group before noble Lords speak to them. I look forward to hearing how they introduce them.

I appreciate the intent of Amendment 220E, tabled by the noble Lord, Lord Clement-Jones, and my noble friend Lady Morgan of Cotes, to address the seriousness of the issue of child sexual exploitation and abuse online. This amendment would allow Ofcom to designate an expert body to tackle such content. Where appropriate and effective, Section 1(7) of the Communications Act 2003 and Part II of the Deregulation and Contracting Out Act 1994 provide a route for Ofcom to enter into co-regulatory arrangements under the online safety framework.

There are a number of organisations that could play a role in the future regulatory framework, given their significant experience and expertise on the complex and important issue of tackling online child sexual exploitation and abuse. This includes the Internet Watch Foundation, which plays a pivotal role in the detection and removal of child sexual abuse material and provides vital tools to support its members to detect this abhorrent content.

A key difference from the proposed amendment is that the existing route, following consultation with Ofcom, requires an order to be made by a Minister, under the Deregulation and Contracting Out Act 1994, before Ofcom can authorise a co-regulator to carry out regulatory functions. Allowing Ofcom to do this, without the need for secondary legislation, would allow Ofcom to bypass existing parliamentary scrutiny when contracting out its regulatory functions under the Bill. By contrast, the existing route requires a draft order to be laid before, and approved by, each House of Parliament.

The noble Lord, Lord Knight of Weymouth, tabled Amendment 226, which proposes a child user advocacy body. The Government are committed to the interests of child users being represented and protected, but we believe that this is already achieved through the Bill’s existing provisions. There is a wealth of experienced and committed representative groups who are engaged with the regulatory framework. As the regulator, Ofcom will also continue to consult widely with a range of interested parties to ensure that it understands the experience of, and risks affecting, children online. Further placing children’s experiences at the centre of the framework, the Government’s Amendment 98A would name the Children’s Commissioner as a statutory consultee for the codes of practice. The child user advocacy body proposed in the noble Lord’s Amendment 226 may duplicate the Children’s Commissioner’s existing functions, which would create uncertainty, undermining the effectiveness of the Children’s Commissioner’s Office. The Government are confident that the Children’s Commissioner will effectively use her statutory duties and powers to understand children’s experiences of the digital realm.

For the reasons that I have set out, I am confident that children’s voices will be placed at the heart of the regime, with their interests defended and advocated for by the regulator, the Children’s Commissioner, and through ongoing engagement with civil society groups.

Similarly, Amendment 256, tabled by the noble Baroness, Lady Bennett of Manor Castle, seeks to require that any Ofcom advisory committees established by direction from the Secretary of State under Clause 155 include at least two young people. Ofcom has considerable experience in setting up committees of this kind. While there is nothing that would preclude committee membership from including at least two young people, predetermining the composition of any committee would not give Ofcom the necessary space and independence to run a transparent process. We feel that candidates should be appointed based on relevant understanding and technical knowledge of the issue in question. Where a board is examining issues with specific relevance to the interests of children, we would expect the committee membership to reflect that appropriately.

I turn to the statement of strategic priorities. As I hope noble Lords will agree, future changes in technology will likely have an impact on the experience people have online, including the nature of online harms. As provided for by Clause 153, the statement of strategic priorities will allow the Secretary of State to set out a statement of the Government’s strategic priorities in relation to online safety. This ensures that the Government can respond to changes in the digital and regulatory landscape at a strategic level. A similar power exists for telecommunications, the management of the radio spectrum, and postal services.

Amendments 251 to 253 seek to place additional requirements on the preparation of a statement before it can be designated. I reassure noble Lords that the existing consultation and parliamentary approval requirements allow for an extensive process before a statement can be designated. These amendments would introduce unnecessary steps and would move beyond the existing precedent in the Communications Act when making such a statement for telecommunications, the management of the radio spectrum, and postal services.

Finally, Amendment 284, tabled by the noble Lord, Lord Stevenson of Balmacara, proposes changes to Clause 171 on Ofcom’s guidance on illegal content judgments. Ofcom is already required to consult persons it considers appropriate before producing or revising the guidance, which could include the groups named in the noble Lord’s amendment. This amendment would oblige Ofcom to run formal public consultations on the illegal content guidance at two different stages: first, at a formative stage in the drafting process, and then before publishing a final version. These consultations would have to be repeated before subsequently amending or updating the guidance in any way. This would impose duplicative, time-consuming requirements on the regulator to consult, which are excessive when looking at other comparable guidance. The proposed consultations under this amendment would ultimately delay the publication of this instrumental guidance.

I will listen to what noble Lords have to say when they speak to their amendments, but these are the reasons why, upon first reading, we are unpersuaded by them.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I thank the Minister for opening the group. This is a slightly novel procedure: he has rebutted our arguments before we have even had a chance to put them—what is new? I hope he has another speech lined up for the end which accepts some of the arguments we put, to demonstrate that he has listened to all the arguments made in the debate.

I will speak mainly to Amendments 220E and 226, ahead of the noble Baroness, Lady Kidron; I understand that the noble Baroness, Lady Merron, will be speaking at the end of the group to Amendment 226. I am very grateful to the noble Baroness, Lady Morgan, for signing Amendment 220E; I know she feels very strongly about this issue as well.

As the Minister said, this amendment is designed to confirm the IWF’s role as the recognised body for dealing with notice and take-down procedures for child sexual abuse imagery in the UK and to ensure that its long experience and expertise continues to be put to best use. In our view, any delay in establishing the roles and responsibilities of expert organisations such as the IWF in working with Ofcom under the new regulatory regime risks leaving a vacuum in which the risks to children from this hateful form of abuse will only increase. I heard what the Minister said about the parliamentary procedure, but that is a much slower procedure than a designation by Ofcom, so I think that is going to be one of the bones of contention between us.

The Internet Watch Foundation is a co-regulatory body with over 25 years of experience working with the internet industry, law enforcement and government to prevent the uploading of, and to disable public access to, known child sexual abuse, and to secure the removal of indecent images and videos of children from the internet. The organisation has had some considerable success over the last 25 years, despite the problem appearing to be getting worse globally.

In 2022, it succeeded in removing a record 255,000 web pages containing child sexual abuse. It has also amassed a database of more than 1.6 million unique hashes of child sexual abuse material, which has been provided to the internet industry to keep its platforms free from such material. In 2020, the Independent Inquiry into Child Sexual Abuse concluded that, in the UK, the IWF

“sits at the heart of the national response to combating the proliferation of indecent images of children. It is an organisation that deserves to be acknowledged publicly as a vital part of how, and why, comparatively little child sexual abuse material is hosted in the UK”.

--- Later in debate ---
All the amendments in the name of my noble friend Lord Stevenson would ensure that relevant voices were heard. There are repeated debates in your Lordships’ House about the need to consult and to get the right people around the table. All these amendments seek to do that, so I hope the Minister will take them in the spirit in which they are intended, which is to strengthen the arm of those who seek to protect children.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I am grateful to noble Lords who have spoken to their amendments. Regarding the lead amendment in the group, I take on board what was said about its inevitable pre-emption—something that I know all too well from when the boot is on the other foot in other groups. However, I have listened to the points that were made and will of course respond.

I join the tributes rightly paid by noble Lords to the Internet Watch Foundation. The Government value its work extremely highly and would support the use of its expertise and experience in helping to deliver the aims of the Bill. My noble friend Lady Morgan of Cotes is right to say that it is on the front line of this work and to remind us that it encounters some of the most horrific and abhorrent content in the darkest recesses of the internet—something that I know well from my time as an adviser at the Home Office, as well as in this capacity now. Both the Secretary of State for Science, Innovation and Technology and the Minister for Safeguarding at the Home Office recently provided a foreword to the foundation’s latest annual report.

Clearly, Ofcom will need a wide variety of relationships with a range of organisations. Ofcom has been in regular contact with the Internet Watch Foundation, recognising its significant role in supporting the objectives of online safety regulation, and is discussing a range of options to make the best use of its expertise. The noble Lord, Lord Clement-Jones, asked what consultation and discussion is being had. We support the continuation of that engagement and are in discussions with the Internet Watch Foundation ourselves to understand how it envisages its role in supporting the regulatory environment. No decisions have been made on the co-regulatory role that other organisations may play. The Government will work with Ofcom to understand where it may be effective and beneficial to delivering the regulatory framework. Careful assessment of the governance, independence and funding of any organisations would be needed if co-designation were to be considered, but officials from the Department for Science, Innovation and Technology and the Home Office are in discussion with the IWF in relation to a memorandum of understanding to support ongoing collaboration.

On the designation of regulatory functions, we are satisfied that the powers under the Communications Act and the Deregulation and Contracting Out Act are sufficient, should other bodies be required to deliver specific aspects of the regime, so we do not see a need to amend the Bill in the way the amendments in this group suggest. Those Acts require an order from the Minister in order to designate any functions. The Minister has to consult Ofcom before making the order, and that is the mechanism that was used to appoint the Advertising Standards Authority to regulate broadcast advertising. It remains appropriate for Parliament to scrutinise the delivery of these important regulatory functions; accordingly, such an order cannot be made unless a draft of the order has been laid before, and approved by a resolution of, each House of Parliament.

The noble Baroness, Lady Merron, dwelt on the decision not to include a child user advocacy body. As I said in my earlier remarks and in relation to other groups, the Bill ensures that children’s voices will be heard and that what they say will be acted on. Ofcom will have statutory duties requiring it to understand the opinions and experiences of users, including children, by consulting widely when developing its codes. Ofcom will also have the flexibility to establish other mechanisms for conducting research about users’ experience. Additionally, the super-complaints process, which we began discussing this afternoon, will make sure that entities, including those that represent the interests of children, will have their voices heard and will help Ofcom recognise and eliminate systemic failings.

We are also naming the Children’s Commissioner as a statutory consultee for Ofcom in developing its codes of practice. A further new child user advocacy body would encroach on the wider statutory functions of the Children’s Commissioner. Both bodies would have similar responsibilities and powers to represent the interests of child users of regulated services, to protect and promote the interests of child users of regulated services, and to be a statutory consultee for the drafting and amendment of Ofcom’s codes of practice.

The noble Baroness, Lady Kidron, when discussing the input of the Children’s Commissioner into the regulatory framework, suggested that it was a here and now issue. She is right: the Children’s Commissioner will represent children’s views to Ofcom in preparing the codes of practice to ensure that they are fully informing the regime, but the commissioner will also have a continuing role, as they will be the statutory consultee on any later amendments to the codes of practice relating to children. That will ensure that they can engage in the ongoing development of the regime and can continue to feed in insights on emerging risks identified through the commissioner’s statutory duty to understand children’s experiences.

The Bill further ensures that new harms and risks to children are proactively identified by requiring that Ofcom make arrangements to undertake research about users’ experiences on regulated services. This will build on the significant amount of research that Ofcom already does, better to understand children’s experience online, particularly their experiences of online harms.

The super-complaints process will enable an eligible entity to make a complaint to Ofcom regarding a provider or providers that cause significant harm or significant adverse impact on users, including children. This will help Ofcom to recognise and eliminate systemic failings, including those relating to children, and will ensure that children’s views and voices continue to inform the regime as it is developed.

The Bill will also require that Ofcom undertake consumer consultation in relation to regulated services. This will, in effect, expand the scope of the Communications Consumer Panel to online safety matters, and will ensure that the needs of users, including children, are at the heart of Ofcom’s regulatory approach.

I draw noble Lords’ attention to the provisions of Clause 141(2), which states that Ofcom must make arrangements to ascertain

“the experiences of United Kingdom users of regulated services”.

That, of course, includes children. I hope, therefore, that noble Lords will be satisfied that the voices of children are indeed being listened to throughout the operation of the Bill. However, we have high regard for the work of the Internet Watch Foundation. I hope that noble Lords will be willing not to press their amendments—after the noble Lord, Lord Clement-Jones, asks his question.

Lord Clement-Jones (LD)

My Lords, I am in the slightly strange position of not having moved the amendment, but I want to quickly respond. I was slightly encouraged by what the Minister said about Ofcom having been in regular contact with the IWF. I am not sure that that is mutual; maybe Ofcom thinks it is in good contact with the IWF, but I am not sure the IWF thinks it is in good contact with Ofcom. However, I am encouraged that the Minister at least thinks that that has been the case and that he is encouraging consultation and the continuation of engagement.

--- Later in debate ---
Lord Stevenson of Balmacara (Lab)

If I might follow up that comment, I agree entirely with what the noble Baroness has just said. It is very tricky for an independent charity to have the sort of relationship addressed in some of the language in this debate. Before the Minister completes his comments and sits down again, I ask him: if Ofcom were to negotiate a contracted set of duties with the IWF—indeed, with many other charities or others who are interested in assisting with this important work—could that be done directly by Ofcom, with powers that it already has? I think I am right to say that it would not require parliamentary approval. It is only if we are talking about co-regulation, which again raises other issues, that we would go through a process that requires what sounded like the affirmative procedure—the one that was used, for example, with the Advertising Standards Authority. Is that right?

Lord Parkinson of Whitley Bay (Con)

Yes, I think it is. I am happy to confirm that in writing. I am grateful to my noble friend Lady Stowell, who of course is a former chairman of the Charity Commission, for making the point about the charitable status of the foundation. I should clarify that officials from the Department for Science, Innovation and Technology and the Home Office are in touch with the IWF about its role.

Speedily moving on, Ofcom is in discussion with the foundation about a memorandum of understanding. I hope that reassures the noble Lord, Lord Clement-Jones, that they are in reciprocal contact. Obviously, I cannot pre-empt where their discussions on that MoU will take them, but that is a matter between Ofcom and the foundation. Governance, funding and questions of charitable status, as my noble friend raised, would have to be considered carefully if co-designation were contemplated.

Amendment 98A agreed.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Lord Clement-Jones (LD)

My Lords, it is a pleasure to follow the noble Lord, Lord Bethell, who is clearly passionate about this aspect. As the noble Baroness, Lady Harding, said, this is one of the most important groups of amendments that we have to debate on the Bill, even though we are on day eight of Committee. As she said, it is about the right assignment of responsibilities, so it is fundamental to the way that the Bill will operate.

My noble friend Lord Allan brilliantly summed up many of the arguments, and he has graphically described the problem of ministerial overreach, as did the noble Baroness, Lady Harding. We on these Benches strongly support the amendments put forward by the noble Lord, Lord Stevenson, and those put forward by the noble Baroness, Lady Stowell. Obviously, there is some difference of emphasis. They each follow the trail of the different committees of which their proposers were members, which is entirely understandable. I recall that the noble Lord, Lord Gilbert, was the hinge between the two committees—and brilliantly he did that. I very much hope that, when we come back at the next stage, if the Minister has not moved very far, we will find a way to combine those two strands. I think they are extremely close—many noble Lords have set out where we are on accountability and oversight.

Strangely, we are not trying to get out of the frying pan of an overbearing Secretary of State only to move to a position where we have no parliamentary oversight at all. Both the noble Baroness, Lady Stowell, and the noble Lord, Lord Stevenson, are clearly in favour of greater oversight of Ofcom. The question is whether it is oversight of the codes and regulation or of Ofcom itself. I think we can find a way to combine those two strands. In that respect, I entirely agree with the noble Baroness, Lady Fox: it is all about making sure that we have the right kind of oversight.

I add my thanks to Carnegie UK. The noble Lord, Lord Stevenson, and the noble Baroness, Lady Stowell, set out the arguments, and we have the benefit of the noble Baroness’s letter to the Secretary of State of 30 January, which she mentioned in her speech. They have set out very clearly where speakers in this debate unanimously want to go.

The Government have suggested some compromise on Clause 39. As the noble Lord, Lord Stevenson, said, we have not seen any wording for that, but I think it is highly unlikely that that, by itself, will satisfy the House when we come to Report.

There are many amendments here which deal with the Secretary of State’s powers, but I believe that the key ones are the product of both committees and concern the proposed Joint Committee. If noble Lords read the Government’s response to our Joint Committee on the draft Bill, they will see that the arguments given by the Government are extremely weak. I think it was the noble Baroness, Lady Stowell, who used the phrase “democratic deficit”. That is exactly what we are seeking to avoid: we are trying to open this out and make sure we have better oversight and accountability. That is the goal of the amendments today. We have heard from the noble Viscount, Lord Colville, about the power of lobbying by companies. Equally, we have heard about how the Secretary of State can be overbearing. That is the risk we are trying to avoid. I very much hope that the Minister sees his way to taking on board at least some of whichever set of amendments he prefers.

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, the amendments concern the independence of Ofcom and the role of parliamentary scrutiny. They are therefore indeed an important group, as those things will be vital to the success of the regime that the Bill sets up. Introducing a new, ground-breaking regime means balancing the need for regulatory independence with a transparent system of checks and balances. The Bill therefore gives powers to the Secretary of State comprising a power to direct Ofcom to modify a code of practice, a power to issue a statement of strategic priorities and a power to issue non-binding guidance to the regulator.

These powers are important but not novel; they have precedent in the Communications Act 2003, which allows the Secretary of State to direct Ofcom in respect of its network and spectrum functions, and the Housing and Regeneration Act 2008, which allows the Secretary of State to make directions to the Regulator of Social Housing to amend its standards. At the same time, I agree that it is important that we have proportionate safeguards in place for the use of these powers, and I am very happy to continue to have discussions with noble Lords to make sure that we do.

Amendment 110, from the noble Lord, Lord Stevenson, seeks to introduce a lengthier process regarding parliamentary approval of codes of practice, requiring a number of additional steps before they are laid in Parliament. It proposes that each code may not come into force unless accompanied by an impact assessment covering a range of factors. Let me reassure noble Lords that Ofcom is already required to consider these factors; it is bound by the public sector equality duty under the Equality Act 2010 and the Human Rights Act 1998 and must ensure that the regime and the codes of practice are compliant with rights under the European Convention on Human Rights. It must also consult experts on matters of equality and human rights when producing its codes.

Amendment 110 also proposes that any designated Select Committee in either House has to report on each code and impact assessment before they can be made. Under the existing process, all codes must already undergo scrutiny by both Houses before coming into effect. The amendment would also introduce a new role for the devolved Administrations. Let me reassure noble Lords that the Government are working closely with them already and will continue to do so over the coming months. As set out in Schedule 5 to the Scotland Act 1998, however, telecommunications and thereby internet law and regulation is a reserved policy area, so input from the devolved Administrations may be more appropriately sought through other means.

Amendments 111, 113, 114, 115, and 117 to 120 seek to restrict or remove the ability of the Secretary of State to issue directions to Ofcom to modify draft codes of practice. Ofcom has great expertise as a regulator, as noble Lords noted in this debate, but there may be situations where a topic outside its remit needs to be reflected in a code of practice. In those situations, it is right for the Government to be able to direct Ofcom to modify a draft code. This could, for example, be to ensure that a code reflects advice from the security services, to which Ofcom does not have access. Indeed, it is particularly important that the Secretary of State be able to direct Ofcom on matters of national security and public safety, where the Government will have access to information which Ofcom will not.

I have, however, heard the concerns raised by many in your Lordships’ House, both today and on previous occasions, that these powers could allow for too much executive control. I can assure your Lordships that His Majesty’s Government are committed to protecting the regulatory independence of Ofcom, which is vital to the success of the framework. With this in mind, we have built a number of safeguards into the use of the powers, to ensure that they do not impinge on regulatory independence and are used only in limited circumstances and for the appropriate reasons.

I have heard the strong feelings expressed that this power must not unduly restrict regulatory independence, and indeed share that feeling. In July, as noble Lords noted, the Government announced our intention to make substantive changes to the power; these changes will make it clear that the power is for use only in exceptional circumstances and will replace the “public policy” wording in Clause 39 with a defined list of reasons for which a direction can be made. I am happy to reiterate that commitment today, and to say that we will be making these changes on Report when, as the noble Lord, Lord Clement-Jones, rightly said, noble Lords will be able to see the wording and interrogate it properly.

Additionally, in light of the debate we have just had today—

Baroness Harding of Winscombe (Con)

Can my noble friend the Minister clarify what he has just said? When he appeared in front of the Communications and Digital Committee, I think he might have been road-testing some of that language. In the specific words used, he would still have allowed the Secretary of State to direct Ofcom for economic reasons. Is that likely to remain the case? If it is, I feel it will not actually meet what I have heard is the will of the Committee.

Lord Parkinson of Whitley Bay (Con)

When we publish the wording, we will rightly have an opportunity to discuss it before the debate on Report. I will be happy to discuss it with noble Lords then. On the broader points about economic policy, that is a competency of His Majesty’s Government, not an area of focus for Ofcom. If the Government had access to additional information that led them to believe that a code of practice as drafted could have a significant, disproportionate and adverse effect on the livelihoods of the British people or on the broader economy, and if it met the test for exceptional circumstances, taking action via a direction from the Secretary of State could be warranted. I will happily discuss that when my noble friend and others see the wording of the changes we will bring on Report. I am sure we will scrutinise that properly, as we should.

I was about to say that, in addition to the commitment we have already made, in the light of the debate today we will also consider whether transparency about the use of this power could be increased further, while retaining the important need for government oversight of issues that are genuinely beyond Ofcom’s remit. I am conscious that, as my noble friend Lady Stowell politely said, I did not convince her or your Lordships’ committee when I appeared before it with my honourable friend Paul Scully. I am happy to continue our discussions and I hope that we may reach some understanding on this important area.

Baroness Stowell of Beeston (Con)

I am sorry to interrupt, but may I clarify what my noble friend just said? I think he said that, although he is open to increasing the transparency of the procedure, he does not concede a change—from direction to a letter about guidance which Ofcom should take account of. Is he willing to consider that as well?

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I am happy to continue to discuss it, and I will say a bit more about the other amendments in this group, but I am not able to say much more at this point. I will happily follow this up in discussion with my noble friend, as I know it is an issue of interest to her and other members of your Lordships’ committee.

The noble Lord, Lord Stevenson, asked about our international obligations. As noble Lords noted, the Government have recognised the importance of regulatory independence in our work with international partners, such as the Council of Europe’s declaration on the independence of regulators. That is why we are bringing forward the amendments previously announced in another place. Ensuring that powers of direction can be issued only in exceptional circumstances and for a set of reasons defined in the Bill will ensure that the operational independence of Ofcom is not put at risk. That said, we must strike a balance between parliamentary oversight and being able to act quickly where necessary.

My noble friend Lady Stowell’s amendment calls for all codes which have been altered by a direction to go through the affirmative procedure. As the Bill is drafted, the negative procedure is used only if a direction is made to a code of practice relating to terrorism or to child sexual exploitation and abuse, for reasons of national security or public safety. It is important that the parliamentary process be proportionate, particularly in cases involving national security or public safety, where a code might need to be amended quickly to protect people from harm. We therefore think that, in these cases, the negative procedure is more appropriate.

On timing, the Government are committed to ensuring that the framework is implemented quickly, and this includes ensuring that the codes of practice are in force. The threshold of exceptional circumstances for the power to direct can lead to a delay only in situations where there would otherwise be significant consequences for national security or public safety, or for the other reasons outlined today.

My noble friend Lord Moylan was not able to be here for the beginning of the debate on this group, but he is here now. Let me say a little about his Amendment 254. Under Clause 153, the Secretary of State can set out a statement of the Government’s strategic priorities in relation to matters of online safety. This power is necessary, as future technological changes are likely to shape online harms, and the Government must be able to state their strategic priorities in relation to them. My noble friend’s amendment would go beyond the existing precedent for the statement of strategic priorities in relation to telecommunications, management of the radio spectrum, and postal services outlined in the Communications Act. The Secretary of State must consult Ofcom and other appropriate persons when preparing this statement. This provides the opportunity for widespread scrutiny of a draft statement before it can be designated through a negative parliamentary procedure. We consider that the negative procedure is appropriate, in line with comparable existing arrangements.

Amendment 257 from the noble Lord, Lord Stevenson, seeks to remove the Secretary of State’s power to issue guidance to Ofcom about the exercise of its online safety functions. Issuing guidance of this kind, with appropriate safeguards, including consultation and limitations on its frequency, is an important part of future-proofing the regime. New information—for example, resulting from parliamentary scrutiny or technological developments—may require the Government to clarify the intent of the legislation.

Amendments 258 to 260 would require the guidance to be subject to the affirmative procedure in Parliament. Currently, Ofcom must be consulted, and any guidance must be laid before Parliament. The Bill does not subject the guidance to a parliamentary procedure because the guidance does not create any statutory requirements, and Ofcom is required only to have had regard to it. We think that remains the right approach.

The noble Lord, Lord Stevenson, has made clear his intention to question Clause 156, which grants the Secretary of State the power to direct Ofcom’s media literacy activity only in special circumstances. This ensures that the regulatory framework is equipped to respond to significant future threats—for example, to the health or safety of the public, or to national security. I have already set out, in relation to other amendments, why we think it is right that the Secretary of State can direct Ofcom in these circumstances.

The delegated powers in the Bill are crucial to ensuring that the regulatory regime keeps pace with changes in this area. Amendment 290 from the noble Lord, Lord Stevenson, would go beyond the existing legislative process for these powers, by potentially providing for additional committees to be, in effect, inserted into the secondary legislative process. Established committees themselves are able to decide whether to scrutinise parts of a regime in more detail, so I do not think they need a Parkinson rule to do that.

Noble Lords have expressed a common desire to see this legislation implemented as swiftly as possible, so I hope they share our wariness of any amendments which could slow that process down. The process as envisaged in this amendment is an open-ended one, which could delay implementation. Of course, however, it is important that Parliament is able to scrutinise the work of the regulator. Like most other regulators, Ofcom is accountable to Parliament on how it exercises its functions. The Secretary of State is required to present its annual report and accounts before both Houses. Ministers from Scotland, Wales and Northern Ireland must also lay a copy of the report before their respective Parliament or Assembly. Moreover, the officers of Ofcom can be required to appear before Select Committees to answer questions about its operations on an annual basis. Parliament will also have a role in approving a number of aspects of the regulatory framework through its scrutiny of both the primary and secondary legislation. This will include the priority categories for harms and Ofcom’s codes of practice.

More broadly, we want to ensure that this ground-breaking legislation has the impact we intend. Ongoing parliamentary scrutiny of it will be crucial to help to ensure that. There is so much expertise in both Houses, and it has already helped to improve this legislation, through the Joint Committee on the draft Bill, the DCMS Select Committee in another place and, of course, your Lordships’ Communications and Digital Committee.

As my noble friend Lady Stowell said, we must guard against fragmentation and duplication, which we are very mindful of. Although we do not intend to legislate for a new committee—as I set out on previous occasions, including at Second Reading and before the Communications and Digital Committee—we remain happy to discuss possible mechanisms for oversight to ensure that we make best use of the expertise in both Houses of Parliament so that the Bill delivers what we want. With that, I hope that Members of the Committee will be happy to continue the discussions in this area and not press their amendments.

Lord Stevenson of Balmacara (Lab)

I am grateful to the noble Lord for his comprehensive response and for the welcome change in tone and the openness to further debate and discussions. I thank all those who spoke in the debate. The noble Baroness, Lady Harding, was right: we are getting into a routine where we know roughly where our places are and, if we have contributions to make, we make them in the right order and make them comprehensive. We did our bit quite well, but I am afraid that the Minister’s response made me a bit confused. As I said, I welcome the change of tone, the sense of engagement with some of the issues and the ability to meet to discuss ways forward in some of those areas. But he then systematically and rather depressingly shut off just about everything that I thought we were going to discuss. I may be overstating that, so I will read Hansard carefully to make sure that there are still chinks of light in his hitherto impenetrable armour. I really must stop using these metaphors— I thought that the noble Baroness, Lady Harding, had managed to get me off the hook with her question about whether we were an island of concrete rock, and about whether the boat was going to end up in the stormy sea that we were creating. I decided that I could not follow that, so I will not.

We ought to take forward and address three things, which I will briefly go through in the response. One that we did not nail down was the good point made by the noble Baroness, Lady Kidron, that we had focused on regulatory structures in the form of set bodies relating—or not relating—to parliamentary procedures and to Ministers and their operations. She pointed out that, actually, the whole system has a possible drag effect that we also need to think about. I note that good point because we probably need a bit of time to think about how that would work in the structures that come forward.

The noble Lord, Lord Allan, said that we are trying to look at the changing of the accountability model. I disagree with the word “changing” because we are not trying to change anything; we have a model that works, but the new factor that we are trying to accommodate is the intensity of interaction and, as we said, the amplification that comes from the internet. I worry that this was not being picked up enough in the Minister’s response, but we will pick it up later and see if we can get through it.

The three points I wanted to make sure of were as follows. Following the line taken by the noble Baroness, Lady Stowell, one point is on trying to find a proper balance between the independence of the regulator; the Secretary of State’s right, as an elected leader of this aspect of the Government, to make recommendations and proposals to that regulator on how the system can be better; and Parliament’s ability to find a place in that structure, which is still eluding us a little, so we will need to spend more time on it. There is enough there to be reassured that we will find a way of balancing the independence of the regulator and the role of the Secretary of State. It does not need as many mentions in the legislation as it currently has. There is clearly a need for the Secretary of State to be able to issue direction in cases of national security et cetera—but it is the “et cetera” that I worry about: what are these instances? Until they are nailed down and in the Bill, there has to be a question about that.

--- Later in debate ---
Moved by
122A: Clause 47, page 46, line 10, after “29” insert “, except the duty set out in subsection (8A) of those sections”
Member’s explanatory statement
This amendment ensures that OFCOM need not produce guidance about the new duties in clauses 19 and 29 to supply records of risk assessments to OFCOM.
--- Later in debate ---
Thirdly, I worry about some things that have crept into the debate on the proportionality issue. If “a small number” means that we will somehow let a few children see something, that will not be acceptable. Everybody has said this. Let us be clear about it: this is either 100% or it is not worth doing. If so, the question of whether we do it is not about finding the right form of words, such as “beyond reasonable doubt”; it is about certainty.
Lord Parkinson of Whitley Bay (Con)

As the noble Baroness, Lady Kidron, set out at the beginning of this debate, the amendments in this group have involved extensive discussions among Members in both Houses of Parliament, who sit on all sides of both Houses. I am very grateful for the way noble Lords and Members in another place have done that. They have had those preliminary discussions so that our discussions in the debate today and in preparation for it could be focused and detailed. I pay particular tribute to the noble Baroness, Lady Kidron, and my noble friends Lord Bethell and Lady Harding, who have been involved in extensive discussions with others and then with us in government. These have been very helpful indeed; they continue, and I am happy to commit to their continuing.

Age-assurance technologies will play an important role in supporting the child safety duties in this Bill. This is why reference is made to them on the face of the Bill—to make it clear that the Government expect these measures to be used for complying with the duties to protect children from harmful content and activity online. Guidance under Clause 48 will already cover pornographic content. While this is not currently set out in the legislation, the Government intend, as noble Lords know, to designate pornographic content as a category of primary priority content which is harmful to children. As I set out to your Lordships’ House during our debate on harms to children, we will amend the Bill on Report to list the categories of primary and primary priority content on the face of the Bill.

I am very grateful to noble Lords for the engagement we have had on some of the points raised in Amendments 142 and 306 in recent weeks. As we have been saying in those discussions, the Government are confident that the Bill already largely achieves the outcomes sought here, either through existing provisions in it or through duties in other legislation, including data protection legislation, the Human Rights Act 1998 and the Equality Act 2010. That is why we think that re-stating duties on providers which are already set out in the Bill, or repeating duties set out in other legislation, risks causing uncertainty, and why we need to be careful about imposing specific timelines on Ofcom by which it must produce age-assurance guidance. It is essential that we protect Ofcom’s ability robustly to fulfil its consultation duties for the codes of practice. If Ofcom is given insufficient time to fulfil these duties, the risk of legal challenge being successful is increased.

I welcome Ofcom’s recent letter to your Lordships, outlining its implementation road map, which I hope provides some reassurance directly from the regulator on this point. Ofcom will prioritise protecting children from pornography and other harmful content. It intends to publish, this autumn, draft guidance for Part 5 pornography duties and draft codes of practice for Part 3 illegal content duties, including for child sexual exploitation and abuse content. Draft codes of practice for children’s safety duties will follow next summer. These elements of the regime are being prioritised ahead of others, such as the category 1 duties, to reflect the critical importance of protecting children.

Although we believe that the Bill already largely achieves the outcomes sought, we acknowledge the importance of ensuring that there are clear principles for Ofcom to apply when recommending or requiring the use of age-assurance technologies. I am happy to reassure noble Lords that the Government will continue to consider this further and are happy to continue our engagement on this issue, although any amendment must be made in a way that sits alongside existing legislation and within the framework of the Bill.

I turn to Amendments 161 and 183. First, I will take the opportunity to address some confusion about the requirements in Parts 3 and 5 of the Bill. The Bill ensures that companies must prevent children accessing online pornography, regardless of whether it is regulated in Part 3 or Part 5. The Government are absolutely clear on this point; anything less would be unacceptable. The most effective approach to achieving this is to focus on the outcome of preventing children accessing harmful content, which is what the Bill does. If providers do not prevent children accessing harmful content, Ofcom will be able to bring enforcement action against them.

I will address the point raised by my noble friend Lord Bethell about introducing a standard of “beyond reasonable doubt” for age verification for pornography. As my noble friend knows, we think this a legally unsuitable test which would require Ofcom to determine the state of mind of the provider, which would be extremely hard to prove and would therefore risk allowing providers to evade their duties. A clear, objective duty is the best way to ensure that Ofcom can enforce compliance effectively. The Bill sets clear outcomes which Ofcom will be able to take action on if these are not achieved by providers. A provider will be compliant only if it puts in place systems and processes which meet the objective requirements of the child safety duties.

The provisions in the Bill on proportionality are important to ensure that the requirements in the child safety duties are tailored to the size and capacity of providers. Smaller providers or providers with less capacity are still required to meet the child safety duties where their services pose a risk to children. They will need to put in place sufficiently stringent systems and processes that reflect the level of risk on their services and will need to make sure these systems and processes achieve the required outcomes of the child safety duties.

The Government expect companies to use age-verification technologies to prevent children accessing services which pose the highest risk of harm to children, such as online pornography. However, companies may use another approach if it is proportionate to the findings of the child safety risk assessment and a provider’s size and capacity. This is an important element to ensure that the regulatory framework remains risk-based and proportionate.

Age verification may not always be the most appropriate or effective approach for user-to-user companies to comply with their duties. For example, if a user-to-user service such as a social medium does not allow—

Lord Stevenson of Balmacara (Lab)

I am sorry to interrupt. The Minister said that he would bear in mind proportionality in relation to size and capacity. Is that not exactly the point that the noble Baroness, Lady Harding, was trying to make? In relation to children, why will that be proportionate? A single child being damaged in this way is too much.

Lord Parkinson of Whitley Bay (Con)

The issue was in relation to a provider’s size and capacity; it is a matter of making sure that the approach is effective, enforceable and proportionate to the size of the service in question. Age verification may also not be the most effective approach for companies to follow in complying with their duties. If a user-to-user service, such as a social media platform, says that it does not allow pornography under its terms of service, measures such as content moderation and user reporting might be more appropriate and effective for protecting children than age verification in those settings. That would allow content to be better detected and taken down, while—

Lord Stevenson of Balmacara (Lab)

I understand that, but it is an important point to try to get on the record. It is an outcome-based solution that we are looking for, is it not? We are looking for zero activity where risks to children are there. Clearly, if the risk assessment is that there is no risk that children can be on that site, age verification may not be required— I am extending it to make a point—but, if there is a risk, we need to know that the outcome of that process will be zero. That is my point, and I think we should reflect on that.

Lord Parkinson of Whitley Bay (Con)

I am very happy to, and the noble Lord is right that we must be focused on the outcomes here. I am very sympathetic to the desire to make sure that providers are held to the highest standards, to keep children protected from harmful content online.

Lord Bethell (Con)

I know the Minister said that outcomes are detailed in the Bill already; I wonder whether he could just write to us and describe where in the Bill those outcomes are outlined.

Lord Parkinson of Whitley Bay (Con)

I shall happily do that, and will happily continue discussions with my noble friend and others on this point and on the appropriate alternative to the language we have discussed.

On the matter of Ofcom independently auditing age-assurance technologies, which my noble friend also raised, the regulator already has the power to require a company to undertake and pay for a report from a skilled person about a regulated service. This will assist Ofcom in identifying and assessing non-compliance, and will develop its understanding of the risk of failure to comply. We believe that this is therefore already provided for.

I reassure noble Lords that the existing definition of pornographic content in the Bill already captures the same content that Amendment 183ZA, in the name of the noble Baroness, Lady Ritchie of Downpatrick, intends to capture. The definition in the Bill shares the key element of the approach Ofcom is taking for pornography on UK-established video-sharing platforms. This means that the industry will be familiar with this definition and that Ofcom will have experience in regulating content which meets it.

The definition is also aligned with that used in existing legislation. I take on board the point she made about her trawl of the statute book for it, but the definition is aligned elsewhere in statute, such as in the Coroners and Justice Act 2009. This means that, in interpreting the existing definition in the Bill, the courts may be able to draw on precedent from the criminal context, giving greater certainty about its meaning. The definition of pornography in Part 5 is also consistent with the British Board of Film Classification’s guidelines for the definition of sex works, which is

“works whose primary purpose is sexual arousal or stimulation”

and the BBFC’s definition of R18. We therefore think it is not necessary to refer to BBFC standards in this legislation. Including the definition in the Bill also retains Parliament’s control of the definition, and therefore also which content is subject to the duties in Part 5. That is why we believe that the definition as outlined in the Bill is more straightforward for both service providers and Ofcom to apply.

I turn to Amendments 184 and 185. The Government share the concerns raised in today’s debate about the wider regulation of online pornography. It is important to be clear that extreme pornography, so-called revenge pornography and child sexual exploitation and abuse are already illegal and are listed as priority offences in the Bill. This means that under the illegal content duties, Part 3 providers, which will include some of the most popular commercial pornography services, must take proactive, preventive measures to limit people’s exposure to this criminal content and behaviour.

--- Later in debate ---
Lord Bethell (Con)

Does my noble friend the Minister recognise that those laws have been in place for the 30 years of the internet but have not successfully been used to protect the rights of those who find their images wrongly used, particularly those children who have found their images wrongly used in pornographic sites? Does he have any reflections on how that performance could be improved?

Lord Parkinson of Whitley Bay (Con)

I would want to take advice and see some statistics, but I am happy to do that and to respond to my noble friend’s point. I was about to say that my noble friend Lady Jenkin of Kennington asked a number of questions, but she is not here for me to answer them.

I turn to Amendment 232 tabled by the noble Lord, Lord Allan of Hallam. Because of the rapid development of age-assurance technologies, it is right that they should be carefully assessed to ensure that they are used effectively to achieve the outcomes required. I am therefore sympathetic to the spirit of his amendment, but must say that Ofcom will undertake ongoing research into the effectiveness of age-assurance technologies for its various codes and guidance, which will be published. Moreover, when preparing or updating the codes of practice, including those that refer to age-assurance technologies, Ofcom is required by the Bill to consult a broad range of people and organisations. Parliament will also have the opportunity to scrutinise the codes before they come into effect, including any recommendations regarding age assurance. We do not think, therefore, that a requirement for Ofcom to produce a separate report into age-assurance technologies is a necessary extra burden to impose on the regulator.

In relation to this and all the amendments in this group, as I say, I am happy to carry on the discussions that we have been having with a number of noble Lords, recognising that they speak for a large number of people in your Lordships’ House and beyond. I reiterate my thanks, and the Government’s thanks, to them for the way in which they have been going about that. With that, I encourage them not to press their amendments.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Lord Knight of Weymouth (Lab)

My Lords, I regret that my noble friend Lord Lipsey is unable to be here. I wish him and the noble Lord, Lord McNally, well. I also regret that my noble friend Lord Stevenson is not here to wind up this debate and introduce his Amendment 127. Our inability to future-proof these proceedings means that, rather than talking to the next group, I am talking to this one.

I want to make four principal points. First, the principle of press freedom, as discussed by the noble Lords, Lord Black and Lord Faulks, in particular, is an important one. We do not think that this is the right Bill to reopen those issues. We look forward to the media Bill as the opportunity to discuss these things more fully across the House.

Secondly, I have some concerns about the news publisher exemption. In essence, as the noble Lord, Lord Clement-Jones, set out, as long as you have a standards code, a complaints process, a UK address and a team of contributors, the exemption applies. That feels a bit loose to me, and it opens up the regime to some abuse. I hear what the noble Baronesses, Lady Gohir and Lady Grey-Thompson, said about how we already see pretty dodgy outfits allowing racist and abusive content to proliferate. I look forward to the Minister’s comments on whether the bar we have at the moment is too low and whether there is some reflection to be done on that.

The third point is on my noble friend Lord Stevenson’s Amendment 127, which essentially says that we should set a threshold around whether complaints are dealt with in a timely manner. In laying that amendment, my noble friend essentially wanted to probe. The noble Lord, Lord Faulks, is here, so this is a good chance to have him listen to me say that we think that complaints should be dealt with more swiftly and that the organisation that he chairs could do better at dealing with that.

My fourth comment is about comments, particularly after listening to the speech of the noble Baroness, Lady Grey-Thompson, about some of the hateful comment that is hidden away inside the comments that news publishers carry. I was very much struck by what she said in respect of some of the systems of virality that are now being adopted by those platforms. There, I think Amendment 227 is tempting. I heard what the noble Baroness, Lady Stowell, said, and I think I agree that this is better addressed by Parliament.

For me, that just reinforces the need for this Bill, more than any other that I have ever worked on in this place, to have post-legislative scrutiny by Parliament so that we, as a Parliament, can review whether the regime we are setting up is running appropriately. It is such a novel regime, in particular around regulating algorithms and artificial intelligence. It would be an opportunity to see whether, in this case, the systems of virality were creating an amplification of harm away from the editorial function that the news publishers are able to exercise over the comments.

On that basis, and given the hour, I am happy to listen with care to the wise words of the Minister.

The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I join noble Lords who have sent their best wishes to the noble Lords, Lord Lipsey and Lord McNally.

His Majesty’s Government are committed to defending the invaluable role of a free media. We are clear that our online safety legislation must protect the vital role of the press in providing people with reliable and accurate information.

We have included strong protections for news publishers’ and journalistic content in the Bill, which extends to the exemption from the Bill’s safety duties for users’ comments and reviews on news publishers’ sites. This reflects a wider exemption for comments and reviews on provider content more generally. For example, reviews of products on retailers’ sites are also exempt from regulation. This is designed to avoid disproportionate regulatory burden on low-risk services.

Amendment 124 intends to modify that exemption, so that the largest news websites no longer benefit and are subject to the Bill’s regulatory regime. Below-the-line comments are crucial for enabling reader engagement with the news and encouraging public debate, as well as for the sustainability—and, as the noble Baroness, Lady Fox, put it, the accountability—of the news media. We do not consider it proportionate, necessary or compatible with our commitment to press freedom to subject these comment sections to oversight by Ofcom.

We recognise that there can sometimes be unpleasant or abusive below-the-line comments. We have carefully considered the risks of this exemption against the need to protect freedom of speech and media freedoms on matters of public interest. Although comment functions will not be subject to online regulation, I reassure the Members of the Committee who raised concerns about some of the comments which have attracted particular attention that sites hosting such comments can, in some circumstances, be held liable for any illegal content appearing on them, where they have actual knowledge of the content in question and fail to remove it expeditiously.

The strong protections for recognised news publishers in the Bill include exempting their content from the Bill’s safety duties, requiring category 1 platforms to notify recognised news publishers and to offer a right of appeal before removing or moderating any of their content. Clause 50 stipulates the clear criteria that publishers will have to meet to be considered a “recognised news publisher” and to benefit from those protections. When drafting these criteria, the Government have been careful to ensure that established news publishers are captured, while limiting the opportunity for bad actors to qualify.

Amendment 126 seeks to restrict the criteria for recognised news publishers in the Bill, so that only members of an approved regulator within the meaning of Section 42 of the Crime and Courts Act 2013 benefit from the protections offered by the Bill. This would create strong incentives for publishers to join specific press regulators. We do not consider this to be compatible with our commitment to a free press. We will repeal existing legislation that could have that effect, specifically Section 40 of the Crime and Courts Act 2013, through the media Bill, as noble Lords have noted, which has recently been published. Without wanting to make a rod for my own back when we come to that Bill, I agree with my noble friend Lord Black of Brentwood that it would be the opportunity to have this debate, if your Lordships so wished.

The current effect of this amendment would be to force all news publishers to join a single press regulator—namely Impress, the only UK regulator which has sought approval by the Press Recognition Panel—if they were to benefit from the exclusion for recognised news publishers. Requiring a publisher to join specific regulators is, in the view of His Majesty’s Government, not only incompatible with protecting press freedom in the UK but unnecessary given the range of detailed criteria which a publisher must meet to qualify for the additional protections, as set out in Clause 50 of the Bill.

As part of our commitment to media freedom, we are committed to independent self-regulation of the press. As I have indicated, Clause 50 stipulates the clear criteria which publishers will have to meet to be considered a “recognised news publisher” and to benefit from the protections in the Bill. One of those criteria is for entities to have policies and procedures for handling and resolving complaints. Amendment 127 from the noble Lord, Lord Stevenson, adds a requirement that these policies and procedures must cover handling and resolving complaints “in a timely manner”. To include such a requirement will place the responsibility on Ofcom to decide what constitutes “timely”, and, in effect, put it in the position of press regulator. That is not something that we would like. We believe that the criteria set out in Clause 50 are already strong, and we have taken significant care to ensure that established news publishers are captured, while limiting the opportunity for bad actors to benefit.

I turn now to Amendment 227. We recognise that, as legislation comes into force, it will be necessary to ensure that the protections we have put in place for journalistic and news publisher content are effective. We need to ensure that the regulatory framework does not hinder access to such content, particularly in the light of the fact that, in the past, news content has sometimes been removed or made less visible by social media moderators or algorithms for unclear reasons, often at the height of news cycles. That is why we have required Ofcom to produce a specific report, under Clause 144, assessing the impact of the Bill on the availability and treatment of news publisher and journalistic content on category 1 services.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

Before the Minister closes his folder and sits down, perhaps I could say that I listened carefully and would just like him to reflect a little more for us on my question of whether the bar is set too low and there is too much wriggle room in the exemption around news publishers. A tighter definition might be something that would benefit the Bill and the improvement of the Bill when we come back to it on Report.

Lord Parkinson of Whitley Bay (Con)

Looking at the length of Clause 50—and I note that the noble Lord, Lord Allan of Hallam, made much the same point in his speech—I think the definitions set out in Clause 50 are extensive. Clause 50(1) sets out a number of recognised news publishers, obviously including

“the British Broadcasting Corporation, Sianel Pedwar Cymru”—

self-evidently, as well as

“the holder of a licence under the Broadcasting Act 1990 or 1996”

or

“any other entity which … meets all of the conditions in subsection (2), and … is not an excluded entity”

as set out in subsection (3). Subsection (2) sets out a number of specific criteria which I think capture the recognised news publishers we want to see.

Noble Lords will be aware of the further provisions we have brought forward to make sure that entities that are subject to a sanction are not able to qualify, such as—

Lord Allan of Hallam (LD)

I think it is actually quite important that there is—to use the language of the Bill—a risk assessment around the notion that people might game it. I thought the noble Baroness, Lady Gohir, made a very good point. People are very inventive and, if you have ever engaged with the people who run some of those big US misinformation sites—let us just call them that—you will know that they have very inventive, very clever people. They will be looking at this legislation and if they figure out that by opening a UK office and ticking all the boxes they will now get some sorts of privileges in terms of distributing their misinformation around the world, they will do it. They will try it, so I certainly think it is worth there being at least some kind of risk assessment against that happening.

In two years’ time we will be able to see whether the bad thing happened, but whether or not that takes the form of the Minister having a conversation with Ofcom now, I just think that forewarned is forearmed. We know that that is a possibility, and it would be helpful for some work to be done now to make sure that we do not leave open a loophole that none of us wants.

Lord Parkinson of Whitley Bay (Con)

I am mindful of the examples the noble Lord gave in his speech. Looking at some of the provisions set out in subsection (2) about a body being

“subject to a standards code”

or having

“policies and procedures for handling and resolving complaints”,

I think on first response that those examples he gave would be covered. But I will certainly take on board the comments he made and those the noble Baroness, Lady Gohir, made as well and reflect on them. I hope—

Baroness Fox of Buckley (Non-Afl)

On a final point of clarification, in contrast, I think the exemption may be too narrow, not too broad. With the emergence of blogs and different kinds of news organisations—I think the noble Lord, Lord Allan, described well the complexity of what we have—and some of the grimmer, grosser examples of people who might play the system, does the Minister acknowledge that that might be dealt with by the kind of exclusions that have been applied to RT? When somebody is really an extremist representative of, I do not know, ISIS, pretending to be a media organisation, the sensible thing to do would be to exclude them, rather than to overtighten the exemptions, so that new, burgeoning, widely read online publications can still have press freedom protection.

Lord Parkinson of Whitley Bay (Con)

I will certainly take on board the points the noble Baroness raises. Hearing representations in both directions on the point would, on first consideration, reassure me that we have it right, but I will certainly take on board the points which the noble Baroness, the noble Lord and others have raised in our debate on this. As the noble Lord, Lord Allan, suggests, I will take the opportunity to discuss it with Ofcom, as we will do on many of the issues which we are discussing in this Committee, to make sure that its views are taken on board before we return to these and other issues on Report.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, this has been a grim but important debate to open the Committee’s proceedings today. As my noble friend Lady Harding of Winscombe and others have set out, some of the issues and materials about which we are talking are abhorrent indeed. I join other noble Lords in thanking my noble friend Lord Harlech for his vigilance and consideration for those who are watching our proceedings today, to allow us to talk about them in the way that we must in order to tackle them, but to ensure that we do so sensitively. I thank noble Lords for the way they have done that.

I pay tribute also to those who work in this dark corner of the internet to tackle these harms. I am pleased to reassure noble Lords that the Bill has been designed in a way that responds to emerging and new technologies that may pose a risk of harm. In our previous debates, we have touched on explicitly naming certain technologies and user groups or making aspects of the legislation more specific. However, one key reason why the Government have been resistant to such specificity is to ensure that the legislation remains flexible and future-proofed.

The Bill has been designed to be technology-neutral in order to capture new services that may arise in this rapidly evolving sector. It confers duties on any service that enables users to interact with each other, as well as search services, meaning that any new internet service that enables user interaction will be caught by it.

Amendment 125, tabled by the noble Baroness, Lady Kidron—whose watchful eye I certainly feel on me even as she takes a rare but well-earned break today—seeks to ensure that machine-generated content, virtual reality content and augmented reality content are regulated content under the Bill. I am happy to confirm to her and to my noble friend Lady Harding who moved the amendment on her behalf that the Bill is designed to regulate providers of user-to-user services, regardless of the specific technologies they use to deliver their service, including virtual reality and augmented reality content. This is because any service that allows its users to encounter content generated, uploaded or shared by other users is in scope unless exempt. “Content” is defined very broadly in Clause 207(1) as

“anything communicated by means of an internet service”.

This includes virtual or augmented reality. The Bill’s duties therefore cover all user-generated content present on the service, regardless of the form this content takes, including virtual reality and augmented reality content. To state it plainly: platforms that allow such content—for example, the metaverse—are firmly in scope of the Bill.

The Bill also ensures that machine-generated content on user-to-user services created by automated tools or machine bots will be regulated by the Bill where appropriate. Specifically, Clause 49(4)(b) means that machine-generated content is regulated unless the bot or automated tool producing the content is controlled by the provider of the service. This approach ensures that the Bill covers scenarios such as malicious bots on a social media platform abusing users, or when users share content produced by new tools, such as ChatGPT, while excluding functions such as customer service chatbots which are low risk. Content generated by an artificial intelligence bot and then placed by a user on a regulated service will be regulated by the Bill. Content generated by an AI bot which interacts with user-generated content, such as bots on Twitter, will be regulated by the Bill. A bot that is controlled by the service provider, such as a customer service chatbot, is out of scope; as I have said, that is low risk and regulation would therefore be disproportionate. Search services using AI-powered features will be in scope of the search duties.

The Government recognise the need to act both to unlock the opportunities and to address the potential risks of this technology. Our AI regulation White Paper sets out the principles for the responsible development of AI in the UK. These principles, such as safety and accountability, are at the heart of our approach to ensuring the responsible development and use of artificial intelligence. We are creating a horizon-scanning function and a central risk function which will enable the Government to monitor future risks.

The Bill does not distinguish between the format of content present on a service. Any service that allows its users to encounter content generated, uploaded or shared by other users is in scope unless exempt, regardless of the format of that content. This includes virtual and augmented reality material. Platforms that allow such content, such as the metaverse, are firmly in scope of the Bill and must take the required steps to protect their users from harm. I hope that gives the clarity that my noble friend and others were seeking and reassurance that the intent of Amendment 125 is satisfied.

The Bill will require companies to take proactive steps to tackle all forms of online child sexual abuse, including grooming, live streaming, child sexual abuse material and prohibited images of children. If AI-generated content amounts to a child sexual exploitation or abuse offence in the Bill, it will be subject to the illegal content duties. Regulated providers will need to take steps to remove this content. We will shortly bring forward, and have the opportunity to debate in Committee, a government amendment to address concerns relating to the sending of intimate images. This will cover the non-consensual sharing of manufactured images—more commonly known as deepfakes. The possession and distribution of altered images that appear to be indecent photographs of children is already covered by the indecent images of children offences, which are very serious offences with robust punishment in law.

--- Later in debate ---
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

Will the review also cover an understanding of what has been happening in criminal cases where, in some of the examples that have been described, people have tried to take online activity to court? We will at that point understand whether the judges believe that existing offences cover some of these novel forms of activity. I hope the review will also extend not just to what Ofcom does as a regulator but to understand what the courts are doing in terms of the definitions of criminal activity and whether they are being effective in the new online spaces.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I believe it will. Certainly, both government and Parliament will take into account judgments in the court on this Bill and in related areas of law, and will, I am sure, want to respond.

Baroness Berridge Portrait Baroness Berridge (Con)
- Hansard - - - Excerpts

It is not just the judgments of the courts; it is about how the criminal law as a very basic point has been framed. I invite my noble friend the Minister to please meet with the Dawes Centre, because it is about future crime. We could end up with a situation in which more and more violence, particularly against women and girls, is being committed in this space, and although it may be that the Bill has made it regulated, it may not fall within the province of the criminal law. That would be a very difficult situation for our law to end up in. Can my noble friend the Minister please meet with the Dawes Centre to talk about that point?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I am happy to reassure my noble friend that the director of the Dawes Centre for Future Crime sits on the Home Office’s Science Advisory Council, whose work is very usefully fed into the work being done at the Home Office. Colleagues at the Ministry of Justice keep criminal law under constant review, in light of research by such bodies and what we see in the courts and society. I hope that reassures my noble friend that the points she raised, which are covered by organisations such as the Dawes Centre, are very much in the mind of government.

The noble Lord, Lord Allan of Hallam, explained very effectively the nuances of how behaviour translates to the virtual world. He is right that we will need to keep both offences and the framework under review. My noble friend Lady Berridge asked a good and clear question, to which I am afraid I do not have a similarly concise answer. I can reassure her that generated child sexual abuse and exploitation material is certainly illegal, but she asked about sexual harassment via a haptic suit; that would depend on the specific circumstances. I hope she will allow me to respond in writing, at greater length and more helpfully, to the very good question she asked.

Under Clause 56, Ofcom will also be required to undertake periodic reviews into the incidence and severity of content that is harmful to children on the in-scope services, and to recommend to the Secretary of State any appropriate changes to regulations based on its findings. Clause 141 also requires Ofcom to carry out research into users’ experiences of regulated services, which will likely include experiences of services such as the metaverse and other online spaces that allow user interaction. Under Clause 147, Ofcom may also publish reports on other online safety matters.

The questions posed by the noble Lord, Lord Russell of Liverpool, about international engagement are best addressed in a group covering regulatory co-operation, which I hope we will reach later today. I can tell him that we have introduced a new information-sharing gateway for the purpose of sharing information with overseas regulators, to ensure that Ofcom can collaborate effectively with its international counterparts. That builds on existing arrangements for sharing information that underpin Ofcom’s existing regulatory regimes.

The amendments tabled by the noble Lord, Lord Knight of Weymouth, relate to providers’ judgments about when content produced by bots is illegal content, or a fraudulent advertisement, under the Bill. Clause 170 sets out that providers will need to take into account all reasonably available relevant information about content when making a judgment about its illegality. As we discussed in the group about illegal content, providers will need to treat content as illegal when this information gives reasonable grounds for inferring that an offence was committed. Content produced by bots is in scope of providers’ duties under the Bill. This includes the illegal content duties, and the same principles for assessing illegal content will apply to bot-produced content. Rather than drawing inferences about the conduct and intent of the user who generated the content, the Bill specifies that providers should consider the conduct and the intent of the person who can be assumed to have controlled the bot at the point it created the content in question.

The noble Lord’s amendment would set out that providers could make judgments about whether bot-produced content is illegal, either by reference to the conduct or mental state of the person who owns the bot or, alternatively, by reference to the person who controls it. As he set out in his explanatory statement and outlined in his speech, I understand he has brought this forward because he is concerned that providers will sometimes not be able to identify the controller of a bot, and that this will impede providers’ duties to take action against illegal content produced by them. Even when the provider does not know the identity of the person controlling the bot, however, in many cases there will still be evidence from which providers can draw inferences about the conduct and intent of that person, so we are satisfied that the current drafting of the Bill ensures that providers will be able to make a judgment on illegality.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

My concern is also whether or not the bot is out of control. Can the Minister clarify that issue?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

It depends on what the noble Lord means by “out of control” and what content the bot is producing. If he does not mind, this may be an issue which we should go through in technical detail and have a more free-flowing conversation with examples that we can work through.

--- Later in debate ---
Baroness Finlay of Llandaff Portrait Baroness Finlay of Llandaff (CB)
- Hansard - - - Excerpts

This is a very interesting discussion; the noble Lord, Lord Knight, has hit on something really important. When somebody does an activity that we believe is criminal, we can interrogate them and ask how they came to do it and got to the conclusion that they did. The difficulty is that those of us who are not super-techy do not understand how you can interrogate a bot or an AI which appears to be out of control to find out how it got to the conclusion that it did. It may be drawing from lots of different places and there may be ownership of lots of different sources of information. I wonder whether that is why we are finding how this will be monitored in future so concerning. I am reassured that the noble Lord, Lord Knight of Weymouth, is nodding; does the Minister concur that this may be a looming problem for us?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I certainly concur that we should discuss the issue in greater detail. I am very happy to do so with the noble Lord, the noble Baroness and others who want to do so, along with officials. If we can bring some worked examples of what “in control” and “out of control” bots may be, that would be helpful.

I hope the points I have set out in relation to the other issues raised in this group and the amendments before us are satisfactory to noble Lords and that they will at this point be content not to press their amendments.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I thank all noble Lords who have contributed to a thought-provoking and, I suspect, longer debate than we had anticipated. At Second Reading, I think we were all taken aback when this issue was opened up by my noble friend Lord Sarfraz; once again, we are realising that this requires really careful thought. I thank my noble friend the Minister for his similarly long and thoughtful response to this debate.

I feel that I owe the Committee a small apology. I am very conscious that I talked in quite graphic detail at the beginning when there were still children in the Gallery. I hope that I did not cause any harm, but it shows how serious this is that we have all had to think so carefully about what we have been saying—only in words, without any images. We should not underestimate how much this has demonstrated the importance of our debates.

On the comments of the noble Baroness, Lady Fox, I am a huge enthusiast, like the noble Lord, Lord Knight, for the wonders of the tech world and what it can bring. We are managing the balance in this Bill to make sure that this country can continue to benefit from and lead the opportunities of tech while recognising its real and genuine harms. I suggest that today’s debate has demonstrated the potential harm that the digital world can bring.

I listened carefully—as I am certain the noble Baroness, Lady Kidron, has been doing in the digital world—to my noble friend’s words. I am encouraged by what he has put on the record on Amendment 125, but there are some specific issues that it would be helpful for us to talk about, as he alluded to, after this debate and before Report. Let me highlight a couple of those.

First, I do not really understand the technical difference between a customer service bot and other bots. I am slightly worried that we are defining in the specific one type of bot that would not be captured by this Bill. I suspect that there might be others in future. We must think carefully through whether we are getting too much into the specifics of the technology and not general enough in making sure we capture where it could go. That is one example.

Secondly, as my noble friend Lady Berridge would say, I am not sure that we have got to the bottom of whether this Bill, coupled with the existing body of criminal law, will really enable law enforcement officers to progress the cases as they see fit and protect vulnerable women—and men—in the digital world. I very much hope we can extend the conversation there. We perhaps risk getting too close to the technical specifics if we are thinking about whether a haptic suit is in or out of scope of the Bill; I am certain that there will be other technologies that we have not even thought about yet that we will want to make sure that the Bill can capture.

I very much welcome the spirit in which this debate has been held. When I said that I would do this for the noble Baroness, Lady Kidron, I did not realise quite what a huge debate we were opening up, but I thank everyone who has contributed and beg leave to withdraw the amendment.

--- Later in debate ---
Moved by
126A: Clause 50, page 48, line 31, at end insert “, and
(iii) is not a sanctioned entity (see subsection (3A)).”
Member’s explanatory statement
The effect of this amendment, combined with the next amendment in the Minister’s name, is that any entity which is designated for the purposes of sanctions regulations is not a “recognised news publisher” under this Bill, with the result that the Bill’s protections which relate to “news publisher content” don’t apply.
--- Later in debate ---
Moved by
127A: Clause 50, page 49, line 9, at end insert—
“(3A) A “sanctioned entity” is an entity which—
(a) is designated by name under a power contained in regulations under section 1 of the Sanctions and Anti-Money Laundering Act 2018 that authorises the Secretary of State or the Treasury to designate persons for the purposes of the regulations or of any provisions of the regulations, or
(b) is a designated person under any provision included in such regulations by virtue of section 13 of that Act (persons named by or under UN Security Council Resolutions).”
Member’s explanatory statement
The effect of this amendment, combined with the preceding amendment in the Minister’s name, is that any entity which is designated for the purposes of sanctions regulations is not a “recognised news publisher” under this Bill, with the result that the Bill’s protections which relate to “news publisher content” don’t apply.
--- Later in debate ---
Moved by
127B: Clause 52, page 50, line 23, after second “the” insert “voluntary”
Member’s explanatory statement
This amendment and the next amendment in the Minister’s name ensure that restrictions on a user’s access to content resulting from the user voluntarily activating any feature of a service do not count as restrictions on users’ access for the purposes of Part 3 of the Bill.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Finally, I do not want to anticipate the Minister in introducing the amendments in his name, but we have no objections to them. I am sure that they will work exactly as he proposes and that they will be acceptable.
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - -

My Lords, this has been a miscellany indeed. We must be making progress if we are picking up amendments such as these. I thank noble Lords who have spoken to the amendments and the issues covered in them.

I turn first to Amendment 185A brought to us by the noble Lord, Lord Bassam of Brighton, which seeks to add duties on online marketplaces to limit children’s access to the sale of knives, and proactively to identify and remove listings which appear to encourage the sale of knives for the purposes of violence or self-harm. Tackling knife crime is a priority for His Majesty’s Government; we are determined to crack down on this violent scourge, which is devastating our communities. I hope that he will forgive me for not drawing on the case he mentioned, as it is still sub judice. However, I certainly take the point he makes; we are all too aware of cases like it up and down the country. I received an email recently from Amanda and Stuart Stephens, whose son, Olly, was murdered by two boys, one of whom was armed with a knife. All these cases are very much in our minds as we debate the Bill.

Let me try to reassure them and the noble Lord as well as other Members of the Committee that the Bill, through its existing duties and other laws on the statute book, already achieves what the noble Lord seeks with his amendment. The sale of offensive weapons and of knives to people under the age of 18 are criminal offences. Any online retailer which directly sells these prohibited items can already be held criminally liable. Once in force, the Bill will ensure that technology platforms, including online marketplaces, prevent third parties from using their platform to sell offensive weapons or knives to people under the age of 18. The Bill lists both these offences as priority offences, meaning that user-to-user services, including online marketplaces, will have a statutory obligation proactively to prevent these offences taking place on their services.

--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

I am sorry to interrupt. The Minister has twice given a positive response, but he limited it to child sexual exploitation; he did not mention terrorism, which is in fact the bigger issue. Could he confirm that it is both?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Yes, and as I say, I am happy to talk with the noble Lord about this in greater detail. Under the Bill, category 1 companies will have a new duty to safeguard all journalistic content on their platform, which includes citizen journalism. But I will have to take all these points forward with him in our further discussions.

My noble friend Lord Bethell is not here to move his Amendment 220D, which would allow Ofcom to delegate online safety regulatory duties under this legislation to other bodies. We have previously discussed a similar issue relating to the Internet Watch Foundation, so I shall not repeat the points that we have already made.

On the amendments on supposedly gendered language in relation to Ofcom advisory committees in Clauses 139 and 155, I appreciate the intention to make it clear that a person of either sex should be able to perform the role of chairman. The Bill uses the term “chairman” to be consistent with the terminology in the Office of Communications Act 2002, and we are confident that this will have no bearing on Ofcom’s decision-making on who will chair the advisory committees that it must establish, just as, I am sure, the noble Lord’s Amendment 56 does not seek to be restrictive about who might be an “ombudsman”.

I appreciate the intention of Amendment 262 from the noble Baroness, Lady Bennett of Manor Castle. It is indeed vital that the review reflects the experience of young people. Clause 159 provides for a review to be undertaken by the Secretary of State, and published and laid before Parliament, to assess the effectiveness of the regulatory framework. There is nothing in the existing legislation that would preclude seeking the views of young people either as part of an advisory group or in other ways. Moreover, the Secretary of State is required to consult Ofcom and other persons she considers appropriate. In relation to young people specifically, it may be that a number of different approaches will be effective—for example, consulting experts or representative groups on children’s experiences online. That could include people of all ages. The regulatory framework is designed to protect all users online, and it is right that we take into account the full spectrum of views from people who experience harms, whatever their age and background, through a consultation process that balances all their interests.

Amendment 268AA from the noble Lord, Lord Bassam, relates to reporting requirements for online abuse and harassment, including where this is racially motivated—an issue we have discussed in Questions and particularly in relation to sport. His amendment would place an additional requirement on all service providers, even those not in scope of the Bill. The Bill’s scope extends only to user-to-user and search services. It has been designed in this way to tackle the risk of harm to users where it is highest. Bringing additional companies in scope would dilute the efforts of the legislation in this important regard.

Clauses 16 and 26 already require companies to set up systems and processes that allow users easily to report illegal content, including illegal online abuse and harassment. This amendment would therefore duplicate this existing requirement. It also seeks to create an additional requirement for companies to report illegal online abuse and harassment to the Crown Prosecution Service. The Bill does not place requirements on in-scope companies to report their investigations into crimes that occur online, other than child exploitation and abuse. This is because the Bill aims to prevent and reduce the proliferation of illegal material and the resulting harm it causes to so many. Additionally, Ofcom will be able to require companies to report on the incidence of illegal content on their platforms in its transparency reports, as well as the steps they are taking to tackle that content.

I hope that reassures the noble Lord that the Bill intends to address the problems he has outlined and those explored in the exchange with the noble Lord, Lord Clement-Jones. With that, I hope that noble Lords will support the government amendments in this group and be satisfied not to press theirs at this point.

Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I listened very carefully to the Minister’s response to both my amendments. He has gone some way to satisfying my concerns. I listened carefully to the concerns of the noble Baroness, Lady Fox, and noble Lords on the Lib Dem Benches. I am obviously content to withdraw my amendment.

I do not quite agree with the Minister’s point about dilution on the last amendment—I see it as strengthening—but I accept that the amendments themselves slightly stretch the purport of this element of the legislation. I shall review the Minister’s comments and I suspect that I shall be satisfied with what he said.

--- Later in debate ---
Moved by
186A: Clause 79, page 71, line 20, leave out paragraph (b)
Member’s explanatory statement
This amendment omits a provision about recouping OFCOM’s preparatory costs via fees under Part 6 of the Bill, because it is now intended to recoup all preparatory costs incurred before the fees regime is in operation via the charging of additional fees under Schedule 10 (see also the amendment to Schedule 10 in the Minister’s name).
--- Later in debate ---
Moved by
186B: Clause 80, page 71, line 26, leave out from “incurred” to end of line 27 and insert “before the first day of the initial charging year.”
Member’s explanatory statement
This amendment is to the clause introducing Schedule 10 (recovery of OFCOM’s initial costs). The amendment reflects the change to Schedule 10 proposed by the amendment of that Schedule in the Minister’s name.
--- Later in debate ---
Moved by
186C: Schedule 10, page 212, line 37, leave out from “before” to end of line 39 and insert “the first day of the initial charging year on—
(a) preparations for the exercise of their online safety functions, or
(b) the exercise of their online safety functions;”
Member’s explanatory statement
Schedule 10 enables OFCOM to charge additional fees to recover certain online safety costs which are met by the retention of receipts under the Wireless Telegraphy Act 2006. This amendment extends the Schedule 10 regime to cover all costs incurred before the main fees regime under Part 6 of the Bill is in operation (as opposed to only covering preparatory costs incurred before the commencement of clause 79).
--- Later in debate ---
I believe that there should be zero tolerance of children accessing material which is illegal for them, but the Bill does not say that. It says that all Ofcom’s work has to be done in proportion to the impact, not only in the direct work of trying to mitigate harms or illegality that could occur but also taking into account the economic size of the company and the impact that the work would have on its activities. I do not think we can square that off, so I appeal to the Minister, when he comes to respond, to look at it from the other end. Why is it not possible to have a structure which is driven by the risk? If the risk assessment reveals risks that require action, there should not be a constraint simply because the categorisation hurdle has not been met. The risk is what matters. Does he agree?
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

I am grateful to noble Lords for helping us to reach our target for the first time in this Committee, especially in a way which has given us a good debate on which to send us off into the Whitsun Recess. I am off to the Isle of Skye, so I will make a special detour to Balmacara in honour of the noble Lord.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

The noble Lord does not believe anything that I say at this Dispatch Box, but I will send a postcard.

As noble Lords are by now well aware, all services in scope of the Bill, regardless of their size, will be required to take action against illegal content and all services likely to be accessed by children must put in place protections for children. Companies designated as category 1 providers have significant additional duties. These include the overarching transparency, accountability and freedom of expression duties, as well as duties on content of democratic importance, news publishers’ content, journalistic content and fraudulent advertising. It is right to put such duties only on the largest platforms with features enabling the greatest reach, as they have the most significant influence over public discourse online.

I turn first to Amendment 192 in the name of my noble friend Lady Morgan of Cotes and Amendment 192A from the noble Lord, Lord Griffiths of Burry Port, which are designed to widen category 1 definitions to include services that pose a risk of harm, regardless of their number of users. Following removal of the legal but harmful provisions in another place, the Bill no longer includes the concept of risk of harm in Category 1 designation. As we set out, it would not be right for the Government to define what legal content it considers harmful to adults, and it follows that it would not be appropriate for the Government to categorise providers and to require them to carry out duties based on this definition.

In addition, requiring all companies to comply with the full range of Category 1 duties would pose a disproportionate burden on services which do not exert the same influence over public discourse online. I appreciate the point made by the noble Baroness, Lady Bull, with regard to regulatory burden. There is a practical element to this as well. Services, particularly smaller ones, have finite resources. Imposing additional duties on them would divert them from complying with their illegal and child safety duties, which address the most serious online harms. We do not want to weaken their ability to tackle criminal activity or to protect children.

As we discussed in detail in a previous debate, the Bill tackles suicide and self-harm content in a number of ways. The most robust protections in the Bill are for children, while those for adults strike a balance between adults being protected from illegal content and given more choice over what legal content they see. The noble Lord, Lord Stevenson, asked why we do not start with the highest risk rather than thinking about the largest services, but we do. We start with the most severe harms—illegal activity and harm to children. We are focusing on the topics of greatest risk and then, for other categories, allowing adults to make decisions about the content with which they interact online.

A number of noble Lords referred to suicide websites and fora. We are concerned about the widespread availability of content online which promotes and advertises methods of suicide and self-harm, which can be easily accessed by young or vulnerable people. Under the Bill, where suicide and self-harm websites host user-generated content, they will be in scope of the legislation. These sites will need proactively to prevent users from being exposed to priority illegal content, including content which encourages or assists suicide under the terms of the Suicide Act 1961. Additionally, it is an offence under Section 4(3) of the Misuse of Drugs Act 1971 for a website to offer to sell controlled drugs to consumers in England and Wales. Posting advice on how to obtain such drugs in England and Wales is also likely to be an offence, regardless of where the person providing the advice is located.

The Bill also limits the availability of such content by placing illegal content duties on search services, including harmful content which affects children or where this content is shared on user-to-user services. This will play a key role in reducing traffic that directs people to websites which encourage or assist suicide, and reduce the likelihood of users encountering such content. The noble Baroness, Lady Bull, asked about starvation. Encouraging people to starve themselves or not to take prescribed medication will be covered.

Amendment 194 tabled by the noble Lord, Lord Stevenson of Balmacara, seeks to ensure that Ofcom can designate companies as category 1, 2A or 2B on a provisional basis, when it considers that they are likely to meet the relevant thresholds. This would mean that the relevant duties can be applied to them, pending a full assessment by Ofcom. The Government recognise the concern highlighted by the noble Lord, Lord Allan, about the rapid pace of change in the technology sector and how that can make it challenging to keep the register of the largest and most influential services up to date. I assure noble Lords that the Bill addresses this with a duty which the Government introduced during the Bill’s recommittal in another place. This duty, at Clause 88, requires Ofcom proactively to identify and publish a list of companies which are close to category 1 thresholds. This will reduce any delays in Ofcom adding additional obligations on companies which grow rapidly, or which introduce new high-risk features. It will also ensure that the regime remains agile and adaptable to emerging threats.

Platforms with the largest reach and greatest influence over public discourse will be designated as category 1. The Bill sets out a clear process for determining category 1 providers, based on thresholds relating to these criteria, which will be set by the Secretary of State in secondary legislation. The process has been designed to ensure that it is transparent and evidence-based. We expect the main social media platforms and possibly some others to be designated as category 1 services, but we do not wish to prejudge the process set out above by indicating which specific services are likely to be designated, as I have set out on previous groups.

The amendment would enable Ofcom to place new duties on companies without due process. Under the approach that we take in the Bill, Ofcom can designate companies as belonging to each category based only on an objective assessment of evidence against thresholds approved by Parliament. The Government’s approach also provides greater certainty for companies than is proposed in this amendment. We have heard concerns in previous debates about when companies will have the certainty of knowing their category designation. These amendments would introduce continuous uncertainty and subjectivity into the designation process and would give Ofcom significant discretion over which companies should be subject to which duties. That would create a very uncertain operating environment for businesses and could reduce the attractiveness of the UK as a place to do business.

I hope that explains why we are not taken by these amendments but, in the spirit of the Whitsun Recess, I will certainly think about them on the train as I head north. I am very happy to discuss them with noble Lords and others between now and our return.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

Before the Minister sits down, he did let slip that he was going on the sleeper, so I do not think that there will be much thinking going on—although I did not sleep a wink the last time I went, so I am sure that he will have plenty of time.

I am sure that the noble Baroness, Lady Morgan, will want to come in—but could he repeat that? He says that risk assessment drives us, but a company that is not regarded as a category 1 provider because it does not meet the categorisation thresholds—even though it may be higher risk than some of the category 1 companies—will not be subject to the requirements that would pick up the particular issues raised by the noble Baroness and the noble Lord, and their concerns about those issues, which are clearly social harms, will not really be considered on a par.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

In the response I gave, I said that we are making the risk assessment that the riskiest behaviour is illegal content and content which presents a harm to children. That is the assessment and the approach taken in the Bill. In relation to other content which is legal and for adults to choose how they encounter it, there are protections in the Bill to enforce terms of service and empower users to curate their own experience online, but that assessment is made by adult users within the law.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- View Speech - Hansard - - - Excerpts

I thank all noble Lords who spoke in this short but important debate. As we heard, some issues relating to risk and harm have been returned to and will no doubt be again, and we note the impact of the absence of legal but harmful as a concept. As the noble Baroness, Lady Bull, said, I know that the noble Baroness, Lady Parminter, was very sad that she could not be here this afternoon due to another engagement.

I will not keep the House much longer. I particularly noted the noble Baroness’s point that there should not be, and is not, a direct relationship between the size of the platform and its ability to cause harm. There is a balance to be struck between the regulatory burden placed on platforms versus the health and well-being of those who are using them. As I have said before, I am not sure that we have always got that particular balance right in the Bill.

The noble Lord, Lord Allan, was very constructive: it has to be a good thing if we are now beginning to think about the Bill’s implementation, although we have not quite reached the end and I do not want to prejudge any further stages, in the sense that we are now thinking about how this would work. Of course, he is right to say that some of these platforms have no intention of complying with these rules at all. Ofcom and the Government will have to work out what to do about that.

Ultimately, the Government of the day—whoever it might be—will want the powers to be able to say that a small platform is deeply harmful in terms of its content and reach. When the Bill has been passed, there will be pressure at some point in the future on a platform that is broadcasting or distributing or amplifying content that is deeply harmful. Although I will withdraw the amendment today, my noble friend’s offer of further conversations, with more detail on categorisation and on any review of the platforms categorised as category 1, 2 and beyond, would be very helpful in due course. I beg leave to withdraw.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I am all that is left between us and hearing from the Minister with his good news, so I will constrain my comments accordingly.

The noble Baroness, Lady Kidron, began by paying tribute to the parents of Olly, Breck, Molly, Frankie and Sophie. I very much join her in doing that; to continually have to come to this place and share their trauma and experience comes at a great emotional cost. We are all very grateful to them for doing it and for continuing to inform and motivate us in trying to do the right thing. I am grateful to my noble friend Lady Healy and in particular to the noble Baroness, Lady Newlove, for amplifying that voice and talking about the lost opportunity, to an extent, of our failure to find a way of imposing a general duty of care on the platforms, as was the original intention when the noble Baroness, Lady Morgan, was the Secretary of State.

I also pay a big tribute to the noble Baroness, Lady Kidron. She has done the whole House, the country and the world a huge service in her campaigning around this and in her influence on Governments—not just this one—on these issues. We would not be here without her tireless efforts, and it is important that we acknowledge that.

We need to ensure that coroners can access the information they need to do their job, and to have proper sanctions available to them when they are frustrated in being able to do it. This issue is not without complication, and I very much welcome the Government’s engagement in trying to find a way through it. I too look forward to the good news that has been trailed; I hope that the Minister will be able to live up to his billing. Like the noble Baroness, Lady Harding, I would love to see him embrace, at the appropriate time, the “safety by design” amendments and some others that could complete this picture. I also look forward to his answers on issues such as data preservation, which the noble Lord, Lord Allan, covered among the many other things in his typically fine speech.

I very much agree that we should have a helpline and do more about that. Some years ago, when my brother-in-law sadly died in his 30s, it fell to me to try to sort out his social media accounts. I was perplexed that the only way I could do it was by fax to these technology companies in California. That was very odd, so to have proper support for bereaved families going through their own grief at that moment seems highly appropriate.

As we have discussed in the debates on the Bill, a digital footprint is an asset that is exploited by these companies. But it is an asset that should be regarded as part of one’s estate that can be bequeathed to one’s family; then some of these issues would perhaps be lessened. On that basis, and in welcoming a really strong and moving debate, I look forward to the Minister’s comments.

Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- Hansard - -

My Lords, this has been a strong and moving debate, and I am grateful to the noble Baroness, Lady Kidron, for bringing forward these amendments and for the way she began it. I also echo the thanks that the noble Baroness and others have given to the families of Breck Bednar, Sophie Parkinson, Molly Russell, Olly Stephens, Frankie Thomas and all the young people whose names she rightly held in remembrance at the beginning of this debate. There are too many others who find themselves in the same position. The noble Lord, Lord Knight, is right to pay tribute to their tirelessness in campaigning, given the emotional toll that we know it has on them. I know that they have followed the sometimes arcane processes of legislation and, as my noble friend Lady Morgan said, we all look forward to the Bill becoming an Act of Parliament so that it can make a difference to families who we wish to spare from the heartache they have had.

Every death is sorrowful, but the death of a child is especially heartbreaking. The Government take the issues of access to information relating to a deceased child very seriously. We have undertaken extensive work across government and beyond to understand the problems that parents, and coroners who are required to investigate such deaths, have faced in the past in order to bring forward appropriate solutions. I am pleased to say that, as a result of that work, and thanks to the tireless campaigning of the noble Baroness, Lady Kidron, and our discussions with those who, very sadly, have first-hand experience of these problems, we will bring forward a package of measures on Report to address the issues that parents and coroners have faced. Our amendments have been devised in close consultation with the noble Baroness and bereaved families. I hope the measures will rise to the expectations they rightly have and that they will receive their support.

The package of amendments will ensure that coroners have access to the expertise and information they need to conduct their investigations, including information held by technology companies, regardless of size, and overseas services such as Wattpad, mentioned by the noble Baroness, Lady Healy of Primrose Hill, in her contribution. This includes information about how a child interacted with specific content online as well as the role of wider systems and processes, such as algorithms, in promoting it. The amendments we bring forward will also help to ensure that the process for accessing data is more straightforward and humane. The largest companies must ensure that they are transparent with parents about their options for accessing data and respond swiftly to their requests. We must ensure that companies cannot stonewall parents who have lost a child and that those parents are treated with the humanity and compassion they deserve.

I take the point that the noble Baroness, Lady Kidron, rightly makes: small does not mean safe. All platforms will be required to comply with Ofcom’s requests for information about a deceased child’s online activity. That will be backed by Ofcom’s existing enforcement powers, so that where a company refuses to provide information without a valid excuse it may be subject to enforcement action, including sanctions on senior managers. Ofcom will also be able to produce reports for coroners following a Schedule 5 request on matters relevant to an investigation or inquest. This could include information about a company’s systems and processes, including how algorithms have promoted specific content to a child. This too applies to platforms of any size and will ensure that coroners are provided with information and expertise to assist them in understanding social media.

Where this Bill cannot solve an issue, we are exploring alternative avenues for improving outcomes as well. For example, the Chief Coroner has committed to consider issuing non-legislative guidance and training for coroners about social media, with the offer of consultation with experts.

Baroness Newlove Portrait Baroness Newlove (Con)
- Hansard - - - Excerpts

I am sorry to interrupt my noble friend. On the coroners’ training and national guidelines, the Chief Coroner has no powers across the nation over all the coroners. How is he or she going to check that the coroners are keeping up with their training and are absolutely on the ball? The Chief Coroner has no powers across the country and everything happens in London; we are talking about outside London. How can we know that no other family has to suffer, considering that we have this legislation?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My noble friend rightly pulled me up for not responding to her letter as speedily as we have been dealing with the questions raised by the noble Baroness, Lady Kidron. We have had some useful meetings with Ministers at the Ministry of Justice, which the noble Baroness has attended. I would be very happy to provide some detail on this to my noble friend—I am conscious of her experience as Victims’ Commissioner—either in writing or by organising a briefing, if she would welcome that.

The noble Lord, Lord Allan of Hallam, rightly raised data protection. Where Ofcom and companies are required to respond to coroners’ requests for information, they are already required to comply with personal data protection legislation, which protects the privacy of other users. This may include the redaction of information that would identify other users. We are also exploring whether guidance from the Information Commissioner's Office could support technology companies to understand how data protection law applies in such cases.

The noble Lord mentioned the challenges of potential conflicts of law around the world. Where there is a conflict of laws—for example, due to data protection laws in other jurisdictions—Ofcom will need to consider the best way forward on a case-by-case basis. For example, it may request alternative information which could be disclosed, and which would provide insight into a particular issue. We will seek to engage our American counterparts to understand any potential and unintended barriers created by the US Stored Communications Act. I can reassure the noble Lord that these matters are in our mind.

We are also aware of the importance of data preservation to both coroners and bereaved parents. The Government agree with the principle of ensuring that such data are preserved. We will be working towards solving this in the Data Protection and Digital Information Bill. In addition, we will explore whether there are further options to improve outcomes for parents in that Bill as well. I want to assure noble Lords and the families watching this debate closely that we will do all we can to deliver the necessary changes to give coroners and parents the information that they seek and to ensure a more straightforward and humane process in the future.

I turn in detail to the amendments the noble Baroness, Lady Kidron, brought forward. First, Amendments 215 and 216 include new requirements on Ofcom, seeking to ensure that coroners and parents can obtain data from social media companies after the death of a child. Amendment 215 would give Ofcom the ability to impose senior management liability on an individual in cases where a coroner has issued a notice requiring evidence to be provided in an inquest into the death of a child. Amendment 216 would put Ofcom’s powers at the disposal of a coroner or close relatives of a deceased child so that Ofcom would be obliged to require information from platforms or other persons about the social media activity of a deceased child. It also requires service providers to provide a point of contact. Amendments 198 and 199 are consequential to this.

As I said, we agree with the intent of the noble Baroness’s amendments and we will deal with it in the package that we will bring forward before Report. Our changes to the Bill will seek to ensure that Ofcom has the powers it needs to support coroners and their equivalents in Scotland, so that they have access to the information they need to conduct investigations into a child’s death where social media may have played a part.

--- Later in debate ---
Moved by
200A: After Clause 97, insert the following new Clause—
“Amendment of Criminal Justice and Police Act 2001
(1) The Criminal Justice and Police Act 2001 is amended as follows.
(2) In section 57(1) (retention of seized items), after paragraph (t) insert—
“(u) paragraph 8 of Schedule 12 to the Online Safety Act 2023.”
(3) In section 65 (meaning of “legal privilege”)—
(a) after subsection (8B) insert—
“(8C) An item which is, or is comprised in, property which has been seized in exercise or purported exercise of the power of seizure conferred by paragraph 7(f), (j) or (k) of Schedule 12 to the Online Safety Act 2023 is to be taken for the purposes of this Part to be an item subject to legal privilege if, and only if, the seizure of that item was in contravention of paragraph 17(3) of that Schedule (privileged information or documents).”;
(b) in subsection (9)—
(i) at the end of paragraph (d) omit “or”;
(ii) at the end of paragraph (e) insert “or”;
(iii) before the closing words insert—
“(g) paragraph 7(f), (j) or (k) of Schedule 12 to the Online Safety Act 2023.”
(4) In Part 1 of Schedule 1 (powers of seizure to which section 50 of the Act applies), after paragraph 73U insert—
“Online Safety Act 2023
73V Each of the powers of seizure conferred by paragraph 7(f), (j) and (k) of Schedule 12 to the Online Safety Act 2023.””
Member’s explanatory statement
This amendment has the effect of providing that section 50 of the Criminal Justice and Police Act 2001 (additional powers of seizure from premises) applies to the powers of seizure under paragraph 7(f), (j) and (k) of Schedule 12 to the Bill; and makes related amendments to that Act.
--- Later in debate ---
Moved by
205A: Clause 110, page 95, line 11, leave out “relating to terrorism content present on a service” and insert “that relates to a user-to-user service (or to the user-to-user part of a combined service) and requires the use of technology in relation to terrorism content”
Member’s explanatory statement
This amendment makes it clear that the requirement in clause 110(7) regarding which content is communicated publicly is relevant to user-to-user services and may apply in both the cases mentioned in clause 110(2)(a)(i) and (ii).
--- Later in debate ---
Baroness Merron Portrait Baroness Merron (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I am grateful to the noble Lords, Lord Bethell, Lord Curry and Lord Allan for introducing their amendments, to the noble Baroness, Lady Morgan, for her direct question, and to the noble Baroness, Lady Kidron, for her equally direct question. I am sure they will be of great assistance to the Minister when he replies. I will highlight the words of the noble Lord, Lord Allan, who said “We are looking for services to succeed”. I think that is right, but what is success? It includes compliance and enforcement, and that is what this group refers to.

The amendments introduced by the noble Lord, Lord Bethell, seek to strengthen what is already in the Bill about Ofcom’s Chapter 6 powers of enforcement, otherwise known as business disruption powers, and they focus on what happens in the event of a breach; they seek to be more prescriptive than what we already have. I am sure the Minister will remember that the same issue came up in the Digital Economy Bill, around the suggestion that the Government should take specific powers. There, the Government argued they had assurances from credit card companies that, if and when action was required, they would co-operate. In light of that previous discussion, it will be interesting to hear what the Minister has to say.

In respect of the amendments introduced by the noble Lord, Lord Curry, on the need to toughen up requirements on Ofcom to act, I am sure the Minister will say that these powers are not required and that the Bill already makes provision for Ofcom blocking services which are failing in their duties. I echo the concern of the noble Lord, Lord Clement-Jones, about being overly prescriptive and not allowing Ofcom to do its job. The truth is that Ofcom may need discretion but it also needs teeth, and I will be interested to hear what the Minister has to say about whether he feels, in the light of the debate today and other conversations, that there is sufficient toughness in the Bill and that Ofcom will be able to do the job it is required to do. There is an issue of the balance of discretion versus requirement, and I know he will refer to this. I will also be interested to hear from the Minister about the view of Ofcom with respect to what is in the Bill, and whether it feels that it has sufficient powers.

I will raise a final point about the amendments in the name of the noble Lord, Lord Curry. I think they ask a valid question about the level of discretion that Ofcom will have. I ask the Minister this: if, a few years down the line, we find that Ofcom has not used the powers suitably, despite clear failures, what would the Government seek to do? With that, I look forward to hearing from the Minister.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, where necessary, the regulator will be able to apply to the courts for business disruption measures. These are court orders which will require third-party ancillary services and access facilities to withdraw their services from, or impede users’ access to, non-compliant regulated services. These are strong, flexible powers which will ensure that Ofcom can take robust action to protect users. At the same time, we have ensured that due process is followed. An application for a court order will have to specify the non-compliant provider, the grounds and evidence on which the application is based and the steps that third parties must take to withdraw services or block users’ access. Courts will consider whether business disruption measures are an appropriate way of preventing harm to users and, if an order is granted, ensure it is proportionate to the risk of harm. The court will also consider the interests of all relevant parties, which may include factors such as contractual terms, technical feasibility and the costs of the measures. These powers will ensure that services can be held to account for failure to comply with their duties under the Bill, while ensuring that Ofcom’s approach to enforcement is proportionate and upholds due process.

--- Later in debate ---
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I am reminded by my noble friend Lord Foster of Bath, particularly relating to the gambling sector, that some of these issues may run across various regulators that are all seeking business disruption. He reminded me that if you type into a search engine, which would be regulated and subject to business disruption measures here, “Casinos not regulated by GAMSTOP”, you will get a bunch of people who are evading GAMSTOP’s regulation. Noble Lords can imagine similar for financial services—something that I know the noble Baroness, Lady Morgan of Cotes, is also very interested in. It may not be for answer now, but I would be interested to understand what thinking the Government have on how all the different business disruption regimes—financial, gambling, Ofcom-regulated search services, et cetera—will all mesh together. They could all come before the courts under slightly different legal regimes.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

When I saw the noble Lord, Lord Foster of Bath, and the noble Baroness, Lady Armstrong of Hill Top, in their places, I wondered whether they were intending to raise these points. I will certainly take on board what the noble Lord says and, if there is further information I can furnish your Lordships with, I certainly will.

The noble Baroness, Lady Kidron, asked whether the powers can be used on out-of-scope services. “No” is the direct answer to her direct question. The powers can be used only in relation to regulated services, but if sites not regulated by the Bill are publishing illegal content, existing law enforcement powers—such as those frequently deployed in cases of copyright infringement—can be used. I could set out a bit more in writing if that would be helpful.

My noble friend Lord Bethell’s amendments seek to set out in the Bill that Ofcom will be able to make a single application to the courts for an order enabling business disruption measures that apply against multiple platforms and operators. I must repeat, as he anticipated, the point made by my right honourable friend Chris Philp that the civil procedure rules allow for a multi-party claim to be made. These rules permit any number of claimants or defendants and any number of claims to be covered by one claim form. The overriding objective of the civil procedure rules is that cases are dealt with justly and proportionately. I want to reassure my noble friend that the Government are confident that the civil procedure rules will provide the necessary flexibility to ensure that services can be blocked or restricted.

The amendment in the name of the noble Lord, Lord Allan of Hallam, seeks to clarify what services might be subject to access restriction orders by removing the two examples provided in the Bill: internet access services and application stores. I would like to reassure him that these are simply indicative examples, highlighting two kinds of service on which access restriction requirements may be imposed. It is not an exhaustive list. Orders could be imposed on any services that meet the definition—that is, a person who provides a facility that can be withdrawn, adapted or manipulated in such a way as to impede access to the regulated service in question. This provides Ofcom with the flexibility to identify where business disruption measures should be targeted, and it future-proofs the Bill by ensuring that the power remains functional and effective as technologies develop.

As the noble Lord highlighted, these are significant powers that can require that services be blocked in the UK. Clearly, limiting access to services in this way substantially affects the business interests of the service in question and the interests of the relevant third-party service, and it could affect users’ freedom of expression. It is therefore essential that appropriate safeguards are included and that due process is followed. That is why Ofcom will be required to seek a court order to be able to use these powers, ensuring that the courts have proper oversight.

To ensure that due process is upheld, an application by the regulator for a court order will have to specify the non-compliant provider, the grounds of the order and the steps that Ofcom considers should be imposed on the third parties in order to withdraw services and block users’ access. These requirements will ensure that the need to act quickly to tackle harm is appropriately balanced against upholding fundamental rights.

It might be useful to say a little about how blocking works—

Lord Clement-Jones (LD)

Before the Minister does that, can he say whether he envisages that operating against VPNs as well?

Lord Parkinson of Whitley Bay (Con)

If I may, I will take advice on that and write to the noble Lord.

Lord Clement-Jones (LD)

That would be useful.

Lord Parkinson of Whitley Bay (Con)

Yes; he made a helpful point, and I will come back on it.

Lord Allan of Hallam (LD)

We share a common interest in understanding whether it would be used against VPNs, but we may not necessarily have the same view about whether it should be. Do not take that as an encouragement—take it as a request for information.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I thank the noble Lord.

The term “blocking” is used to describe measures that will significantly impede or restrict access to non-compliant services—for example, internet service providers blocking websites or app stores blocking certain applications. These measures will be used only in exceptional circumstances, where the service has committed serious failures in meeting its duties and where no other action would reasonably prevent online harm to users in the UK.

My noble friend Lord Bethell’s Amendments 218F and 218L seek to ensure that Ofcom can request that an interim service or access restriction order endures for a period of six months in cases where a service hosts pornographic content. I reassure him that the court will already be able to make an order which can last up to six months. Indeed, the court’s interim order can have effect until either the date on which the court makes a service or access restriction order, or an expiry date specified by the court in the order. It is important that sanctions be determined on a case-by-case basis, which is why no limitations are set for these measures in the Bill.

As my noble friend knows, in the Bill there are clear duties on providers to ensure that children are not able to access pornography, which Ofcom will have a robust set of powers to enforce. It is important, however, that Ofcom’s powers and its approach to enforcement apply equally and consistently across the range of harms in scope of the Bill, rather than singling out one form of content in particular.

I hope that that is useful to noble Lords, along with the commitment to write on the further points which were raised. With that, I urge my noble friend to withdraw his amendment.

Lord Bethell (Con)

My Lords, to be honest, this debate has been an incredible relief to me. Here we have been taking a step away from some of the high-level conversations we had about what we mean by the internet and safety, looking at the far horizon, and instead looking at the moment when the Bill has real traction to try to change behaviours and improve the environment of the internet. I am extremely grateful to the Minister for his fulsome reply on a number of the issues.

The reason why this is so important is that there are two big areas where enforcement and compliance are going to be really tricky. First, there is Ofcom’s new relationship with the really big behemoths of the internet. It has a long tradition of partnership with big companies such as ITV and the radio sector—with the licensed authorities. However, of course, it holds their licences, and it can pull them. I have worked for some of those companies, and it is quite a thing to go to see your regulator when you know that it can pull your licence. Obviously, that is within legal reason, but at the end of the day it owns your licence, and that is different to having a conversation where it does not.

The second class is the Wild West: the people living in open breach of regular societal norms who care not for the intentions of either the regulator, the Government or even mainstream society. Bringing those people back into reasonable behaviour will be a hell of a thing. My noble friend Lord Grade spoke, reasonably but with a degree of trepidation, about the challenge faced by Ofcom there. I am extremely grateful to the Minister for addressing those points.

Ofcom will step up to having a place next to the FCA and the MHRA. The noble Lord, Lord Curry, spoke about some of the qualities needed of one of the big three regulators. Having had some ministerial oversight of the MHRA, I can tell your Lordships that it has absolutely no hesitation about tackling big pharmaceutical companies and is very quick, decisive and clear. It wields a big stick—or, to use the phrase of the noble Baroness, Lady Merron, big teeth—in order to conduct that. That is why I ask the Minister just to keep in mind some of the recommendations embedded in these amendments.

The noble Baroness, Lady Kidron, mentioned illegal content, and I appreciate the candour of the Minister’s reply. However, business disruption measures offer an opportunity to address the challenge of illegal content, which is something that I know the Secretary of State has spoken about very interestingly, in terms of perhaps commissioning some kind of review. If such a thing were to happen, I ask that business disruption measures and some way of employing them might be brought into that.

We should look again at enforcement and compliance. I appreciate the Minister saying that it is important to let the regulator make some of these decisions, but the noble Lord, Lord Allan, was right: the regulator needs to know what the Government’s intentions are. I feel that we have opened the book on this, but there is still a lot more to be said about where the Government see the impact of regulation and compliance ending up. In all the battles in other jurisdictions—France, Germany, the EU, Canada, Louisiana and Utah—it all comes down to enforcement and compliance. We need to know more of what the Government hope to achieve in that area. With that, I beg leave to withdraw my amendment.

--- Later in debate ---
Moved by
218A: After Clause 125, insert the following new Clause—
“Confirmation decisions: offence
(1) A person to whom a confirmation decision is given commits an offence if, without reasonable excuse, the person fails to comply with a requirement imposed by the decision which—
(a) is of a kind described in section 121(1), and
(b) relates (whether or not exclusively) to a children’s online safety duty.
(2) A “children’s online safety duty” means a duty set out in—
(a) section 11(3)(a),
(b) section 11(3)(b),
(c) section 72(2), or
(d) section 72(3).
(3) A person who commits an offence under this section is liable—
(a) on summary conviction in England and Wales, to imprisonment for a term not exceeding the general limit in a magistrates’ court or a fine (or both);
(b) on summary conviction in Scotland, to imprisonment for a term not exceeding 12 months or a fine not exceeding the statutory maximum (or both);
(c) on summary conviction in Northern Ireland, to imprisonment for a term not exceeding 6 months or a fine not exceeding the statutory maximum (or both);
(d) on conviction on indictment, to imprisonment for a term not exceeding 2 years or a fine (or both).”
Member’s explanatory statement
This amendment creates a new offence of failure to comply with requirements of a confirmation decision that relate to specified duties to protect children’s online safety.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, the Government are supportive of improving data sharing and encouraging greater collaboration between companies and researchers, subject to the appropriate safeguards. However, the data that companies hold about users can, of course, be sensitive; as such, mandating access to data that are not publicly available would be a complex matter, as noble Lords noted in their contributions. The issue must be fully thought through to ensure that the risks have been considered appropriately. I am grateful for the consideration that the Committee has given this matter.

It is because of this complexity that we have given Ofcom the task of undertaking a report on researchers’ access to information. Ofcom will conduct an in-depth assessment of how researchers can currently access data. To the point raised by the noble Lord, Lord Knight, and my noble friend Lord Bethell, let me provide reassurance that Ofcom will assess the impact of platforms’ policies that restrict access to data in this report, including where companies charge for such access. The report will also cover the challenges that constrain access to data and how such challenges might be addressed. These insights will provide an evidence base for any guidance that Ofcom may issue to help improve data access for researchers in a safe and secure way.

Amendments 230 and 231 seek to require Ofcom to publish a report into researchers’ access to data more rapidly than within the currently proposed two years. I share noble Lords’ desire to develop the evidence base on this issue swiftly, but care must be taken to balance Ofcom’s timelines to ensure that it can deliver its key priorities in establishing the core parts of the regulatory framework that the Bill will bring in; for example, the illegal content and child safety duties. Implementing these duties must be the immediate priority for Ofcom to ensure that the Bill meets its objective of protecting people from harm. It is crucial that we do not divert attention away from these areas and that we allow Ofcom to carry out this work as soon as is practicable.

Further to this, considering the complex matter of researchers’ access to data will involve consultation with interested parties, such as the Information Commissioner’s Office, the Centre for Data Ethics and Innovation, UK Research and Innovation, representatives of regulated services and others—including some of those parties mentioned by noble Lords today—as set out in Clause 146(3). This is an extremely important issue that we need to get right. Ofcom must be given adequate time to consult as it sees necessary and undertake the appropriate research.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

Before the Minister succeeds in disappointing us, can he clarify something for us? Once Ofcom has published the report, it has the power to issue guidance. What requirement is there for platforms to abide by that guidance? We want there to be some teeth at the end of all this. There is a concern that a report will be issued, followed by some guidance, but that nothing much else will happen.

Lord Parkinson of Whitley Bay (Con)

It is guidance rather than direction, but it will be done openly and transparently. Users will be able to see the guidance which Ofcom has issued, to see whether companies have responded to it as they see fit and, through the rest of the framework of the Bill, be empowered to make their decisions about their experiences online. This being done openly and transparently, and informed by Ofcom’s research, will mean that everyone is better informed.

We are sympathetic to the amendment. It is complex, and this has been a useful debate—

Baroness Kidron (CB)

I wonder whether the Minister has an answer to the academic community, who now see their European colleagues getting ahead through being able to access data through other legislation in other parts of the world. Also, we have a lot of faith in Ofcom, but it seems a mistake to let it be the only arbiter of what needs to be seen.

Lord Parkinson of Whitley Bay (Con)

We are very aware that we are not the only jurisdiction looking at the important issues the Bill addresses. The Government and, I am sure, academic researchers will observe the implementation of the European Union’s Digital Services Act with interest, including the provisions about researchers’ access. We will carefully consider any implications for our own online safety regime. As noble Lords know, the Secretary of State will be required to undertake a review of the framework between two and five years after the Bill comes into force. We expect that to include an assessment of how the Bill’s existing transparency provisions facilitate researcher access.

Lord Allan of Hallam (LD)

I do not expect the Minister to have an answer to this today, but it will be useful to get this on the record as it is quite important. Can he let us know the Government’s thinking on the other piece of the equation? We are getting the platforms to disclose the data, and an important regulatory element is the research organisations that receive it. In the EU, that is being addressed with a code of conduct, which is a mechanism enabled by the general data protection regulation that has been approved by the European Data Protection Board and creates this legal framework. I am not aware of equivalent work having been done in the UK, but that is an essential element. We do not want to find that we have the teeth to persuade the companies to disclose the data, but not the other piece we need—probably overseen by the Information Commissioner’s Office rather than Ofcom—which is a mechanism for approving researchers to receive and then use the data.

Lord Parkinson of Whitley Bay (Con)

We are watching with interest what is happening in other jurisdictions. If I can furnish the Committee with any information in the area the noble Lord mentions, I will certainly follow up in writing.

Lord Clement-Jones (LD)

I have a question, in that case, in respect of the jurisdictions. Why should we have weaker powers for our regulator than others?

Lord Parkinson of Whitley Bay (Con)

I do not think that we do. We are doing things differently. Of course, Ofcom will be looking at all these matters in its report, and I am sure that Parliament will have an ongoing interest in them. As jurisdictions around the world continue to grapple with these issues, I am sure that your Lordships’ House and Parliament more broadly will want to take note of those developments.

Lord Clement-Jones (LD)

But surely, there is no backstop power. There is the review but there is no backstop which would come into effect on an Ofcom recommendation, is there?

Lord Parkinson of Whitley Bay (Con)

We will know once Ofcom has completed its research and examination of these complex issues; we would not want to pre-judge its conclusions.

Lord Clement-Jones (LD)

Again, that would require primary legislation.

Lord Parkinson of Whitley Bay (Con)

With that, if there are no further questions, I invite the noble Lord to withdraw his amendment.

Lord Knight of Weymouth (Lab)

My Lords, this was a short but important debate with some interesting exchanges at the end. The noble Baroness, Lady Harding, mentioned the rapidly changing environment generated by generative AI. That points to the need for independent, ecosystem-level research that is wider than we fear we might get as things stand, and certainly wider than the skilled persons we are already legislating for. The noble Lord, Lord Bethell, referred to the access that advertisers already have to insight. It seems a shame that we run the risk, as the noble Baroness, Lady Kidron, pointed out, of researchers in other jurisdictions having more privileged access than researchers in this country, and of our therefore becoming dependent on those researchers and on whistleblowers to give us that wider view. We could proceed with a report and guidance as set out in the Bill but add in some reserved powers in order to take action if the report suggests that Ofcom might need and want that. The Minister may want to reflect on that, having listened to the debate. On that basis, I am happy to beg leave to withdraw the amendment.

--- Later in debate ---
I will leave the Minister with a few questions. It would be helpful to hear what consultation there has been with self-harm-specific organisations and how the government amendments differ from the broader “glamorisation” offence, which was rejected by the Law Commission. It would also be helpful to hear examples of the content that is intended to be criminalised by the offence. That would be of interest to your Lordships’ Committee and the coalition of key organisations and individuals who are keen, as we all are, to see this Bill end up in the right form and place. I look forward to hearing from the Minister.
Lord Parkinson of Whitley Bay (Con)

My Lords, this has been a broad and mixed group of amendments. I will be moving the amendments in my name, which are part of it. These introduce the new offence of encouraging or assisting serious self-harm and make technical changes to the communications offences. If there can be a statement covering the group and the debate we have had, which I agree has been well informed and useful, it is that this Bill will modernise criminal law for communications online and offline. The new offences will criminalise the most damaging communications while protecting freedom of expression.

Amendments 264A, 266 and 267, tabled by the noble Lord, Lord Clement-Jones, and my noble friend Lady Buscombe, would expand the scope of the false communications offence to add identity theft and financial harm to third parties. I am very grateful to them for raising these issues, and in particular to my noble friend Lady Buscombe for raising the importance of financial harm from fake reviews. This will be addressed through the Digital Markets, Competition and Consumers Bill, which was recently introduced to Parliament. That Bill proposes new powers to address fake and misleading reviews. This will provide greater legal clarity to businesses and consumers. Where fake reviews are posted, it will allow the regulator to take action quickly. The noble Baroness is right to point out the specific scenarios about which she has concern. I hope she will look at that Bill and return to this issue in that context if she feels it does not address her points to her satisfaction.

Identity theft is dealt with by the Fraud Act 2006, which captures those using false identities for their own benefit. It also covers people selling or using stolen personal information, such as banking information and national insurance numbers. Adding identity theft to the communications offences here would duplicate existing law and expand the scope of the offences too broadly. Identity theft, as the noble Lord, Lord Clement-Jones, noted, is better covered by targeted offences rather than communications offences designed to protect victims from psychological and physical harm. The Fraud Act is more targeted and therefore more appropriate for tackling these issues. If we were to add identity theft to Clause 160, we would risk creating confusion for the courts when interpreting the law in these areas—so I hope the noble Lord will be inclined to side with clarity and simplicity.

Amendment 265, tabled by my noble friend Lord Moylan, gives me a second chance to consider his concerns about Clause 160. The Government believe that the clause is necessary and that the threshold of harm strikes the right balance, robustly protecting victims of false communications while maintaining people’s freedom of expression. Removing “psychological” harm from Clause 160 would make the offence too narrow and risk excluding communications that can have a lasting and serious effect on people’s mental well-being.

But psychological harm is only one aspect of Clause 160; all elements of the offence must be met. This includes a person sending a knowingly false message with an intention to cause non-trivial harm, and without reasonable excuse. It has also been tested extensively as part of the Law Commission’s report Modernising Communications Offences, when determining what the threshold of harm should be for this offence. It thus sets a high bar for prosecution, whereby a person cannot be prosecuted solely on the basis of a message causing psychological harm.

The noble Lord, Lord Allan, rightly recalled Section 127 of the Communications Act and the importance of probing issues such as this. I am glad he mentioned the Twitter joke trial—a good friend of mine acted as junior counsel in that case, so I remember it well. I shall spare the blushes of the noble Baroness, Lady Merron, in recalling who the Director of Public Prosecutions was at the time. But it is important that we look at these issues, and I am happy to speak further with my noble friend Lord Moylan and the noble Baroness, Lady Fox, about this and their broader concerns about freedom of expression between now and Report, if they would welcome that.

My noble friend Lord Moylan said that it would be unusual, or novel, to criminalise lying. The offence of fraud by false representation already makes it an offence dishonestly to make a false representation—to breach the ninth commandment—with the intention of making a gain or causing someone else a loss. So, as my noble and learned friend Lord Garnier pointed out, there is a precedent for lies with malicious and harmful intent being criminalised.

Amendments 267AA, 267AB and 268, tabled by my noble friend Lady Buscombe and the noble Baroness, Lady Kennedy of The Shaws, take the opposite approach to those I have just discussed, as they significantly lower and expand the threshold of harm in the false and threatening communications offences. The first of these would specify that a threatening communications offence is committed even if someone encountering the message did not fear that the sender specifically would carry out the threat. I am grateful to the noble Baroness for her correspondence on this issue, informed by her work in Scotland. The test here is not whether a message makes a direct threat but whether it conveys a threat—which can certainly cover indirect or implied threats.

I reassure the noble Baroness and other noble Lords that Clause 162 already captures threats of “death or serious harm”, including rape and disfigurement, as well as messages that convey a threat of serious harm, including rape and death threats, or threats of serious injury amounting to grievous bodily harm. If a sender has the relevant intention or recklessness, the message will meet the required threshold. But I was grateful to see my right honourable friend Edward Argar watching our debates earlier, in his capacity as Justice Minister. I mentioned the matter to him and will ensure that his officials have the opportunity to speak to officials in Scotland to look at the work being done with regard to Scots law, and to follow the points that the noble Baroness, Lady Bennett, made about pictures—

Baroness Kennedy of The Shaws (Lab)

I am grateful to the Minister. I was not imagining that the formulations that I played with fulfilled all of the requirements. Of course, as a practising lawyer, I am anxious that we do not diminish standards. I thank the noble Baroness, Lady Fox, for raising concerns about freedom of speech, but this is not about telling people that they are unattractive or ugly, which is hurtful enough to many women and can have very deleterious effects on their self-confidence and willingness to be public figures. Actually, I put the bar reasonably high in describing the acts that I was talking about: threats that somebody would kill, rape, bugger or disfigure you, or do whatever to you. That was the shocking thing: the evidence showed that it was often at that high level. It is happening not just to well-known public figures, who can become somewhat inured to this because they can find a way to deal with it; it is happening to schoolgirls and young women in universities, who get these pile-ons as well. We should reckon with the fact that it is happening on a much wider basis than many people understand.

Lord Parkinson of Whitley Bay (Con)

Yes, we will ensure that, in looking at this in the context of Scots law, we have the opportunity to see what is being done there and that we are satisfied that all the scenarios are covered. In relation to the noble Baroness’s Amendment 268, the intentional encouragement or assistance of a criminal offence is already captured under Sections 44 to 46 of the Serious Crime Act 2007, so I hope that that satisfies her that that element is covered—but we will certainly look at all of this.

I turn to government Amendment 268AZA, which introduces the new serious self-harm offence, and Amendments 268AZB and 268AZC, tabled by the noble Lords, Lord Allan and Lord Clement-Jones. The Government recognise that there is a gap in the law in relation to the encouragement of non-fatal self-harm. The new offence will apply to anyone carrying out an act which intends to, and is capable of, encouraging or assisting another person seriously to self-harm by means of verbal or electronic communications, publications or correspondence.

I say to the noble Baroness, Lady Finlay of Llandaff, that the new clause inserted by Amendment 268AZA is clear that, when a person sends or publishes such a communication, that is an offence, and that, when a person forwards on another person’s communication, that will be an offence too. The new offence will capture only the most serious behaviour and avoid criminalising vulnerable people who share their experiences of self-harm. The preparation of these clauses was informed by extensive consultation with interested groups and campaign bodies. The new offence includes two key elements that constrain the offence to the most culpable offending; namely, that a person’s act must be intended to encourage or assist the serious self-harm of another person and that serious self-harm should amount to grievous bodily harm. If a person does not intend to encourage or assist serious self-harm, as will likely be the case with recovery and supportive material, no offence will be committed. The Law Commission looked at this issue carefully, following evidence from the Samaritans and others, and the implementation will be informed by an ongoing consultation as well.

Lord Clement-Jones (LD)

I am sorry to interrupt the Minister, but the Law Commission recommended that the DPP’s consent should be required. The case that the Minister has made on previous occasions in some of the consultations that he has had with us is that this offence that the Government have proposed is different from the Law Commission one, and that is why they have not included the DPP’s consent. I am rather baffled by that, because the Law Commission was talking about a high threshold in the first place, and the Minister is talking about a high threshold of intent. Even if he cannot do so now, it would be extremely helpful to tie that down. As the noble Baroness and my noble friend said, 130 organisations are really concerned about the impact of this.

Lord Parkinson of Whitley Bay (Con)

The Law Commission recommended that the consent, but not the personal consent, of the Director of Public Prosecutions should be required. We believe, however, that, because the offence already has tight parameters due to the requirement for an intention to cause serious self-harm amounting to grievous bodily harm, as I have just outlined, an additional safeguard of obtaining the personal consent of the Director of Public Prosecutions is not necessary. We would expect the usual prosecutorial discretion and guidance to provide sufficient safeguards against inappropriate prosecutions in this area. As I say, we will continue to engage with those groups that have helped to inform the drafting of these clauses as they are implemented to make sure that that assessment is indeed borne out.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I will follow up in writing on that point.

Before I conclude, I will mention briefly the further government amendments in my name, which make technical and consequential amendments to ensure that the communications offences, including the self-harm offence, have the appropriate territorial extent. They also set out the respective penalties for the communications offences in Northern Ireland, alongside a minor adjustment to the epilepsy trolling offence, to ensure that its description is more accurate.

I hope that noble Lords will agree that the new criminal laws that we will make through this Bill are a marked improvement on the status quo. I hope that they will continue to support the government amendments. I express my gratitude to the Law Commission and to all noble Lords—

Baroness Finlay of Llandaff (CB)

Just before the Minister sits down—I assume that he has finished his brief on the self-harm amendments; I have been waiting—I have two questions relating to what he said. First, if I heard him right, he said that the person forwarding on is also committing an offence. Does that also apply to those who set up algorithms that disseminate, as opposed to one individual forwarding on to another individual? Those are two very different scenarios. We can see how one individual forwarding to another could be quite targeted and malicious, and we can see how disseminating through an algorithm could have very widespread harms across a lot of people in a lot of different groups—all types of groups—but I am not clear from what he said that that has been caught in his wording.

Secondly—I will ask both questions while I can—I asked the Minister previously why there have been no prosecutions under the Suicide Act. I understood from officials that this amendment creating an offence was to reflect the Suicide Act and that suicide was not included in the Bill because it was already covered as an offence by the Suicide Act. Yet there have been no prosecutions and we have had deaths, so I do not quite understand why I have not had an answer to that.

Lord Parkinson of Whitley Bay (Con)

I will have to write on the second point to try to set that out in further detail. On the question of algorithms, the brief answer is no, algorithms would not be covered in the way a person forwarding on a communication is covered unless the algorithm has been developed with the intention of causing serious self-harm; it is the intention that is part of the test. If somebody creates an algorithm intending people to self-harm, that could be captured, but if it is an algorithm generally passing it on without that specific intention, it may not be. I am happy to write to the noble Baroness further on this, because it is a good question but quite a technical one.

Baroness Finlay of Llandaff (CB)

It needs to be addressed, because these very small websites already alluded to are providing some extremely nasty stuff. They are not providing support to people and helping decrease the amount of harm to those self-harming but seem to be enjoying the spectacle of it. We need to differentiate and make sure that we do not inadvertently let one group get away with disseminating very harmful material simply because it has a small website somewhere else. I hope that will be included in the Minister’s letter; I do not expect him to reply now.

Lord Moylan (Con)

Some of us are slightly disappointed that my noble friend did not respond to my point on the interaction of Clause 160 with the illegal content duty. Essentially, what appears to be creating a criminal offence could simply be a channel for hyperactive censorship on the part of the platforms to prevent the criminal offence taking place. He has not explained that interaction. He may say that there is no interaction and that we would not expect the platforms to take any action against offences under Clause 160, or that we expect a large amount of action, but nothing was said.

Lord Parkinson of Whitley Bay (Con)

If my noble friend will forgive me, I had better refresh my memory of what he said—it was some time ago—and follow up in writing.

Lord Clement-Jones (LD)

My Lords, I will be extremely brief. There is much to chew on in the Minister’s speech and this was a very useful debate. Some of us will be happier than others; the noble Baroness, Lady Buscombe, will no doubt look forward to the digital markets Bill and I will just have to keep pressing the Minister on the Data Protection and Digital Information Bill.

There is a fundamental misunderstanding about digital identity theft. It will not necessarily always be fraud that is demonstrated—the very theft of the identity is designed to be the crime, and it is not covered by the Fraud Act 2006. I am delighted that the Minister has agreed to talk further with the noble Baroness, Lady Kennedy, because that is a really important area. I am not sure that my noble friend will be that happy with the response, but he will no doubt follow up with the Minister on his amendments.

The Minister made a very clear statement on the substantive aspect of the group, the new crime of encouraging self-harm, but further clarification is still needed. We will look very carefully at what he said in relation to what the Law Commission recommended, because it is really important that we get this right. I know that the Minister will talk further with the noble Baroness, Lady Finlay, who is very well versed in this area. In the meantime, I beg leave to withdraw my amendment.

--- Later in debate ---
Moved by
267A: Clause 160, page 138, line 25, leave out from “liable” to end of line 27 and insert “—
(a) on summary conviction in England and Wales, to imprisonment for a term not exceeding the maximum term for summary offences or a fine (or both);
(b) on summary conviction in Northern Ireland, to imprisonment for a term not exceeding 6 months or a fine not exceeding level 5 on the standard scale (or both).”
Member’s explanatory statement
This amendment sets out the penalties for the false communications offence in Northern Ireland, since the offence is now to extend to Northern Ireland as well as England and Wales.
--- Later in debate ---
Moved by
267B: Clause 162, page 139, line 38, after “conviction” insert “in England and Wales”
Member’s explanatory statement
This amendment adds a reference to England and Wales to differentiate the provision from the similar provision applying to Northern Ireland (see the next amendment in the Minister’s name).
--- Later in debate ---
Moved by
268A: Clause 164, page 142, line 30, leave out subsection (14)
Member’s explanatory statement
This is a technical amendment about extent - the extent of the epilepsy trolling offence in clause 164 is now dealt with by amendments of clause 210 (see the amendments of that clause in the Minister’s name).
--- Later in debate ---
Moved by
268AZA: After Clause 164, insert the following new Clause—
“Offence of encouraging or assisting serious self-harm
(1) A person (D) commits an offence if—
(a) D does a relevant act capable of encouraging or assisting the serious self-harm of another person, and
(b) D’s act was intended to encourage or assist the serious self-harm of another person.
(2) D “does a relevant act” if D—
(a) communicates in person,
(b) sends, transmits or publishes a communication by electronic means,
(c) shows a person such a communication,
(d) publishes material by any means other than electronic means,
(e) sends, gives, shows or makes available to a person—
(i) material published as mentioned in paragraph (d), or
(ii) any form of correspondence, or
(f) sends, gives or makes available to a person an item on which data is stored electronically.
(3) “Serious self-harm” means self-harm amounting to—
(a) in England and Wales and Northern Ireland, grievous bodily harm within the meaning of the Offences Against the Person Act 1861, and
(b) in Scotland, severe injury,
and includes successive acts of self-harm which cumulatively reach that threshold.
(4) The person referred to in subsection (1)(a) and (b) need not be a specific person (or class of persons) known to, or identified by, D.
(5) D may commit an offence under this section whether or not serious self-harm occurs.
(6) If a person (D1) arranges for a person (D2) to do an act that is capable of encouraging or assisting the serious self-harm of another person and D2 does that act, D1 is to be treated as also having done it.
(7) In the application of subsection (1) to an act by D involving an electronic communication or a publication in physical form, it does not matter whether the content of the communication or publication is created by D (so for example, in the online context, the offence under this section may be committed by forwarding another person’s direct message or sharing another person’s post).
(8) In the application of subsection (1) to the sending, transmission or publication by electronic means of a communication consisting of or including a hyperlink to other content, the reference in subsection (2)(b) to the communication is to be read as including a reference to content accessed directly via the hyperlink.
(9) In the application of subsection (1) to an act by D involving an item on which data is stored electronically, the reference in subsection (2)(f) to the item is to be read as including a reference to content accessed by means of the item to which the person in receipt of the item is specifically directed by D.
(10) A provider of an internet service by means of which a communication is sent, transmitted or published is not to be regarded as a person who sends, transmits or publishes it.
(11) Any reference in this section to doing an act that is capable of encouraging the serious self-harm of another person includes a reference to doing so by threatening another person or otherwise putting pressure on another person to seriously self-harm.
“Seriously self-harm” is to be interpreted consistently with subsection (3).
(12) Any reference to an act in this section, except in subsection (3), includes a reference to a course of conduct, and references to doing an act are to be read accordingly.
(13) In subsection (3) “act” includes omission.
(14) A person who commits an offence under this section is liable—
(a) on summary conviction in England and Wales, to imprisonment for a term not exceeding the general limit in a magistrates’ court or a fine (or both);
(b) on summary conviction in Scotland, to imprisonment for a term not exceeding 12 months or a fine not exceeding the statutory maximum (or both);
(c) on summary conviction in Northern Ireland, to imprisonment for a term not exceeding 6 months or a fine not exceeding the statutory maximum (or both);
(d) on conviction on indictment, to imprisonment for a term not exceeding 5 years or a fine (or both).”
Member’s explanatory statement
This amendment inserts a new offence of encouraging or assisting another person to seriously self-harm, with intent to do so, by means of verbal or electronic communications, publications or correspondence.
--- Later in debate ---
Moved by
268B: Clause 165, page 142, line 32, leave out subsections (1) and (2)
Member’s explanatory statement
This amendment omits provisions which relate to offences that extended to England and Wales only, as the offences in question are now to extend to Northern Ireland as well.
--- Later in debate ---
Moved by
268FA: Clause 166, page 143, line 10, leave out “or 164” and insert “, 164 or (Offence of encouraging or assisting serious self-harm)”
Member’s explanatory statement
This amendment ensures that clause 166, which is about the liability of corporate officers for offences, applies in relation to the new offence proposed by the amendment in the Minister’s name to be inserted after clause 164.
--- Later in debate ---
Moved by
271A: Clause 168, page 144, line 17, after “Wales” insert “and Northern Ireland”
Member’s explanatory statement
This amendment ensures that section 127(2)(a) and (b) of the Communications Act 2003 is repealed for Northern Ireland as well as England and Wales (because the false communications offence in clause 160 is now to extend to Northern Ireland as well).
--- Later in debate ---
Moved by
271BA: Clause 169, page 144, line 25, at end insert—
“(1A) Part 1A of Schedule 14 contains amendments consequential on section (Offence of encouraging or assisting serious self-harm).”
Member’s explanatory statement
This amendment introduces a Part of Schedule 14 containing consequential amendments related to the new offence proposed by the amendment in the Minister’s name to be inserted after clause 164.
--- Later in debate ---
Moved by
271C: Schedule 14, page 231, line 33, leave out from “2003” to “after” in line 34 and insert “, in the list of offences for England and Wales,”
Member’s explanatory statement
This amendment makes it clearer that changes to the Sexual Offences Act 2003 in paragraph 2 of Schedule 14 to the Bill relate to England and Wales only (since the next amendment in the Minister’s name makes equivalent amendments for Northern Ireland).
--- Later in debate ---
Moved by
283A: Clause 171, page 145, line 43, at end insert “, and
(b) judgements by providers about whether news publisher content amounts to a relevant offence (see section 14(5) and (10)).”
Member’s explanatory statement
This amendment, in effect, re-states the provision currently in clause 14(11), requiring OFCOM’s guidance under clause 171 to cover the judgements described in the amendment.
--- Later in debate ---
Moved by
284A: After Clause 176, insert the following new Clause—
“Offence of failure to comply with confirmation decision: supplementary
(1) Where a penalty has been imposed on a person by a penalty notice under section 126 in respect of a failure constituting an offence under section (Confirmation decisions: offence)(failure to comply with certain requirements of a confirmation decision), no proceedings may be brought against the person for that offence.
(2) A penalty may not be imposed on a person by a penalty notice under section 126 in respect of a failure constituting an offence under section (Confirmation decisions: offence) if—
(a) proceedings for the offence have been brought against the person but have not been concluded, or
(b) the person has been convicted of the offence.
(3) Where a service restriction order under section 131 or an access restriction order under section 133 has been made in relation to a regulated service provided by a person in respect of a failure constituting an offence under section (Confirmation decisions: offence), no proceedings may be brought against the person for that offence.”
Member’s explanatory statement
This amendment ensures, among other things, that a person cannot be prosecuted for the new offence created by the new clause to be inserted after clause 125 in the Minister’s name if OFCOM have imposed a financial penalty for the same conduct instead, and vice versa.
--- Later in debate ---
Moved by
284B: Clause 180, page 150, line 23, leave out “Section 121(7)” and insert “Sections 121(7) and 137(11)”
Member’s explanatory statement
This amendment adds a reference to clause 137(11) so that that provision (which is about enforcement by civil proceedings) has extra-territorial application.
--- Later in debate ---
Moved by
284D: Clause 181, page 150, line 29, at end insert—
“(2A) Section (Confirmation decisions: offence) applies to acts done by a person in the United Kingdom or elsewhere (offence of failure to comply with confirmation decision).”
Member’s explanatory statement
This amendment gives wide extra-territorial effect to the new offence created by the new clause to be inserted after clause 125 in the Minister’s name (failure to comply with certain requirements of a confirmation decision).
--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, this is a real hit-and-run operation from the noble Lord, Lord Stevenson. He has put down an amendment on my favourite subject in the last knockings of the Bill. It is totally impossible to deal with this now—I have been thinking and talking about the whole area of AI governance and ethics for the past seven years—so I am not going to try. It is important, and the advisory committee under Clause 139 should take it into account. Actually, this is much more a question of authenticity and verification than of content. Trying to work out whether something is ChatGPT or GPT-4 content is a hopeless task; you are much more likely to be able to identify whether these are automated users such as chatbots than you are to know about the content itself.

I will leave it there. I missed the future-proofing debate, which I would have loved to have been part of. I look forward to further debates with the noble Viscount, Lord Camrose, on the deficiencies in the White Paper and to the Prime Minister’s much more muscular approach to AI regulation in future.

Lord Parkinson of Whitley Bay (Con)

I am sure that the noble Lord, Lord Stevenson of Balmacara, is smiling over a sherry somewhere about the debate he has facilitated. His is a useful probing amendment and we have had a useful discussion.

The Government certainly recognise the potential challenges posed by artificial intelligence and digitally manipulated content such as deepfakes. As we have heard in previous debates, the Bill ensures that machine-generated content on user-to-user services created by automated tools or machine bots will be regulated where appropriate. Clause 49(4)(b) means that machine-generated content is regulated unless the bot or automated tool producing the content is controlled by the provider of the service.

The labelling of this content via draft legislation is not something to which I can commit today. The Government’s AI regulation White Paper sets out the principles for the responsible development of artificial intelligence in the UK. These principles, such as safety, transparency and accountability, are at the heart of our approach to ensuring the responsible development and use of AI. As set out in the White Paper, we are building an agile approach that is designed to be adaptable in response to emerging developments. We do not wish to introduce a rigid, inflexible form of legislation for what is a flexible and fast-moving technology.

The public consultation on these proposals closed yesterday so I cannot pre-empt our response to it. The Government’s response will provide an update. I am joined on the Front Bench by the Minister for Artificial Intelligence and Intellectual Property, who is happy to meet with the noble Baroness, Lady Kidron, and others before the next stage of the Bill if they wish.

Beyond labelling such content, I can say a bit to make it clear how the Bill will address the risks coming from machine-generated content. The Bill already deals with many of the most serious and illegal forms of manipulated media, including deepfakes, when they fall within scope of services’ safety duties regarding illegal content or content that is potentially harmful to children. Ofcom will recommend measures in its code of practice to tackle such content, which could include labelling where appropriate. In addition, the intimate image abuse amendments that the Government will bring forward will make it a criminal offence to send deepfake images.

In addition to ensuring that companies take action to keep users safe online, we are taking steps to empower users with the skills they need to make safer choices through our work on media literacy. Ofcom, for example, has an ambitious programme of work through which it is funding several initiatives to build people’s resilience to harm online, including initiatives designed to equip people with the skills to identify disinformation. We are keen to continue our discussions with noble Lords on media literacy and will keep an open mind on how it might be a tool for raising awareness of the threats of disinformation and inauthentic content.

With gratitude to the noble Lords, Lord Stevenson and Lord Knight, and everyone else, I hope that the noble Lord, Lord Knight, will be content to withdraw his noble friend’s amendment.

Lord Knight of Weymouth (Lab)

My Lords, I am grateful to everyone for that interesting and quick debate. It is occasionally one’s lot that somebody else tables an amendment but is unavoidably detained in Jerez, drinking sherry, and monitoring things in Hansard while I move the amendment. I am perhaps more persuaded than my noble friend might have been by the arguments that have been made.

We will return to this in other fora in response to the need to regulate AI. However, in the meantime, I enjoyed in particular the John Booth quote from the noble Baroness, Lady Bennett. In respect of this Bill and any of the potential harms around generative AI, if we have a Minister who is mindful of the need for safety by design when we have concluded this Bill then we will have dealt with the bits that we needed to deal with as far as this Bill is concerned.

--- Later in debate ---
Moved by
286A: Schedule 17, page 239, line 36, after “19(2)” insert “and (8A)”
Member’s explanatory statement
This amendment ensures that, during the transitional period when video-sharing platform services continue to be regulated by Part 4B of the Communications Act 2003, providers of such services are not exempt from the new duty in clause 19 to supply records of risk assessments to OFCOM.
--- Later in debate ---
Moved by
286B: Clause 188, page 154, line 1, after “119(10)” insert “and (11)”
Member’s explanatory statement
This amendment effects the repeal of a provision of the Digital Economy Act 2017 which solely relates to another provision of that Act being repealed.
--- Later in debate ---
Moved by
290A: Clause 202, page 166, line 3, leave out “moderation” and insert “identification”
Member’s explanatory statement
This amendment re-names “content moderation technology” as “content identification technology” as that term is more accurate.
--- Later in debate ---
Moved by
290H: Clause 203, page 167, line 38, at end insert “, or
(ii) users of another internet service.”
Member’s explanatory statement
This amendment concerns the factors that OFCOM must particularly consider when deciding if content is communicated publicly or privately. The change ensures that one such factor is how easily the content may be shared with users of another service.
--- Later in debate ---
Lord Knight of Weymouth (Lab)

My Lords, what more can I say than that I wish to be associated with the comments made by the noble Baroness and then by the noble Lord, Lord Clement-Jones? I look forward to the Minister’s reply.

Lord Parkinson of Whitley Bay (Con)

I am very grateful to the noble Baroness for her amendment, which is a useful opportunity for us to state publicly and share with the Committee the progress we have been making in our helpful discussions on these issues in relation to these amendments. I am very grateful to her and to my noble friends Lord Bethell and Lady Harding for speaking as one on this, including, as is well illustrated, in this short debate this evening.

As the noble Baroness knows, discussions continue on the precise wording of these definitions. I share her optimism that we will be able to reach agreement on a suitable way forward, and I look forward to working with her, my noble friends and others as we do so.

The Bill already includes a definition of age assurance in Clause 207, which is

“measures designed to estimate or verify the age or age-range of users of a service”.

As we look at these issues, we want to avoid using words such as “checking”, which suggests that providers need to take a proactive approach to checking age, as that may inadvertently preclude the use of technologies which determine age through other means, such as profiling. It is also important that any definition of age assurance does not restrict the current and future use of innovative and accurate technologies. I agree that it is important that there should be robust definitions for terms which are not currently defined in the Bill, such as age verification, and recommit to the discussions we continue to have on what terms need to be defined and the best way to define them.

This has been a very helpful short debate with which to end our deliberations in Committee. I am very grateful to noble Lords for all the points that have been raised over the past 10 days, and I am very glad to be ending in this collaborative spirit. There is much for us still to do, and even more for the Office of the Parliamentary Counsel to do, before we return on Report, and I am grateful to it and to the officials working on the Bill. I urge the noble Baroness to withdraw her amendment.

Baroness Kidron (CB)

I beg leave to withdraw the amendment.

--- Later in debate ---
Moved by
304A: Clause 210, page 175, line 24, leave out “Except as provided by subsections (2) to (7)” and insert “Subject to the following provisions of this section”
Member’s explanatory statement
This amendment avoids any implication that the power proposed to be inserted by the amendment of the extent clause in the Minister’s name giving power to extend provisions of the Bill to the Crown Dependencies, and related provisions, are limited in extent to the United Kingdom.
--- Later in debate ---
Moved by
304CA: Clause 210, page 175, line 29, leave out subsection (3) and insert—
“(3) The following provisions extend to England and Wales and Northern Ireland—(a) sections 160 to 164;(b) section 168(1).”
Member’s explanatory statement
This amendment revises the extent clause as a result of changes to the extent of the communications offences in Part 10 of the Bill.
--- Later in debate ---
Moved by
304E: Clause 210, page 175, line 35, leave out subsection (6) and insert—
“(6) The following provisions extend to Northern Ireland only—(a) section 168(3);(b) section 190(7) to (9).”
Member’s explanatory statement
This amendment revises the extent clause so that the amendments of Northern Ireland legislation in clause 168 extend to Northern Ireland only.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Moved by
1: Before Clause 1, insert the following new Clause—
“Introduction
(1) This Act provides for a new regulatory framework which has the general purpose of making the use of internet services regulated by this Act safer for individuals in the United Kingdom.(2) To achieve that purpose, this Act (among other things)—(a) imposes duties which, in broad terms, require providers of services regulated by this Act to identify, mitigate and manage the risks of harm (including risks which particularly affect individuals with a certain characteristic) from—(i) illegal content and activity, and(ii) content and activity that is harmful to children, and(b) confers new functions and powers on the regulator, OFCOM.(3) Duties imposed on providers by this Act seek to secure (among other things) that services regulated by this Act are—(a) safe by design, and(b) designed and operated in such a way that—(i) a higher standard of protection is provided for children than for adults,(ii) users’ rights to freedom of expression and privacy are protected, and(iii) transparency and accountability are provided in relation to those services.”
Member’s explanatory statement
This amendment provides for a new introductory Clause.
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- Hansard - -

My Lords, I am pleased that we are on Report, and I thank all noble Lords who took part in Committee and those with whom I have had the pleasure of discussing issues arising since then, particularly for their constructive and collaborative nature, which we have seen throughout the passage of the Bill.

In Committee, I heard the strength of feeling and the desire for an introductory clause. It was felt that this would help make the Bill less complex to navigate and make it less easy for providers to use this complexity to try to evade their duties under it. I have listened closely to these concerns and thank the noble Lord, Lord Stevenson of Balmacara, the noble Baroness, Lady Merron, and others for their work on this proposal. I am particularly grateful for their collaborative approach to ensuring the new clause has the desired effect without causing legal uncertainty. In that spirit, I am pleased to introduce government Amendment 1. I am grateful too to the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, who have signed their names to it. That is a very good start to our amendments here on Report.

Amendment 1 inserts an introductory clause at the start of the Bill, providing an overarching statement about the main objectives of the new regulatory framework. The proposed new clause describes the main broad objectives of the duties that the Bill imposes on providers of regulated services and notes that the Bill confers new functions and powers on Ofcom.

The clause makes clear that regulated services must identify, mitigate and manage risks that particularly affect people with a certain characteristic. This recognises that people with certain characteristics, or more than one such characteristic, are disproportionately affected by online harms and that providers must account for and protect them from this. The noble Baroness, Lady Merron, raised the example of Jewish women, as did the noble Baroness, Lady Anderson of Stoke-on-Trent. Sadly, they have first-hand experience of the extra levels of abuse and harm that some groups of people can face when they have more than one protected characteristic. It could just as easily be disabled women or queer people of colour. The noble Baroness, Lady Merron, has tabled several amendments highlighting this problem, which I will address further in response to the contribution I know she will make to this debate.

Subsection 3 of the proposed new clause outlines the main outcomes that the duties in the Bill seek to secure. It is a fundamental principle of the legislation that the design of services can contribute to the risk of users experiencing harm online. I thank the noble Lord, Lord Russell of Liverpool, for continuing to raise this issue. I am pleased to confirm that this amendment will state clearly that a main outcome of the legislation is that services must be safe by design. For example, providers must choose and design their functionalities so as to limit the risk of harm to users. I know this is an issue to which we will return later on Report, but I hope this provides reassurance about the Government’s intent and the effect of the Bill’s framework.

Services must also be designed and operated in a way which ensures that a higher standard of protection is provided for children than for adults, that users’ rights to freedom of expression and privacy are protected and that transparency and accountability are enhanced. It should be noted that we have worked to ensure that this clause provides clarity to those affected by the Bill without adversely affecting the interpretation or effect of the substantive provisions of the rest of the Bill. As we debated in Committee, this is of the utmost importance, to ensure that this clause does not create legal uncertainty or risk with the interpretation of the rest of the Bill’s provisions.

I hope that your Lordships will welcome this amendment and I beg to move.

Amendment 2 (to Amendment 1)

Moved by
--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, needless to say, I disagree with what the noble Lord, Lord Moylan, has just been saying precisely because I believe that the new clause that the Minister has put forward, which I have signed and has support across the House, expresses the purpose of the Bill in the way that the original Joint Committee wanted. I pay tribute to the Minister, who I know has worked extremely hard, in co-operation with the noble Lord, Lord Stevenson of Balmacara, to whom I also pay tribute for getting to grips with a purpose clause. The noble Baronesses, Lady Kidron and Lady Harding, have put their finger on it: this is more about activity and design than it is about content, and that is the reason I fundamentally disagree with the noble Lord, Lord Moylan. I do not believe that will be the impact of the Bill; I believe that this is about systemic issues to do with social media, which we are tackling.

I say this slightly tongue-in-cheek, but if the Minister had followed the collective wisdom of the Joint Committee originally, perhaps we would not have worked at such breakneck speed to get everything done for Report stage. I believe that the Bill team and the Minister have worked extremely hard in a very few days to get to where we are on many amendments that we will be talking about in the coming days.

I also want to show my support for the noble Baroness, Lady Merron. I do not believe it is just a matter of the Interpretation Act; I believe this is a fundamental issue and I thank her for raising it, because it was not something that was immediately obvious. The fact is that a combination of characteristics is a particular risk in itself; it is not just about having several different characteristics. I hope the Minister reflects on this and can give a positive response. That will set us off on a very good course for the first day of Report.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, this has indeed set us on a good course, and I am grateful to noble Lords for their questions and contributions. I apologise to my noble friend Lord Moylan, with whom I had the opportunity to discuss a number of issues relating to freedom of expression on Monday. We had tabled this amendment, and I apologise if I had not flagged it and sought his views on it explicitly, though I was grateful to him and the noble Baroness, Lady Fox of Buckley, for their time in discussing the issues of freedom of expression more broadly.

I am grateful to my noble friend Lady Harding and to the noble Baroness, Lady Kidron, for their tireless work over many months on this Bill and for highlighting the importance of “content” and “activity”. Both terms have been in the Bill since its introduction, for instance in Clauses 5(2) and (3), but my noble friend Lady Harding is right to highlight it in the way that she did. The noble Baroness, Lady Kidron, asked about the provisions on safety by design. The statement in the new clause reflects the requirements throughout the Bill to address content and activity and ensure that services are safe by design.

On the amendments tabled by the noble Baroness, Lady Merron, which draw further attention to people who have multiple characteristics and suffer disproportionately because of it, let me start by saying again that the Government recognise that this is, sadly, the experience for many people online, and that people with multiple characteristics are often at increased risk of harm. The Bill already accounts for this, and the current drafting captures people with multiple characteristics because of Section 6 of the Interpretation Act 1978. As she says, this was a new one to me—other noble Lords may be more familiar with this legacy of the Callaghan Government—but it does mean that, when interpreting statute, words in the singular include the plural and words in the plural include the singular.

If we simply amended the references that the noble Baroness highlights in her amendments, we would risk some uncertainty about what those provisions cover. I sympathise with the concern which lies behind her amendments, and I am grateful for her time in discussing this matter in detail. I agree that it would be helpful to make it clearer that the Bill is designed to protect people with multiple characteristics. This clause is being inserted to give clarity, so we should seek to do that throughout.

We have therefore agreed to add a provision in Clause 211—the Bill’s interpretation clause—to make clear that all the various references throughout the Bill to people with a certain characteristic include people with a combination of characteristics. This amendment was tabled yesterday and will be moved at a later day on Report, so your Lordships’ House will have an opportunity to look at and vote on that. I hope that that provision clarifies the intention of the wording used in the Bill and puts the issue beyond doubt. I hope that the noble Baroness will be satisfied, and I am grateful to all noble Lords for their support on this first amendment.

Baroness Merron Portrait Baroness Merron (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I am grateful to the Minister for his response. It is a very practical response and certainly one that I accept as a way forward. I am sure that the whole House is glad to hear of his acknowledgement of the true impact that having more than one protected characteristic can have, and of his commitment to wanting the Bill to do the job it is there to do. With that, I am pleased to withdraw the amendment in my name.

--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, this has been an interesting debate that in a curious way moves us from the debate on the first group, which was about the high level of aspiration for this Bill, for the work of those involved in it and indeed for Parliament as a whole, down to some of the nitty-gritty points that emerge from some of the Bill’s proposals. I am very much looking forward to the Minister’s response.

In a sense, where the noble Lord, Lord Clement-Jones, ends, I want to start. The noble and learned Lord, Lord Garnier, did a good job of introducing the points made previously by his colleague, the noble Baroness, Lady Buscombe, in relation to those unfortunate exercises of public comment on businesses, and indeed individuals, that have no reason to receive them. There does not seem to be a satisfactory sanction for that. In a sense he was drawn by the overarching nature of Clause 1, but I think we have established between us that Clause 1 does not have legal effect in the way that he would like, so we would probably need to move further forward. The Government probably need to pick up his points in relation to some of the issues that are raised further down, because they are in fact not dissimilar and could be dealt with.

The key issue is the one that my noble friend Lady Kennedy ended on, in the sense that the law online and the law offline, as mentioned by the noble Lord, Lord Clement-Jones, seem to be at variance about what you can and cannot do in relation to threats issued, whether or not they are general, to a group or groups in society. This is a complex area that needs further thought of the nature that has been suggested, and may well refer back to the points made by the noble Baroness, Lady Morgan. There is something here that we are not tackling correctly. I look forward to the Government’s response. We would support movement in that area should that agreement be made.

Unfortunately, the noble Lord, Lord Russell, whom I am tempted to call my noble friend because he is a friend, has just moved out of his seat—I do not need to give him a namecheck any more—but he and I went to a meeting yesterday, I think, although I have lost track of time. It was called by Luke Pollard MP and related to the incel movement or, as the meeting concluded, what we should call the alleged incel movement, because by giving it a name we somehow give it a position. I wanted to make that point because a lot of what we are talking about here is in the same territory. It was an informal research-focused meeting to hear all the latest research being done on the group of activities going under the name of the alleged incel movement.

I mention that because it plays into a lot of the discussion here. The way in which those who organise it do so—the name Andrew Tate has already been mentioned—was drawn into the debate in a much broader context by that research, particularly because representatives from the Home Office made the interesting point that the process by which the young men who are involved in this type of activity are groomed to join groups and are told that by doing so they are establishing a position that has been denied to them by society in general, and allegedly by women in particular, is very similar to the methods used by those who are cultivating terrorism activity. That may seem to be a big stretch but it was convincing, and the argument and debate around that certainly said to me that there are things operating within the world of social media, with its ability to reach out to those who often feel alone, even if they are not, and who feel ignored, and to reach them in a way that causes them to overreact in the way they deal with the issues they face.

That point was picked up by others, including my noble friend Lady Kennedy and the noble Baroness, Lady Burt, in relation to the way in which the internet itself is in some way gendered against women. I do not in any sense want to apportion blame anywhere for that; it is a much more complex issue than single words can possibly address, but it needs to be addressed. As was said in the meeting and has been said today, there are cultural, educational and holistic aspects here. We really do not tackle the symptoms or the effects of it, but we should also look at what causes people to act in the way they have because of, or through the agency of, the internet.

Having said that, I support the amendments from the noble Lord, Lord Allan, and I look forward to the Government’s response to them. Amendment 5B raises the issue that it will be detrimental to society if people stop posting and commenting on things because they fear that they will be prosecuted—or not even prosecuted but attacked. The messages that they want to share will be lost as a result, and that is a danger that we do not want to encourage. It will be interesting to hear the Minister’s response to that.

The noble Baroness, Lady Burt, made powerful points about the way in which the offence of cyberflashing is going to be dealt with, and the differences between that and the intimate image abuse that we are coming on to in the next group. It may well be that this is the right way forward, and indeed we support the Government in the way that they are going, but it is important to recognise her point that we need a test of whether it is working. The Government may well review the impact of the Bill in the normal way of things, but this aspect needs particular attention; we need to know whether there are prosecutions and convictions and whether people understand the implication of the change in practice. We need publicity, as has been said, otherwise it will not be effective in any case. These issues, mentioned by the noble Baroness, Lady Burt, and picked up by the noble Baroness, Lady Morgan, are important. We will have other opportunities to discuss them, but at this stage we should at least get a response to that.

If it is true that in Northern Ireland there is now a different standard for the way in which cyberflashing offences are to be undertaken—taking into account the points made very well by the noble Baroness, Lady Fox, and the worry about encouraging more offences for which crimes may not necessarily be appropriate at this stage, particularly the one about recklessness—do the Government not have a slight problem here? In the first case, do we really accept that we want differences between the various regions and nations of our country in these important issues? We support devolution but we also need to have a sense of what the United Kingdom as a whole stands for in its relationship with these types of criminal offence, if they are criminal. If that happens, do we need a better understanding of why one part of the country has moved in a particular way, and is that something that we are missing in picking up action that is perhaps necessary in other areas? As my noble friend Lady Kennedy has also said, some of the work she has been doing in Scotland is ahead of the work that we have been doing in this part of the United Kingdom, and we need to pick up the lessons from that as well.

As I said at the beginning, this is an interesting range of amendments. They are not as similar as the grouping might suggest, but they point in a direction that needs government attention, and I very much look forward to the Minister’s comments on them.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I am grateful to my noble friends Lady Buscombe and Lord Leicester and my noble and learned friend Lord Garnier for the amendments that they have tabled, with which we began this helpful debate, as well as for their time earlier this week to discuss them. We had a good debate on this topic in Committee and I had a good discussion with my noble friend Lady Buscombe and my noble and learned friend Lord Garnier on Monday. I will explain why the Government cannot accept the amendments that they have brought forward today.

I understand my noble friends’ concerns about the impact that fake reviews can have on businesses, but the Bill and the criminal offences it contains are not the right place to address this issue. The amendments would broaden the scope of the offences and likely result in overcriminalisation, which I know my noble friends would not want to see.

--- Later in debate ---
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - - - Excerpts

I appreciate the Minister’s response. Could he also respond to my suggestion that it would be helpful for some of the people working on the front line to meet officials to go through their concerns in more detail?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

I am very happy to make that commitment. It would be useful to have their continued engagement, as we have had throughout the drafting of the Bill.

The noble Baroness, Lady Burt of Solihull, has tabled a number of amendments related to the new offence of cyberflashing. I will start with her Amendment 6. We believe that this amendment reduces the threshold of the new offence to too great an extent. It could, for example, criminalise a person sending a picture of naked performance art to a group of people, where one person might be alarmed by the image but the sender sends it anyway because he or she believes that it would be well received. That may be incorrect, unwise and insensitive, but we do not think it should carry the risk of being convicted of a serious sexual offence.

Crucially, the noble Baroness’s amendment requires that the harm against the victim be proven in court. Not only does this add an extra step for the prosecution to prove in order for the perpetrator to be convicted, but it also creates an undue burden on the victim, who would be cross-examined about his or her—usually her—experience of harm. For example, she might have to explain why she felt humiliated; this in itself could be retraumatising and humiliating for the victim. By contrast, Clause 170 as drafted means that the prosecution has only to prove and focus on the perpetrator’s intent.

Baroness Burt of Solihull Portrait Baroness Burt of Solihull (LD)
- View Speech - Hansard - - - Excerpts

I am very grateful for the Minister’s comments. This is the crux of my confusion: I am not entirely sure why it is necessary for the victim to appear in court. In intimate image abuse, is it not the case that the victim does not have to make an appearance in court? What is the difference between intimate image abuse and cyberflashing abuse? I do not get why one attracts a physical court appearance and the other does not. They seem to be different sides of the same coin to me.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

If a defendant said that he—usually he—had sent an image believing that the consent of the recipient was implied, the person making the complaint would be cross-examined on whether or not she had indeed given that consent. If an offence predicated on proof of non-consent or proof of harm were made out, the victim could be called to give evidence and be cross-examined in court. The defence would be likely to lead evidence challenging the victim’s characteristics and credibility. We do not want that to be a concern for victims; we do not want that to be a barrier to victims coming forward and reporting abuse for fear of having their sexual history or intentions cross-examined.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, we are coming to this in the next group, but that is a consent-based offence, is it not?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

It is—and I shall explain more in that group why we take that approach. But the offence of cyberflashing matches the existing offence of flashing, which is not a consent-based offence. If somebody flashes at someone in public, it does not matter whether the person who sees that flashing has consented to it—it is the intent of the flasher that is the focus of the court. That is why the Law Commission and we have brought the cyberflashing offence forward in the same way, whereas the sharing of intimate images without somebody’s consent relies on the consent to sharing. But I shall say a bit more when we get to that group, if the noble Lord will allow.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

I am sure that the noble and learned Lord, Lord Garnier, is going to come in, and he knows a great deal more about this than I do. But we are getting into the territory where we talk about whether or not somebody needs to appear in court in order to show consent. That was all that I was trying to point out, in a way—that, if the Minister accepted the amendment on behalf of my noble friend, and then the complainant had to appear in court, why is that not the case with intimate abuse?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

Perhaps I can respond to the point about intimate abuse when we come on to the next group—that might be helpful.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

It might be helpful—except for the refusal to accept my noble friend’s amendment.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

If the defendant said that they had sent an image because they thought that consent had been obtained, the person whose consent was under question would find themselves cross-examined on it in a way that we do not want to see. We do not want that to be a barrier to people reporting this, in the same way that it is not for people who report flashing on the streets.

Lord Garnier Portrait Lord Garnier (Con)
- Hansard - - - Excerpts

My Lords, I do not want to interfere in private grief, but the courts have powers to protect witnesses, particularly in cases where they are vulnerable or will suffer acute distress, by placing screens in the way and controlling the sorts of cross-examinations that go on. I accept the concern expressed by the noble Baroness, Lady Burt, but I think that my noble friend the Minister will be advised that there are protective measures in place already for the courts to look after people of the sort that she is worried about.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

There are indeed but, as my noble and learned friend’s interjection makes clear, those are still means for people to be cross-examined and give their account in court, even with those mitigations and protections. That is really the crux of the issue here.

We have already debated the risk that the approach the noble Baroness sets out in her Amendments 5C and 7A criminalises the sending of messages by people whom we would not deem to be criminal. I want to reassure her and your Lordships’ House that the intent-based offence, as drafted at Clause 170, provides the comprehensive protections for victims that we all want to see, including situations where the perpetrator claims it was “just for a joke”. The offence is committed if a perpetrator intended to cause humiliation, and that captures many supposed “joke” motives, as the perverted form of humour in this instance is often derived from the victim’s humiliation, alarm or distress.

Indeed, it was following consultation with victims’ groups and others that the Law Commission added humiliation as a form of intent to the offence to address those very concerns. Any assertions made by a defendant in this regard would not be taken at face value but would be considered and tested by the police and courts in the usual way, alongside the evidence. The Crown Prosecution Service and others are practised in prosecuting intent, and juries and magistrates may infer intention from the context of the behaviour and its foreseeable consequences.

The addition of defences, as the noble Baroness suggests in her Amendment 7A, is unfortunately still not sufficient to ensure that we are not overcriminalising here. Even with the proposed defences, sending a picture of genitalia without consent for medical reasons would still risk being considered a criminal act and potentially compel a medical professional to justify that he or she has an adequate defence.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

It is about the burden on medical professionals and the question of whether it comes to court when the police investigate it and the prosecution make out the case. We do not want to see that sort of behaviour being overly criminalised, or the risk of prosecution hanging over people where it is not needed. We want to make sure that the offence is focused on the behaviour that we all want to tackle here.

The Law Commission has looked at this extensively—and I am glad the noble Baroness has had the opportunity to speak to it directly—and brought forward these proposals, which mirror the offence of flashing that already exists in criminal law. We think that is the right way of doing it and not risking the overcriminalisation of those whom noble Lords would not want to capture.

Contrary to some concerns that have been expressed, the onus is never on the victim to marshal evidence or prove the intent of the perpetrator. It is for the police and the Crown Prosecution Service when investigating the alleged offence or prosecuting the case in court. That is why we and the Law Commission consulted the police and the CPS extensively in bringing the offence forward.

By contrast, as I say, the consent-based approach is more likely to put onerous pressure on the victim by focusing the case on his or her behaviour and sexual history instead of the behaviour of the perpetrator. I know and can tell from the interjections that noble Lords still have some concerns or questions about this offence as drafted. I reassure them, as my noble friend Lady Morgan of Cotes urged, that we will be actively monitoring and reviewing the implementation of this offence, along with the Crown Prosecution Service and the police, to ensure that it is working effectively and bringing perpetrators to justice.

The noble Baroness, Lady Burt, also raised the importance of public engagement and education in this regard. As she may know, the Government have a long-term campaign to tackle violence against women and girls. The Enough campaign covers a range of online and offline forms of abuse, including cyberflashing. The campaign includes engaging with the public to deepen understanding of this offence. It focuses on educating young people about healthy relationships, on targeting perpetrators and on ensuring that victims of violence against women and girls can access support. Future phases of the Enough campaign will continue to highlight the abusive nature and unacceptability of these behaviours, and methods for people safely to challenge them.

In addition, in our tackling violence against women and girls strategy, the Government have committed to invest £3 million to understand better what works to prevent violence against women and girls and to invest in high-quality, evidence-informed prevention projects, including in schools, aiming to educate and inform children and young people about violence against women and girls, healthy relationships and the consequences of abuse.

With that commitment to keep this under review—to ensure that it is working in the way that the Law Commission and the Government hope and expect it to—and with that explanation of the way we will be encouraging the public to know about the protections that are there through the law and more broadly, I hope noble Lords will be reassured and will not press their amendments.

Baroness Kennedy of Shaws Portrait Baroness Kennedy of The Shaws (Lab)
- Hansard - - - Excerpts

Before the Minister sits down, I express my gratitude that he has indicated that my amendment would have some serious impact. I thank the noble Lord, Lord Clement-Jones, for saying that there should be some learning among men in the House and in wider society about what puts real fear in the hearts of women and how it affects how women conduct their lives. I thank those who said that some change is necessary.

We have to remember that this clause covers a threatening communications offence. I know that something is going to be said about the particular vulnerability of women and girls—the noble Baroness, Lady Morgan, mentioned it, and I am grateful for that—but this offence is not specific to one gender. It is a general offence that someone commits if a message they send conveys a threat of death or serious harm.

I reassure the noble Baroness, Lady Fox, that we are not talking about a slight—saying to a woman that she is ugly or something. This is not about insults but about serious threats. The business about it being reckless as to whether or not it is going to be carried out is vital. Clause 164(1)(c)(i) says an offence is committed if it is intended that an individual encountering the message would fear that the threat would be carried out. I would like to see added the words, “whether or not by the person sending the message”.

Just think of this in the Irish context of years gone by. If someone sent a message saying, “You should be kneecapped”, it is very clear that we would be talking about something that would put someone in terror and fear. It is a serious fear, so I am glad that this is supported by the Minister, and I hope we will progress it to the next stage.

Lord Harlech Portrait Lord Harlech (Con)
- View Speech - Hansard - - - Excerpts

My Lords, without wishing to disrupt the very good nature of this debate, I remind the House that the Companion advises against speaking more than once on Report, except for specific questions or points of elucidation.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

None the less, I am grateful to the noble Baroness for her clarification and expansion of this point. I am glad that she is satisfied with the approach we have set out.

Baroness Kennedy of Shaws Portrait Baroness Kennedy of The Shaws (Lab)
- Hansard - - - Excerpts

It is not specific to women; it is general.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Addressing the issue the noble Baroness has highlighted will protect all victims against people trying to evade the law, and I am grateful to her. We will bring forward an amendment at Third Reading.

Lord Garnier Portrait Lord Garnier (Con)
- Hansard - - - Excerpts

My Lords, I will be incredibly brief because everything that needs to be said has been said at least twice. I am grateful to those who have taken the trouble to listen to what I had to say, and I am grateful to the Minister for his response. I beg leave to withdraw my amendment.

--- Later in debate ---
Moved by
7: Clause 170, page 149, line 25, after “made” insert “or altered”
Member’s explanatory statement
This amendment provides that “photograph” and “film” in the new offence of sending a photograph or film of genitals (and, by extension the new offences of sharing an intimate photograph or film) includes an image which has been altered and which appears to be a photograph or film.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, I am grateful for the opportunity to continue some of the themes we touched on in the last group and the debate we have had throughout the passage of the Bill on the importance of tackling intimate image abuse. I shall introduce the government amendments in this group that will make a real difference to victims of this abhorrent behaviour.

Before starting, I take the opportunity again to thank the Law Commission for the work it has done in its review of the criminal law relating to the non-consensual taking, making and sharing of intimate images. I also thank my right honourable friend Dame Maria Miller, who has long campaigned for and championed the victims of online abuse. Her sterling efforts have contributed greatly to the Government’s approach and to the formulation of policy in this sensitive area, as well as to the reform of criminal law.

As we announced last November, we intend to bring forward a more expansive package of measures based on the Law Commission’s recommendations as soon as parliamentary time allows, but the Government agree with the need to take swift action. That is why we are bringing forward these amendments now, to deliver on the recommendations which fall within the scope of the Bill, thereby ensuring justice for victims sooner.

These amendments repeal the offence of disclosing private sexual photographs and films with intent to cause distress and replace it with four new sexual offences in the Sexual Offences Act 2003. The first is a base offence of sharing an intimate photograph or film without consent or reasonable belief in consent. This recognises that the sharing of such images, whatever the intent of the perpetrator, should be considered a criminal violation of the victim’s bodily autonomy.

The amendments create two more serious offences of sharing an intimate photograph or film without consent with intent to cause alarm, distress or humiliation, or for the purpose of obtaining sexual gratification. Offenders committing the latter offence may also be subject to notification requirements, commonly referred to as being on the sex offenders register. The amendments create an offence of threatening to share an intimate image. These new sharing offences are based on the Law Commission’s recommended approach, which broadens the definition of intimate photographs or films to include images which show or appear to show a person nude or partially nude, or which depict sexual or toileting activity. This will protect more victims than the current Section 33 offence, which covers only images of a private and sexual nature.

Finally, these clauses will, for the first time, make it a criminal offence to share a manufactured or so-called deepfake image of another person without his or her consent. This form of intimate image abuse is becoming more prevalent, and we want to send a clear message that it will not be tolerated.

By virtue of placing these offences in the Sexual Offences Act 2003, we are extending to these offences also the current special measures, so that victims can benefit from them in court, and from anonymity provisions, which are so important when something so intimate has been shared without consent. This is only the first stage in our reform of the law in this area. We are committed to introducing additional changes, giving effect to further recommendations of the Law Commission’s report which are beyond the scope of the Bill, when parliamentary time allows.

I hope that noble Lords from across your Lordships’ House will agree that these amendments represent an important step forward in tackling intimate image abuse and protecting victims. I commend them to the House, and I beg to move.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

My Lords, I welcome these new offences. From my professional experience, I know that what came to be known as “sextortion” created some of the most distressing cases you could experience, where an individual would obtain intimate images, often by deception, and then use them to make threats. This is where a social network is particularly challenging; it enables people to access a network of all the family and friends of an individual whose photo they now hold and to threaten to distribute it to their nearest and dearest. This affects men and women; many of the victims were men who were honey-potted into sharing intimate images and in the worst cases it led to suicide. It was not uncommon that people would feel that there was no way out; the threat was so severe that they would take their own lives. It is extremely welcome that we are doing something about it, and making it more obvious to anyone who is thinking about committing this kind of offence that they run the risk of criminal prosecution.

I have a few specific questions. The first is on the definitions in proposed new Section 66D, inserted by government Amendment 8, where the Government are trying to define what “intimate” or “nudity” represents. This takes me back again to my professional experience of going through slide decks and trying to decide what was on the right or wrong side of a nudity policy line. I will not go into the detail of everything it said, not least because I keep noticing younger people in the audience here, but I will leave you with the thought that you ended up looking at images that involved typically fishnets, in the case of women, and socks, in the case of men—I will leave the rest to your Lordships’ imaginations to determine at what point someone has gone from being clothed to nude. I can see in this amendment that the courts are going to have to deal with the same issues.

The serious point is that, where there is alignment between platform policies, definitions and what we do not want to be distributed, that is extremely helpful, because it then means that if someone does try to put an intimate image out across one of the major platforms, the platform does not have to ask whether there was consent. They can just say that it is in breach of their policy and take it down. It actually has quite a beneficial effect on slowing transmission.

The other point that comes out of that is that some of these questions of intimacy are quite culturally subjective. In some cultures, even a swimsuit photo could be used to cause humiliation and distress. I know this is extremely difficult; we do not want to be overly censorious but, at the same time, we do not want to leave people exposed to threats, and if you come from a culture where a swimsuit photo would be a threat, the definitions may not work for you. So I hope that, as we go through this, there will be a continued dialogue between experts in the platforms who have to deal with these questions and people working on the criminal offence side. To the extent that we can achieve it, there should be alignment and the message should go out that if you are thinking of distributing an image like this, you run the risk of being censored by the platforms but also of running into a criminal prosecution. That is on the mechanics of making it work.

--- Later in debate ---
Baroness Merron Portrait Baroness Merron (Lab)
- Hansard - - - Excerpts

My Lords, I am grateful to the Minister for introducing this suite of government amendments. From these Benches we welcome them. From the nature of the debate, this seems to be very much a work in progress. I wish the Minister well as he and the Justice Minister continue to pick their way through a route to get us to where we need to be. I too thank the Law Commission, Dame Maria Miller MP and so many other campaigners who, as noble Lords have said, have got us to this important point.

However, as I am sure is recognised, with the best of intentions, the government amendments still leave some areas that are as yet unresolved, particularly on sharing images with others: matters such as revenge porn and sending unwanted pictures on dating apps. There are areas still to be explored. The Minister and the Justice Minister said in a letter that, when parliamentary time allows, there will be a broader package of offences being brought forward. I realise that the Minister cannot be precise, but I would appreciate some sense of urgency or otherwise in terms of parliamentary time and when that might be.

We are only just starting to understand the impact of, for example, artificial intelligence, which we are about to come on to. That will be relevant in this regard too. We all understand that this is a bit of a moveable feast. The test will be whether this works. Can the Minister say a bit more about how this suite of measures will be kept under review and, in so doing, will the Government be looking at keeping an eye on the number of charges that are brought? How will this be reported to the House?

In line with this, will there be some consideration of the points that were raised in the previous group? I refer particularly to the issues raised in the amendments tabled by the noble Baroness, Lady Burt, especially where there may not be the intent, or the means, to obtain sexual gratification. They might be about “having a bit of a laugh”, as the noble Baroness said—which might be funny to some but really not funny to others.

In welcoming this, I hope that the Minister will indicate that this is just one step along the way and when we will see further steps.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I am happy to respond clearly to that. As my right honourable friend Edward Argar MP and I said in our letter, this is just the first step towards implementing the changes which the Law Commission has recommended and which we agree are needed. We will implement a broader package of offences, covering, for instance, the taking of intimate images without consent, which were also part of the Law Commission’s report. The parameters of this Bill limit what we can do now. As I said in my opening remarks, we want to bring those forward now so that we can provide protections for victims in all the ways that the Bill gives us scope to do. We will bring forward further provisions when parliamentary time allows. The noble Baroness will understand that I cannot pre-empt when that is, although if we make good progress on the Bill, parliamentary time may allow for it sooner.

The noble Baroness also asked about our review. We will certainly take into account the number of prosecutions and charges that are brought. That is always part of our consideration of criminal law, but I am happy to reassure her that this will be the case here. These are new offences, and we want to make sure that they are leading to prosecutions to deter people from doing it.

The noble Lord, Lord Allan of Hallam, asked whether images will include those shared on virtual reality platforms and in other novel ways. As he knows, the Bill is written in a technologically neutral way to try to be future-proof and capture those technologies which have not yet been invented. I mentioned deepfakes in my opening remarks, which we can envisage. An image will be included on whatever platform it is shared, if it appears to be a photograph or film—that is to say, if it is photo-real. I hope that reassures him.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

If the Minister has time, can he actually direct us to that, because it is important that we are clear that it really is captured?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

In the amendments, if I can, I will. In the meantime, I reassure my noble friend Lady Morgan of Cotes that, as I said in opening, placing these offences in the Sexual Offences Act means that we are also extending the current special measures provisions to these offences, as we heard in our debate on the last group, so that victims can benefit from those in court. The same applies to anonymity provisions, which are so important when something so intimate has been shared without someone’s consent.

I promised in the previous group to outline the difference in the consent basis between this offence and the cyberflashing offence. Both are abhorrent behaviours which need to be addressed in criminal law. Although the levels of harm and distress may be the same in each case, the Law Commission recommended different approaches to take into account the different actions of the perpetrator in each offence. Sharing an intimate image of somebody without their consent is, in and of itself, wrongful, and a violation of their bodily privacy and sexual autonomy. Sending a genital image without the consent of the recipient is not, in and of itself, wrongful; for instance, the example I gave in the previous debate about an artistic performance, or a photograph which depicts a naked protester. If that was sent without the consent of the recipient, it is not always or necessarily harmful. This is an issue which the Law Commission looked at in some detail.

The criminal law must take the culpability of the perpetrator into account. I reassure noble Lords that both we and the Law Commission have looked at these offences considerably, working with the police and prosecutors in doing so. We are confident that the Bill provides the comprehensive protection for victims that we all want to see, including in situations where a perpetrator may claim that it was just a joke.

The terms “photograph” and “film” are defined in proposed new Section 66D(5). That refers to the definition in new Section 66A, which refers to an image which is made or altered in any way

“which appears to be a photograph or film”.

That is where the point I make about photo-reality is captured.

The noble Baroness, Lady Kidron, is right to highlight that this is a matter not just for the criminal law. As we discussed on the previous group, it is also a matter for public education, so that young people and users of any age are aware of the legal boundaries and legal issues at stake here. That is why we have the public education campaigns to which I alluded in the previous group.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I believe I misspoke when I asked my question. I referred to under-18s. Of course, if they are under 18 then it is child sexual abuse. I meant someone under the age of 18 with an adult image. I put that there for the record.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

If the noble Baroness misspoke, I understood what she intended. I knew what she was getting at.

With that, I hope noble Lords will be content not to press their amendments and that they will support the government amendments.

Amendment 7 agreed.
--- Later in debate ---
Moved by
8: After Clause 170, insert the following new Clause—
“Sharing or threatening to share intimate photograph or film
In the Sexual Offences Act 2003, after section 66A (inserted by section 170), insert—
“66B Sharing or threatening to share intimate photograph or film
(1) A person (A) commits an offence if—(a) A intentionally shares a photograph or film which shows, or appears to show, another person (B) in an intimate state,(b) B does not consent to the sharing of the photograph or film, and(c) A does not reasonably believe that B consents.
(2) A person (A) commits an offence if—(a) A intentionally shares a photograph or film which shows, or appears to show, another person (B) in an intimate state,(b) A does so with the intention of causing B alarm, distress or humiliation, and(c) B does not consent to the sharing of the photograph or film.
(3) A person (A) commits an offence if—(a) A intentionally shares a photograph or film which shows, or appears to show, another person (B) in an intimate state,(b) A does so for the purpose of A or another person obtaining sexual gratification,(c) B does not consent to the sharing of the photograph or film, and(d) A does not reasonably believe that B consents.
(4) A person (A) commits an offence if—(a) A threatens to share a photograph or film which shows, or appears to show, another person (B) in an intimate state, and(b) A does so—(i) with the intention that B or another person who knows B will fear that the threat will be carried out, or(ii) being reckless as to whether B or another person who knows B will fear that the threat will be carried out.
(5) Subsections (1) to (4) are subject to section 66C (exemptions).
(6) For the purposes of subsections (1) to (3) and section 66C(3)(b)—(a) “consent” to the sharing of a photograph or film includes general consent covering the particular act of sharing as well as specific consent to the particular act of sharing, and(b) whether a belief is reasonable is to be determined having regard to all the circumstances including any steps A has taken to ascertain whether B consents.
(7) Where a person is charged with an offence under subsection (4), it is not necessary for the prosecution to prove—(a) that the photograph or film mentioned in the threat exists, or(b) if it does exist, that it is in fact a photograph or film which shows or appears to show a person in an intimate state.
(8) It is a defence for a person charged with an offence under subsection (1) to prove that the person had a reasonable excuse for sharing the photograph or film.
(9) A person who commits an offence under subsection (1) is liable on summary conviction to imprisonment for a term not exceeding the maximum term for summary offences or a fine (or both).
(10) A person who commits an offence under subsection (2), (3) or (4) is liable—(a) on summary conviction, to imprisonment for a term not exceeding the general limit in a magistrates’ court or a fine (or both);(b) on conviction on indictment, to imprisonment for a term not exceeding 2 years.
(11) In subsection (9) “the maximum term for summary offences” means—(a) if the offence is committed before the time when section 281(5) of the Criminal Justice Act 2003 comes into force, six months;(b) if the offence is committed after that time, 51 weeks.
(12) If on the trial of a person charged with an offence under subsection (2) or (3) a magistrates’ court or jury finds the person not guilty of the offence charged, the magistrates’ court or jury may find the person guilty of an offence under subsection (1).
(13) The Crown Court has the same powers and duties in relation to a person who is by virtue of subsection (12) convicted before it of an offence under subsection (1) as a magistrates’ court would have on convicting the person of the offence.
66C Sharing or threatening to share intimate photograph or film: exemptions
(1) A person (A) who shares a photograph or film which shows, or appears to show, another person (B) in an intimate state does not commit an offence under section 66B(1), (2) or (3) if—(a) the photograph or film was taken in a place to which the public or a section of the public had or were permitted to have access (whether on payment or otherwise),(b) B had no reasonable expectation of privacy from the photograph or film being taken, and(c) B was, or A reasonably believes that B was, in the intimate state voluntarily.
(2) For the purposes of subsection (1)(b), whether a person had a reasonable expectation of privacy from a photograph or film being taken is to be determined by reference to the circumstances that the person sharing the photograph or film reasonably believes to have existed at the time the photograph or film was taken.
(3) A person (A) who shares a photograph or film which shows, or appears to show, another person (B) in an intimate state does not commit an offence under section 66B(1), (2) or (3) if—(a) the photograph or film had, or A reasonably believes that the photograph or film had, been previously publicly shared, and(b) B had, or A reasonably believes that B had, consented to the previous sharing.
(4) A person (A) who shares a photograph or film which shows, or appears to show, another person (B) in an intimate state does not commit an offence under section 66B(1) if—(a) B is a person under 16,(b) B lacks, or A reasonably believes that B lacks, capacity to consent to the sharing of the photograph or film, and(c) the photograph or film is shared—(i) with a healthcare professional acting in that capacity, or(ii) otherwise in connection with the care or treatment of B by a healthcare professional.
(5) A person who shares a photograph or film which shows, or appears to show, a child in an intimate state does not commit an offence under section 66B(1) if the photograph or film is of a kind ordinarily shared between family and friends.
(6) A person who threatens to share a photograph or film which shows, or appears to show, another person in an intimate state does not commit an offence under section 66B(4) if, by reason of this section, the person would not commit an offence under section 66B(1), (2) or (3) by sharing the photograph or film in the circumstances conveyed by the threat.
66D Sharing or threatening to share intimate photograph or film: interpretation
(1) This section applies for the purposes of sections 66B and 66C.
(2) A person “shares” something if the person, by any means, gives or shows it to another person or makes it available to another person.
(3) But a provider of an internet service by means of which a photograph or film is shared is not to be regarded as a person who shares it.
(4) “Photograph” and “film” have the same meaning as in section 66A (see subsections (3) to (5) of that section).
(5) Except where a photograph or film falls within subsection (8), a photograph or film “shows, or appears to show, another person in an intimate state” if it shows or appears to show—(a) the person participating or engaging in an act which a reasonable person would consider to be a sexual act,(b) the person doing a thing which a reasonable person would consider to be sexual,(c) all or part of the person’s exposed genitals, buttocks or breasts,(d) the person in an act of urination or defecation, or(e) the person carrying out an act of personal care associated with the person’s urination, defecation or genital or anal discharge.
(6) For the purposes of subsection (5)(c) the reference to all or part of a person’s “exposed” genitals, buttocks or breasts includes—(a) a reference to all or part of the person’s genitals, buttocks or breasts visible through wet or otherwise transparent clothing,(b) the case where all or part of the person’s genitals, buttocks or breasts would be exposed but for the fact that they are covered only with underwear, and(c) the case where all or part of the person’s genitals, buttocks or breasts would be exposed but for the fact that they are obscured, provided that the area obscured is similar to or smaller than an area that would typically be covered by underwear worn to cover a person’s genitals, buttocks or breasts (as the case may be).
(7) In subsection (6)(c) “obscured” means obscured by any means, other than by clothing that a person is wearing, including, in particular, by an object, by part of a person’s body or by digital alteration.
(8) A photograph or film falls within this subsection if (so far as it shows or appears to show a person in an intimate state) it shows or appears to show something, other than breastfeeding, that is of a kind ordinarily seen in public.
(9) For the purposes of subsection (8) “breastfeeding” includes the rearranging of clothing in the course of preparing to breastfeed or having just finished breastfeeding.””
Member’s explanatory statement
This amendment provides for new offences of sharing or threatening to share intimate photographs or films.
--- Later in debate ---
Moved by
9: After Clause 171, insert the following new Clause—
“Repeals in connection with offences under section (Sharing or threatening to share intimate photograph or film)
Sections 33 to 35 of the Criminal Justice and Courts Act 2015 (disclosing or threatening to disclose private sexual photographs and films with intent to cause distress) are repealed.”
Member’s explanatory statement
This amendment is consequential on the new Clause creating offences of sharing or threatening to share intimate photographs or films.
--- Later in debate ---
Moved by
10: Clause 172, page 150, line 15, leave out “section 170” and insert “sections 170 and (Sharing or threatening to share intimate photograph or film)”
Member’s explanatory statement
This amendment provides that Part 3 of Schedule 14 also makes consequential amendments on the new Clause creating offences of sharing and threatening to share intimate photographs or films.
--- Later in debate ---
Moved by
12: Schedule 14, page 240, line 24, after first “the” insert “first”
Member’s explanatory statement
This is a technical amendment ensuring that the amendments made under Schedule 14 to Schedule 1 to the Children and Young Persons Act 1933 are inserted in the correct place in that Act.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Moved by
27: Schedule 1, page 185, line 11, leave out from “provider” to end of line 13 and insert “, including where the publication of the content is effected or controlled by means of—
(a) software or an automated tool or algorithm applied by the provider or by a person acting on behalf of the provider, or
(b) an automated tool or algorithm made available on the service by the provider or by a person acting on behalf of the provider.”
Member’s explanatory statement
This amendment is about what counts as “provider content” for the purposes of the exemption in paragraph 4 of Schedule 1 of the Bill (which provides that limited functionality services are exempt). Words are added to expressly cover the case where an automated tool or algorithm is made available on the service by a provider, such as a generative AI bot.
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - -

My Lords, the Government are committed to protecting children against accessing pornography online. As technology evolves, it is important that the regulatory framework introduced by the Bill keeps pace with emerging risks to children and exposure to pornography in new forms, such as generative artificial intelligence.

Part 5 of the Bill has been designed to be future-proof, and we assess that it would already capture AI-generated pornography. Our Amendments 206 and 209 will put beyond doubt that content is “provider pornographic content” where it is published or displayed on a Part 5 service by means of an automated tool or algorithm, such as a generative AI bot, made available on the service by a provider. Amendments 285 and 293 make clear that the definition of an automated tool includes a bot. Amendment 276 clarifies the definition of a provider of a Part 5 service, to make clear that a person who controls an AI bot that generates pornography can be regarded as the provider of a service.

Overall, our amendments provide important certainty for users, providers and Ofcom on the services and content in scope of the Part 5 duties. This will ensure that the new, robust duties for Part 5 providers to use age verification or age estimation to prevent children accessing provider pornographic content will also extend to AI-generated pornography. I beg to move.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - - - Excerpts

My Lords, the noble Baroness, Lady Kidron, has unfortunately been briefly detained. If you are surprised to see me standing up, it is because I am picking up for her. I start by welcoming these amendments. I am grateful for the reaction to the thought-provoking debate that we had in Committee. I would like to ask a couple of questions just to probe the impact around the edges.

Amendment 27 looks as if it implies that purely content-generating machine-learning or AI bots could be excluded from the scope of the Bill, rather than included, which is the opposite of what we were hoping to achieve. That may be us failing to understand the detail of this large body of different amendments, but I would welcome my noble friend the Minister’s response to make sure that in Amendment 27 we are not excluding harm that could be generated by some form of AI or machine-learning instrument.

Maybe I can give my noble friend the Minister an example of what we are worried about. This is a recent scenario that noble Lords may have seen in the news, of a 15 year-old who asked, “How do I have sex with a 30 year-old?”. The answer was given in forensic detail, with no reference to the fact that it would in fact be statutory rape. Would the regulated service, or the owner of the regulated service that generated that answer, be included or excluded as a result of Amendment 27? That may be my misunderstanding.

This group is on AI-generated pornography. My friend, the noble Baroness, Lady Kidron, and I are both very concerned that it is not just about pornography, and that we should make sure that AI is included in the Bill. Specifically, many of us with teenage children will now be learning how to navigate the Snap AI bot. Would harm generated by that bot be captured in these amendments, or is it only content that is entirely pornographic? I hope that my noble friend the Minister can clarify both those points, then we will be able to support all these amendments.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, this has been a short but important debate and I am grateful to noble Lords for their broad support for the amendments here and for their questions. These amendments will ensure that services on which providers control a generative tool, such as a generative AI bot, are in scope of Part 5 of the Bill. This will ensure that children are protected from any AI-generated pornographic content published or displayed by provider-controlled generative bots. These changes will not affect the status of any non-pornographic AI-generated content, or AI-generated content shared by users.

We are making a minor change to definitions in Part 3 to ensure that comments or reviews on content generated by a provider-controlled artificial intelligence source are not regulated as user-generated content. This is consistent with how the Bill treats comments and reviews on other provider content. These amendments do not have any broader impact on the treatment of bots by Part 3 of the Bill’s regime beyond the issue of comments and reviews. The basis on which a bot will be treated as a user, for example, remains unchanged.

I am grateful to the noble Lord, Lord Clement-Jones, for degrouping his Amendment 152A so that I can come back more fully on it in a later group and I am grateful for the way he spoke about it in advance. I am grateful too for my noble friend Lady Harding’s question. These amendments will ensure that providers which control a generative tool on a service, such as a generative AI bot, are in scope of Part 5 of the Bill. A text-only generative AI bot would not be in scope of Part 5. It is important that we focus on areas which pose the greatest risk of harm to children. There is an exemption in Part 5 for text-based provider pornographic content because of the limited risks posed by published pornographic content. This is consistent with the approach of Part 3 of the Digital Economy Act 2017 and its provisions to protect children from commercial online pornography, which did not include text-based content in scope.

The right reverend Prelate the Bishop of Oxford is right to ask whether we think this is enough. These changes certainly help. The technology-neutral way in which the Bill is written will help us to future-proof it but, as we have heard throughout the passage of the Bill, we all know that this area of work will need constant examination and scrutiny. That is why the Bill is subject to the post-Royal Assent review and scrutiny that it is, and why we are grateful for the attention noble Lords and Members of Parliament in the other place have already given to ensuring that it delivers on what we want to see. I believe these amendments, which put beyond doubt important provisions relating to generative AI, are a helpful addition and I beg to move.

Amendment 27 agreed.
--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, much like the noble Lord, Lord Clement-Jones, I started off being quite certain I knew what to say about these amendments. I even had some notes—unusual for me, I know—but I had to throw them away, which I always do with my notes, because the arguments have been persuasive. That is exactly why we are here in Parliament discussing things: to try to reach common solutions to difficult problems.

We started with a challenge to the Minister to answer questions about scope, exemptions and discretion in relation to a named service—Wikipedia. However, as the debate went on, we came across the uncomfortable feeling that, having got so far into the Bill and agreed a lot of amendments today improving it, we are still coming up against quite stubborn issues that do not fit neatly into the categorisation and structures that we have. We do not seem to have the right tools to answer the difficult questions before us today, let alone the myriad questions that will come up as the technology advances and new services come in. Why have we not already got solutions to the problems raised by Amendments 281, 281A and 281B?

There is also the rather difficult idea we have from the noble Lord, Lord Russell, of dark patterns, which we need to filter into our thinking. Why does that not fit into what we have got? Why is it that we are still worried about Wikipedia, a service for public good, which clearly has risks in it and is sometimes capable of making terrible mistakes but is definitely a good thing that should not be threatened by having to conform with a structure and a system which we think is capable of dealing with some of the biggest and most egregious companies that are pushing stuff at us in the way that we have talked about?

I have a series of questions which I do not have the answers to. I am looking forward to the Minister riding to my aid on a white charger of enormous proportions and great skill which will take us out without having to fall over any fences.

If I may, I suggest to the Minister a couple of things. First, we are stuck on the word “content”. We will come back to that in the future, as we still have an outstanding problem about exactly where the Bill sets it. Time and again in discussions with the Bill team and with Ministers we have been led back to the question of where the content problem lies and where the harms relate to that, but this little debate has shown beyond doubt that harm can occur independent of and separate from content. We must have a solution to that, and I hope it will be quick.

Secondly, when approaching anybody or anything or any business or any charity that is being considered in scope for this Bill, we will not get there if we are looking only at the question of its size and its reach. We have to look at the risks it causes, and we have to drill down hard into what risks we are trying to deal with using our armoury as we approach these companies, because that is what matters to the children, vulnerable people and adults who would suffer otherwise, and not the question of whether or not these companies are big or small. I think there are solutions to that and we will get there, but, when he comes to respond, the Minister needs to demonstrate to us that he is still willing to listen and think again about one or two issues. I look forward to further discussions with him.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I am grateful to noble Lords for their contributions during this debate. I am sympathetic to arguments that we must avoid imposing disproportionate burdens on regulated services, and particularly that the Bill should not inhibit services from providing valuable information which is of benefit to the public. However, I want to be clear that that is why the Bill has been designed in the way that it has. It has a broad scope in order to capture a range of services, but it has exemptions and categorisations built into it. The alternative would be a narrow scope, which would be more likely inadvertently to exempt risky sites or to displace harm on to services which we would find are out of scope of the Bill. I will disappoint noble Lords by saying that I cannot accept their amendments in this group but will seek to address the concerns that they have raised through them.

The noble Lord, Lord Allan, asked me helpfully at the outset three questions, to which the answers are yes, no and maybe. Yes, Wikipedia and OpenStreetMap will be in scope of the Bill because they allow users to interact online; no, we do not believe that they would fall under any of the current exemptions in the Bill; and the maybe is that Ofcom does not have the discretion to exempt services but the Secretary of State can create additional exemptions for further categories of services if she sees fit.

I must also say maybe to my noble friend Lord Moylan on his point about Wikipedia—and with good reason. Wikipedia, as I have just explained, is in scope of the Bill and is not subject to any of its exemptions. I cannot say how it will be categorised, because that is based on an assessment made by the independent regulator, but I reassure my noble friend that it is not the regulator but the Secretary of State who will set the categorisation thresholds through secondary legislation; that is to say, a member of the democratically elected Government, accountable to Parliament, through legislation laid before that Parliament. It will then be for Ofcom to designate services based on whether or not they meet those thresholds.

It would be wrong—indeed, nigh on impossible—for me to second-guess that designation process from the Dispatch Box. In many cases it is inherently a complex and nuanced matter since, as my noble friend Lady Harding said, many services change over time. We want to keep the Bill’s provisions flexible as services change what they do and new services are invented.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I would just like to finish my thought on Wikipedia. Noble Lords are right to mention it and to highlight the great work that it does. My honourable friend the Minister for Technology and the Digital Economy, Paul Scully, met Wikipedia yesterday to discuss its concerns about the Bill. He explained that the requirements for platforms in this legislation will be proportionate to the risk of harm, and that as such we do not expect the requirements for Wikipedia to be unduly burdensome.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I am computing the various pieces of information that have just been given, and I hope the Minister can clarify whether I have understood them correctly. These services will be in scope as user-to-user services and do not have an exemption, as he said. The Secretary of State will write a piece of secondary legislation that will say, “This will make you a category 1 service”—or a category 2 or 2B service—but, within that, there could be text that has the effect that Wikipedia is in none of those categories. So it and services like it could be entirely exempt from the framework by virtue of that secondary legislation. Is that a correct interpretation of what he said?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

The Secretary of State could create further exemptions but would have to bring those before Parliament for it to scrutinise. That is why there is a “maybe” in answer to his third question in relation to any service. It is important for the legislation to be future-proofed that the Secretary of State has the power to bring further categorisations before Parliament for it to discuss and scrutinise.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

My Lords, I will keep pressing this point because it is quite important, particularly in the context of the point made by the noble Baroness, Lady Kidron, about categorisation, which we will debate later. There is a big difference when it comes to Schedule 11, which defines the categorisation scheme: whether in the normal run of business we might create an exemption in the categorisation secondary legislation, or whether it would be the Secretary of State coming back with one of those exceptional powers that the Minister knows we do not like. He could almost be making a case for why the Secretary of State has to have these exceptional powers. We would be much less comfortable with that than if the Schedule 11 categorisation piece effectively allowed another class to be created, rather than it being an exceptional Secretary of State power.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I do not think that it is, but it will be helpful to have a debate on categorisation later on Report, when we reach Amendment 245, to probe this further. It is not possible for me to say that a particular service will certainly be categorised one way or another, because that would give it carte blanche and we do not know how it may change in the future—estimable though I may think it is at present. That is the difficulty of setting the precise parameters that the noble Baroness, Lady Fox, sought in her contribution. We are setting broad parameters, with exemptions and categorisations, so that the burdens are not unduly heavy on services which do not cause us concern, and with the proviso for the Secretary of State to bring further exemptions before Parliament, as circumstances strike her as fit, for Parliament to continue the debate we are having now.

The noble Baroness, Lady Kidron, in her earlier speech, asked about the functionalities of user-to-user services. The definitions of user-to-user services are broad and flexible, to capture new and changing services. If a service has both user-to-user functionality and a search engine, it will be considered a combined service, with respective duties for the user-to-user services which form part of its service and search duties in relation to the search engine.

I reassure my noble friend Lady Harding of Winscombe that the Bill will not impose a disproportionate burden on services, nor will it impede the public’s access to valuable content. All duties on services are proportionate to the risk of harm and, crucially, to the capacity of companies. The Bill’s proportionate design means that low-risk services will have to put in place only measures which reflect the risk of harm to their users. Ofcom’s guidance and codes of practice will clearly set out how these services can comply with their duties. We expect that it will set out a range of measures and steps for different types of services.

Moreover, the Bill already provides for wholesale exemptions for low-risk services and for Ofcom to exempt in-scope services from requirements such as record-keeping. That will ensure that there are no undue burdens to such services. I am grateful for my noble friend’s recognition, echoed by my noble friend Lady Stowell of Beeston, that “non-profit” does not mean “not harmful” and that there can be non-commercial services which may pose harms to users. That is why it is important that there is discretion for proper assessment.

Amendment 30 seeks to allow Ofcom to withdraw the exemptions listed in Schedule 1 from the Bill. I am very grateful to my noble friend Lord Moylan for his time earlier this week to discuss his amendment and others. We have looked at it, as I promised we would, but I am afraid that we do not think that it would be appropriate for Ofcom to have this considerable power—my noble friend is already concerned that the regulator has too much.

The Bill recognises that it may be necessary to remove certain exemptions if there is an increased risk of harm from particular types of services. That is why the Bill gives the Secretary of State the power to remove particular exemptions, such as those related to services which have limited user-to-user functionality and those which offer one-to-one live aural communications. These types of services have been carefully selected as areas where future changes in user behaviour could necessitate the repeal or amendment of an exemption in Schedule 1. This power is intentionally limited to only these types of services, meaning that the Secretary of State will not be able to remove exemptions for comments on recognised news publishers’ sites. That is in recognition of the Government’s commitment to media freedom and public debate. It would not be right for Ofcom to have the power to repeal those exemptions.

Amendments 281 and 281B, in the name of the noble Lord, Lord Russell of Liverpool, are designed to ensure that the lists of features under the definition of “functionality” in the Bill apply to all regulated services. Amendment 281A aims to add additional examples of potentially addictive functionalities to the Bill’s existing list of features which constitute a “functionality”. I reassure him and other noble Lords that the list of functionalities in the Bill is non-exhaustive. There may be other functionalities which could cause harm to users and which services will need to consider as part of their risk assessment duties. For example, if a provider’s risk assessment identifies that there are functionalities which risk causing significant harm to an appreciable number of children on its service, the Bill will require the provider to put in place measures to mitigate and manage that risk.

He and other noble Lords spoke about the need for safety by design. I can reassure them this is already built into the framework of the Bill, which recognises how functionalities including many of the things mentioned today can increase the risk of harm to users and will encourage the safe design of platforms.

Amendments 281 and 281B have the effect that regulated services would need to consider the risk of harm of functionalities that are not relevant for their kind of service. For example, sharing content with other users is a functionality of user-to-user services, which is not as relevant for search services. The Bill already outlines specific features that both user-to-user and search services should consider, which are the most relevant functionalities for those types of service. Considering these functionalities would create an unnecessary burden for regulated services which would detract from where their efforts can best be focused. That is why I am afraid I cannot accept the amendments that have been tabled.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, surely it is the role of the regulators to look at functionalities of this kind. The Minister seemed to be saying that it would be an undue burden on the regulator. Is not that exactly what we are meant to be legislating about at this point?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Perhaps I was not as clear as I could or should have been. The regulator will set out in guidance the duties that fall on the businesses. We do not want the burden on the business to be unduly heavy, but there is an important role for Ofcom here. I will perhaps check—

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

But these functionalities are a part of their business model, are they not?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Hence Ofcom will make the assessments about categorisation based on that. Maybe I am missing the noble Lord’s point.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I think we may need further discussions on the amendment from the noble Lord, Lord Russell.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I will check what I said but I hope that I have set out why we have taken the approach that we have with the broad scope and the exemptions and categorisations that are contained in it. With that, I urge the noble Lord to withdraw his amendment.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

My Lords, that was a very useful debate. I appreciate the Minister’s response and his “yes, no, maybe” succinctness, but I think he has left us all more worried than when the debate started. My noble friend Lord Clement-Jones tied it together nicely. What we want is for the regulator to be focused on the greatest areas of citizen risk. If there are risks that are missing, or things that we will be asking the regulator to do that are a complete waste of time because they are low risk, then we have a problem. We highlighted both those areas. The noble Lord, Lord Russell, rightly highlighted that we are not content with just “content” as the primary focus of the legislation; it is about a lot more than content. In my amendment and those by the noble Lord, Lord Moylan, we are extremely worried—and remain so—that the Bill creates a framework that will trap Wikipedia and services like it, without that being our primary intention. We certainly will come back to this in later groups; I will not seek to press the amendment now, because there is a lot we all need to digest. However, at the end of this process, we want to get to point where the regulator is focused on things that are high risk to the citizen and not wasting time on services that are very low risk. With that, I beg leave to withdraw my amendment.

--- Later in debate ---
Moved by
31: Clause 5, page 4, line 40, leave out “section 54” and insert “sections 54 to (“Priority content that is harmful to children”)”
Member’s explanatory statement
This amendment is consequential on the new Clauses proposed to be inserted after Clause 54 in my name setting out which kinds of content count as primary priority content and priority content harmful to children.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, the government amendments in this group relate to the categories of primary priority and priority content that is harmful to children.

Children must be protected from the most harmful online content and activity. As I set out in Committee, the Government have listened to concerns about designating primary priority and priority categories of content in secondary legislation and the need to protect children from harm as swiftly as possible. We have therefore tabled amendments to set out these categories in the Bill. I am grateful for the input from across your Lordships’ House in finalising the scope of these categories.

While it is important to be clear about the kinds of content that pose a risk of harm to children, I acknowledge what many noble Lords raised during our debates in Committee, which is that protecting children from online harm is not just about content. That is why the legislation takes a systems and processes approach to tackling the risk of harm. User-to-user and search service providers will have to undertake comprehensive, mandatory risk assessments of their services and consider how factors such as the design and operation of a service and its features and functionalities may increase the risk of harm to children. Providers must then put in place measures to manage and mitigate these risks, as well as systems and processes to prevent and protect children from encountering the categories of harmful content.

We have also listened to concerns about cumulative harm. In response to this, the Government have tabled amendments to Clause 209 to make it explicit that cumulative harm is addressed. This includes cumulative harm that results from algorithms bombarding a user with content, or where combinations of functionality cumulatively drive up the risk of harm. These amendments will be considered in more detail under a later group of amendments, but they are important context for this discussion.

I turn to the government amendments, starting with Amendment 171, which designates four categories of primary priority content. First, pornographic content has been defined in the same way as in Part 5—to give consistent and comprehensive protection for children, regardless of the type of service on which the pornographic content appears. The other three categories capture content which encourages, promotes or provides instructions for suicide, self-harm or eating disorders. This will cover, for example, glamorising or detailing methods for carrying out these dangerous activities. Designating these as primary priority content will ensure that the most stringent child safety duties apply.

Government Amendment 172 designates six categories of priority content. Providers will be required to protect children from encountering a wide range of harmful violent content, which includes depictions of serious acts of violence or graphic injury against a person or animal, and the encouragement and promotion of serious violence, such as content glamorising violent acts. Providers will also be required to protect children from encountering abusive and hateful content, such as legal forms of racism and homophobia, and bullying content, which sadly many children experience online.

The Government have heard concerns from the noble Baronesses, Lady Kidron and Lady Finlay of Llandaff, about extremely dangerous activities being pushed to children as stunts, and content that can be harmful to the health of children, including inaccurate health advice and false narratives. As such, we are designating content that encourages dangerous stunts and challenges as a category of priority content, and content which encourages the ingestion or inhalation of, or exposure to, harmful substances, such as harmful abortion methods designed to be taken by a person without medical supervision.

Amendment 174, from the noble Baroness, Lady Kidron, seeks to add “mis- and disinformation” and “sexualised content” to the list of priority content. On the first of these, I reiterate what I said in Committee, which is that the Bill will protect children from harmful misinformation and disinformation where it intersects with named categories of primary priority or priority harmful content—for example, an online challenge which is promoted to children on the basis of misinformation or disinformation, or abusive content with a foundation in misinformation or disinformation. However, I did not commit to misinformation and disinformation forming its own stand-alone category of priority harmful content, which could be largely duplicative of the categories that we have already included in the Bill and risks capturing a broad range of legitimate content.

We have already addressed key concerns related to misinformation and disinformation content which presents the greatest risk to children by including content which encourages the ingestion or inhalation of, or exposure to, harmful substances to the list of priority categories. However, the term “mis- and disinformation”, as proposed by Amendment 174, in its breadth and subjectivity risks inadvertently capturing a wide range of content resulting in disproportionate, excessive censorship of the content children see online, including in areas of legitimate debate. The harm arising from misinformation or disinformation usually arises from the context or purpose of the content, rather than the mere fact that it is untrue. Our balanced approach ensures that children are protected from the most prevalent and concerning harms associated with misinformation and disinformation.

--- Later in debate ---
Baroness Ritchie of Downpatrick Portrait Baroness Ritchie of Downpatrick (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, we spent a lot of time in Committee raising concerns about how pornography and age verification were going to operate across all parts of the Bill. I have heard what the Minister has said in relation to this group, priority harms to children, which I believe is one of the most important groups under discussion in the Bill. I agree that children must be protected from the most harmful content online and offline.

I am grateful to the Government for having listened carefully to the arguments put forward by the House in this regard and commend the Minister for all the work he and his team have done since then. I also commend the noble Lord, Lord Bethell. He and I have been in some discussion between Committee and now in relation to these amendments.

In Committee, I argued for several changes to the Bill which span three groups of amendments. One of my concerns was that pornography should be named as a harm in the Bill. I welcome the Government’s Amendment 171, which names pornography as primary priority content. I also support Amendment 174 in the name of the noble Baroness, Lady Kidron. She is absolutely right that sexualised content can be harmful to children if not age appropriate and, in that regard, before she even speaks, I ask the Minister to reconsider his views on this amendment and to accept it.

Within this group are the amendments which move the definition of “pornographic content” from Part 5 to Clause 211. In that context, I welcome the Government’s announcement on Monday about a review of the regulation, legislation and enforcement of pornography offences.

In Committee, your Lordships were very clear that there needed to be a consistent approach across the Bill to the regulation of pornography. I am in agreement with the amendments tabled in Committee to ensure that consistency applies across all media. In this regard, I thank the noble Baroness, Lady Benjamin, for her persistence in raising this issue. I also thank my colleagues on the Opposition Front Bench, the noble Lord, Lord Stevenson, and the noble Baroness, Lady Merron.

I appreciate that the Government made this announcement only three days ago, but I hope the Minister will set out a timetable for publishing the terms of reference and details of how this review will take place. The review is too important to disappear into the long grass over the Summer Recess, never to be heard of again, so if he is unable to answer my question today, will he commit to writing to your Lordships with the timeframe before the House rises for the summer? Will he consider the active involvement of external groups in this review, as much expertise lies outside government in this area? In that regard, I commend CARE, CEASE and Barnardo’s for all their input into the debates on the Bill.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, I think the noble Baroness’s comments relate to the next group of amendments, on pornography. She might have skipped ahead, but I am grateful for the additional thinking time to respond to her questions. Perhaps she will save the rest of her remarks for that group.

Baroness Ritchie of Downpatrick Portrait Baroness Ritchie of Downpatrick (Lab)
- Hansard - - - Excerpts

I thank the Minister for that. In conclusion, I hope he will reflect on those issues and come back, maybe at the end of the next group. I remind the House that in February the APPG on Commercial Sexual Exploitation, in its inquiry on pornography, recommended that the regulation of pornography should be consistent across all online platforms and between the online and offline spheres. I hope we can incorporate the voices I have already mentioned in the NGO sphere in order to assist the Government and both Houses in ensuring that we regulate the online platforms and that children are protected from any harms that may arise.

--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, like the noble Baroness, Lady Harding, I want to make it very clear that I think the House as a whole welcomes the change of heart by the Government to ensure that we have in the Bill the two sides of the question of content that will be harmful to children. We should not walk away from that. We made a big thing of this in Committee. The Government listened and we have now got it. The fact that we do not like it—or do not like bits of it—is the price we pay for having achieved something which is, probably on balance, good.

The shock comes from trying to work out why it is written the way it is, and how difficult it is to see what it will mean in practice when companies working to Ofcom’s instructions will take this and make this happen in practice. That lies behind, I think I am right in saying, the need for the addition to Amendment 172 from the noble Baroness, Lady Kidron, which I have signed, along with the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford. Both of them have spoken well in support of it and I do not need to repeat those points.

Somehow, in getting the good of Amendments 171 and 172, we have lost the flexibility that we think we want as well to try to get that through. The flexibility does exist, because the Government have retained powers to amend and change both primary priority content that is harmful to children and the priority content. Therefore, subject to approval through the secondary legislation process, this House will continue to have a concern about that—indeed, both Houses will.

Somehow, however, that does not get to quite where the concern comes from. The concern should be both the good points made by the noble Lord, Lord Russell—I should have caught him up in the gap and said I had already mentioned the fact that we had been together at the meeting. He found some additional points to make which I hope will also be useful to future discussion. I am glad he has done that. He is making a very good point in relation to cultural context and the work that needs to go on—which we have talked about in earlier debates—in order to make this live: in other words, to make people who are responsible for delivering this through Ofcom, but also those who are delivering it through companies, to understand the wider context. In that sense, clearly we need the misinformation/disinformation side of that stuff. It is part and parcel of the problems we have got. But more important even than that is the need to see about the functionality issues. We have come back to that. This Bill is about risk. The process that we will be going through is about risk assessment and making sure that the risks are understood by those who deliver services, and the penalties that follow the failure of the risk assessment process delivering change that we want to see in society.

However, it is not just about content. We keep saying that, but we do not see the changes around it. The best thing that could happen today would be if the Minister in responding accepted that these clauses are good—“Tick, we like them”—but could we just not finalise them until we have seen the other half of that, which is: what are the other risks to which those users of services that we have referred to and discussed are receiving through the systemic design processes that are designed to take them in different directions? It is only when we see the two together that we will have a proper concern.

I may have got this wrong, but the only person who can tell us is the Minister because he is the only one who really understands what is going on in the Bill. Am I not right in saying—I am going to say I am right; he will say no, I am not, but I am, aren’t I?—that we will get to Clauses 208 and 209, or the clauses that used to be 208 and 209, one of which deals with harms from content and the other deals with functionality? We may need to look at the way in which those are framed in order to come back and understand better how these lie and how they interact with that. I may have got the numbers wrong—the Minister is looking a bit puzzled, so I probably have—but the sense is that this will probably not come up until day 4. While I do not want to hold back the Bill, we may need to look at some of the issues that are hidden in the interstices of this set of amendments in order to make sure that the totality is better for those who have to use it.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, this has been a useful debate. As the noble Baroness, Lady Kidron, says, because I spoke first to move the government amendments, in effect I got my response in first to her Amendment 174, the only non-government amendment in the group. That is useful because it allows us to have a deeper debate on it.

The noble Baroness asked about the way that organisations such as the British Board of Film Classification already make assessments of sexualised content. However, the Bill’s requirement on service providers and the process that the BBFC takes to classify content are not really comparable. Services will have far less time and much more content to consider than the BBFC does, so will not be able to take the same approach. The BBFC is able to take an extended time to consider maybe just one scene, one image or one conversation, and therefore can apply nuance to its assessments. That is not possible to do at the scale at which services will have to apply the child safety duties in the Bill. We therefore think there is a real risk that they would excessively apply those duties and adversely affect children’s rights online.

I know the noble Baroness and other noble Lords are rightly concerned with protecting rights to free expression and access to information online for children and for adults. It is important that we strike the right balance, which is what we have tried to do with the government amendments in this group.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

To be clear, the point that I made about the BBFC was not to suggest a similar arrangement but to challenge the idea that we cannot categorise material of a sexualised nature. Building on the point made by the noble Lord, Lord Allan, perhaps we could think about it in terms of the amber light rather than the red light—in other words, something to think about.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I certainly will think about it, but the difficulty is the scale of the material and the speed with which we want these assessments to be made and that light to be lit, in order to make sure that people are properly protected.

My noble friend Lord Moylan asked about differing international terminology. In order for companies to operate in the United Kingdom they must have an understanding of the United Kingdom, including the English-language terms used in our legislation. He made a point about the Equality Act 2010. While the Bill uses the same language, it does not extend the Equality Act to this part of the Bill. In particular, it does not create a new offence.

The noble Baroness, Lady Fox, also mentioned the Equality Act when she asked about the phraseology relating to gender reassignment. We included this wording to ensure that the language used in the Bill matches Section 7(1) of the Equality Act 2010 and that gender reassignment has the same meaning in the Bill as it does in that legislation. As has been said by other noble Lords—

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

I clarify that what I said was aimed at protecting children. Somebody corrected me and asked, “Do you know that this says ‘abusive’?”—of course I do. What I suggested was that this is an area that is very contentious when we talk about introducing it to children. I am thinking about safeguarding children in this instance, not just copying and pasting a bit of an Act.

Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

I take this opportunity to ask my noble friend the Minister a question; I want some clarity about this. Would an abusive comment about a particular religion—let us say a religion that practised cannibalism or a historical religion that sacrificed babies, as we know was the norm in Carthage—count as “priority harmful content”? I appreciate that we are mapping the language of the Equality Act, but are we creating a new offence of blasphemy in this Bill?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

As was pointed out by others in the debate, the key provision in Amendment 172 is subsection (2) of the proposed new clause, which relates to:

“Content which is abusive and which targets any of the following characteristics”.


It must both be abusive and target the listed characteristics. It does not preclude legitimate debate about those things, but if it were abusive on the basis of those characteristics—rather akin to the debate we had in the previous group and the points raised by the noble Baroness, Lady Kennedy of The Shaws, about people making oblique threats, rather than targeting a particular person, by saying, “People of your characteristic should be abused in the following way”—it would be captured.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

I will keep this short, because I know that everyone wants to get on. It would be said that it is abusive to misgender someone; in the context of what is going on in sixth forms and schools, I suggest that this is a problem. It has been suggested that showing pictures of the Prophet Muhammad in an RE lesson—these are real-life events that happen offline—is abusive. I am suggesting that it is not as simple as saying the word “abusive” a lot. In this area, there is a highly contentious and politicised arena that I want to end, but I think that this will exacerbate, not help, it.

--- Later in debate ---
Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

My noble friend seemed to confirm what I said. If I wish to be abusive—in fact, I do wish to be abusive—about the Carthaginian religious practice of sacrificing babies to Moloch, and I were to do that in a way that came to the attention of children, would I be caught as having created “priority harmful content”? My noble friend appears to be saying yes.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Does my noble friend wish to do that and direct it at children?

Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

With respect, it does not say “directed at children”. Of course, I am safe in expressing that abuse in this forum, but if I were to do it, it came to the attention of children and it were abusive—because I do wish to be abusive about that practice—would I have created “priority harmful content”, about which action would have to be taken?

--- Later in debate ---
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

May I attempt to assist the Minister? This is the “amber” point described by the noble Lord, Lord Allan: “priority content” is not the same as “primary priority content”. Priority content is our amber light. Even the most erudite and scholarly description of baby eating is not appropriate for five year-olds. We do not let it go into “Bod” or any of the other programmes we all grew up on. This is about an amber warning: that user-to-user services must have processes that enable them to assess the risk of priority content and primary priority content. It is not black and white, as my noble friend is suggesting; it is genuinely amber.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, we may be slipping back into a Committee-style conversation. My noble friend Lord Moylan rightly says that this is the first chance we have had to examine this provision, which is a concession wrung out of the Government in Committee. As the noble Lord, Lord Stevenson, says, sometimes that is the price your Lordships’ House pays for winning these concessions, but it is an important point to scrutinise in the way that my noble friend Lord Moylan and the noble Baroness, Lady Fox, have done.

I will try to reassure my noble friend and the noble Baroness. This relates to the definition of a characteristic with which we began our debates today. To be a characteristic it has to be possessed by a person; therefore, the content that is abusive and targets any of the characteristics has to be harmful to an individual to meet the definition of harm. Further, it has to be material that would come to the attention of children in the way that the noble Baronesses who kindly leapt to my defence and added some clarity have set out. So my noble friend would be able to continue to criticise the polytheistic religions of the past and their tendencies to his heart’s content, but there would be protections in place if what he was saying was causing harm to an individual—targeting them on the basis of their race, religion or any of those other characteristics—if that person was a child. That is what noble Lords wanted in Committee, and that is what the Government have brought forward.

My noble friend and others asked why mis- and disinformation were not named as their own category of priority harmful content to children. Countering mis- and disinformation where it intersects with the named categories of primary priority or priority harmful content, rather than as its own issue, will ensure that children are protected from the mis- and disinformation narratives that present the greatest risk of harm to them. We recognise that mis- and disinformation is a broad and cross-cutting issue, and we therefore think the most appropriate response is to address directly the most prevalent and concerning harms associated with it; for example, dangerous challenges and hoax health advice for children to self-administer harmful substances. I assure noble Lords that any further harmful mis- and disinformation content will be captured as non-designated content where it presents a material risk of significant harm to an appreciable number of children.

In addition, the expert advisory committee on mis- and disinformation, established by Ofcom under the Bill, will have a wide remit in advising on the challenges of mis- and disinformation and how best to tackle them, including how they relate to children. Noble Lords may also have seen that the Government have recently tabled amendments to update Ofcom’s statutory media literacy duty. Ofcom will now be required to prioritise users’ awareness of and resilience to misinformation and disinformation online. This will include children and their awareness of and resilience to mis- and disinformation.

My noble friend Lady Harding of Winscombe talked about commercial harms. Harms exacerbated by the design and operation of a platform—that is, their commercial models—are covered in the Bill already through the risk assessment and safety duties. Financial harm, as used in government Amendment 237, is dealt with by a separate legal framework, including the Consumer Protection from Unfair Trading Regulations. This exemption ensures that there is no regulatory overlap.

The noble Lord, Lord Russell of Liverpool, elaborated on remarks made earlier by the noble Lord, Lord Stevenson of Balmacara, about their meeting looking at the incel movement, if it can be called that. I assure the noble Lord and others that Ofcom has a review and report duty and will be required to stay on top of changes in the online harms landscape and report to government on whether it recommends changes to the designated categories of content because of the emerging risks that it sees.

The noble Baroness, Lady Kidron, anticipated the debate we will have on Monday about functionalities and content. I am grateful to her for putting her name to so many of the amendments that we have brought forward. We will continue the discussions that we have been having on this point ahead of the debate on Monday. I do not want to anticipate that now, but I undertake to carry on those discussions.

In closing, I reiterate what I know is the shared objective across your Lordships’ House—to protect children from harmful content and activity. That runs through all the government amendments in this group, which cover the main categories of harmful content and activity that, sadly, too many children encounter online every day. Putting them in primary legislation enables children to be swiftly protected from encountering them. I therefore hope that noble Lords will be heartened by the amendments that we have brought forward in response to the discussion we had in Committee.

Amendment 31 agreed.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Moved by
32: Clause 6, page 5, line 29, at end insert—
“(ba) the duties about assessments related to adult user empowerment set out in section (Assessment duties: user empowerment),”
Member’s explanatory statement
This amendment ensures that the new duties in the new Clause proposed after Clause 11 in my name are imposed on providers of Category 1 services.
--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- Hansard - -

My Lords, as noble Lords will be aware, the Government removed the legal but harmful provisions from the Bill in another place, given concerns about freedom of expression. I know that many noble Lords would not have taken that approach, but I am grateful for their recognition of the will of the elected House in this regard as well as for their constructive contributions about ways of strengthening the Bill while continuing to respect that.

I am therefore glad to bring forward a package of amendments tabled in my name relating to adult safety. Among other things, these strengthen our existing approach to user empowerment and terms of service by rebalancing the power over the content adults see and interact with online, moving the choice away from unaccountable technology companies and towards individual users.

First, we are introducing a number of amendments—which I am pleased to say have the support of the Opposition Front Bench—that will place a comprehensive duty on category 1 providers to carry out a full assessment of the incidence of user empowerment content on their services. The amendments will mean that platforms can be held to account by Ofcom and their users when they fail to assess the incidence of this kind of content on their services or when they fail to offer their users an appropriate ability to control whether or not they view it.

Amendments 19 to 21 and 26—I am grateful to noble Lords opposite for putting their names to them—will strengthen the user empowerment content duty. Category 1 providers will now need proactively to ask their registered adult users how they would like the control features to be applied. We believe that these amendments achieve two important aims that your Lordships have been seeking from these duties: first, they make the control features more visible to registered adult users; and, secondly, they offer better protection for young adult users.

Amendments 55 and 56, tabled by the noble Lord, Lord Clement-Jones, my noble friend Lord Moylan and the noble Baroness, Lady Fox of Buckley, seek to provide users with a choice over how the tools are applied for each category of content set out in Clause 12(10), (11) and (12). The legislation gives platforms the flexibility to decide what tools they offer in compliance with Clause 12(2). A blanket approach is unlikely to be consistent with the duty on category 1 services to have particular regard to the importance of protecting users’ freedom of expression when putting these features in place. Additionally, the measures that Ofcom will recommend in its code of practice must take account of the impact on freedom of expression, and so are unlikely to amount to a blanket approach.

Amendments 58 and 63 would require providers to set and enforce consistent terms of service on how they identify the categories of content to which Clause 12(2) applies; and to apply the features to content only when they have reasonable grounds to infer that it is user empowerment content. I assure noble Lords that the Bill’s freedom of expression duties will prevent providers overapplying the features or adopting an inconsistent or capricious approach. If they do, Ofcom can take enforcement action.

Amendments 59, 64 and 181, tabled by the noble Lord, Lord Clement-Jones, seek to require that the user empowerment and user verification features are provided at no cost. I reassure the noble Lord that the effect of these amendments is already achieved by the drafting of Clause 12. Category 1 providers will be compliant with their duties only if they proactively ask all registered users whether or not they want to use the user empowerment content features, which would not be possible with a paywall. Amendment 181 is similar and applies to user verification. While the Bill does not specify that verification must be free of charge, category 1 providers can meet the duties in the Bill only by offering all adult users the option to verify themselves.

Turning to Amendment 204, tabled by the noble Baroness, Lady Finlay of Llandaff, I share her concern about the impact that self-harm and suicide content can have. However, as I said in Committee, the Bill goes a long way towards protecting both children and adults from this content. First, it includes the new criminal offence of encouraging or assisting self-harm. This then feeds through into the Bill’s illegal content duties. Companies will be required to take down such content when it is reported to them by users.

Beyond the illegal content duties, there are specific protections in place for children. The Government have tabled amendments designating content that encourages, promotes or provides instructions for suicide or self-harm as a category of primary priority content, meaning that services will have to prevent children of all ages encountering it. For adults, the Government listened to concerns and, as mentioned, have strengthened the user empowerment duties to make it easier for adult users to opt in to using them by offering a forced choice. We have made a careful decision, however, to balance these protections with users’ right to freedom of expression and therefore cannot require platforms to treat legal content accessed by adults in a prescribed way. That is why, although I share the noble Baroness’s concerns about the type of content that she mentions, I cannot accept her amendment and hope that she will agree not to press it.

The Bill’s existing duties require category 1 platforms to offer users the ability to verify their identity. Clause 12 requires category 1 platforms to offer users the ability to filter out users who have not verified their identity. Amendment 183 from my noble friend Lord Moylan seeks to give Ofcom the discretion to decide when it is and is not proportionate for category 1 services to offer users the ability to verify their identity. We do not believe that these duties will be excessively burdensome, given that they will apply only to category 1 companies, which have the resource and capacity to offer such tools.

Amendment 182 would require platforms to offer users the option to make their verification status visible. The existing duty in Clause 57, in combination with the duty in Clause 12, will already provide significant protections for adults from anonymous abuse. Adult users will now be able to verify their own status and decide to interact only with other verified users, whether or not their status is visible. We do not believe that this amendment would provide additional protections.

The Government carefully considered mandating that all users display their verification status. While that might improve safety for some users, it would be detrimental to vulnerable users, who may need to remain anonymous for perfectly justifiable reasons. Further government amendments in my name will expand the types of information that Ofcom can require category 1, 2A and 2B providers to publish in their transparency reports in relation to user empowerment content.

Separately, but also related to transparency, government Amendments 189 and 202 make changes to Clause 67 and Schedule 8. These relate to category 1 providers’ duties to create clear and accessible terms of service and apply them consistently and transparently. Our amendments tighten these parts of the Bill so that all the terms through which providers might indicate that a certain type of content is not allowed on their service are captured by these duties.

I hope that noble Lords will therefore accept the Government amendments in this group and that my anticipatory remarks about their amendments will give them some food for thought as they make their contributions. I beg to move.

--- Later in debate ---
Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- View Speech - Hansard - - - Excerpts

My Lords, I am happy to acknowledge and recognise what the Government did when they created user empowerment duties to replace legal but harmful. I think they were trying to counter the dangers of over-paternalism and illiberalism in obliging providers to protect adult users from content that would allegedly cause them harm.

At least the new provisions brought into the Bill have a completely different philosophy. They enhance users’ freedom as individuals, allowing them to apply voluntary content filters and to exercise freedom of choice, on the principle that adults can make decisions for themselves.

In case anyone panics, I am not making a philosophical speech. I am reminding the Government that that is what they said to us—to everybody—“We are getting rid of legal but harmful because we believe in this principle”. I am worried that some of the amendments seem to be trying to backtrack from that different basis of the Bill—and that more liberal philosophy—to go back to the old legal but harmful. I say to the noble Lord, Lord Allan of Hallam, that the cat is distinctly not dead.

The purpose of Amendment 56 is to try to ensure that providers also cannot thwart the purpose of Clause 12 and make it more censorious and paternalistic. I am not convinced that the Government needed to compromise on this as I think Amendment 60 just muddies the waters and fudges the important principle that the Government themselves originally established.

Amendment 56 says that the default must be no filtering at all. Then users have to make an active decision to switch on the filtering. The default is that you should be exposed to a full flow of ideas and, if you do not want that, you have to actively decide not to and say that you want a bowdlerised or sanitised version.

Amendment 56 takes it a bit further, in paragraph (b), and applies different levels of filtering in terms of content of democratic importance and journalistic content. In the Bill itself, the Government accept the exceptional nature of those categories of content, and this just allows users to do the same and say, “No; I might want to filter some things out but bear in mind the exceptional importance of democratic and journalistic content”. I worry that the government amendments signal to users that certain ideas are dangerous and must be hidden. That is my big concern. In other words, they might be legal but they are harmful: that is what I think these amendments try to counter.

One of the things that worries me about the Bill is the danger of echo chambers. I know we are concentrating on harms, but I think echo chambers are harmful. I started today quite early at Blue Orchid at 55 Broadway with a big crowd of sixth formers involved in debating matters. I complimented Keir Starmer on his speech on the importance of oracy and encouraging young people to speak. I stressed to all the year 12 and year 13 young people that the important thing was that they spoke out but also that they listened to contrary opinions and got out of their safe spaces and echo chambers. They were debating very difficult topics such as commercial surrogacy, cancel culture and the risks of contact sports. I am saying all that to them and then I am thinking, “We have now got a piece of legislation that says you can filter out all the stuff you do not want to hear and create your own safe space”. So I just get anxious that we do not inadvertently encourage in the young—I know this is for all adults—that antidemocratic tendency to not want to hear what you do not want to hear, even when it would be good to hear as many opinions as possible.

I also want to press the Minister on the problem of filtering material that targets race, religion, sex, sexual orientation, disability and gender reassignment. I keep trying to raise the problem that it could lead to diverse philosophical views around those subjects also being removed by overzealous filtering. You might think that you know what you are asking to be filtered out. If you say you want to filter out material that is anti-religion, you might not mean that you do not want any debates on religious tolerance. For example, there was that major controversy over the film “The Lady of Heaven”. I know the Minister was interested, as I was, in the dangers of censorship in relation to that. Having said, “Don’t target me for my religion”, you would not want to find yourself unable to access that debate.

I think there is a danger that we are handing a lot of power to filterers to make filtering decisions based on their values when we are not clear about what they are. Look at what has happened with the banks in the last few days: they have closed down people’s bank accounts because they disagree with those customers’ values. Again, we say “Don’t target on race”, but I have been having lots of arguments with people recently who have accused the Government, through their Illegal Migration Bill, of being racist. I think we just need to know that we are not accepting an ideological filtering of what we see.

Amendment 63 is key because it requires providers’ terms of service to include provisions about how content to which Clause 12(2) applies is identified, precisely to try to counter these problems. It imposes a duty on providers to apply those provisions consistently, as the noble Lord, Lord Moylan, explained. The point that providers have to set out how they identify content that is allegedly hostile to religion, for example, or racially abusive, is important because this is about empowering users. Users need to know whether this will be done by machine learning or by a human. Do they look for red flags and, if so, what are the red flags? How are these things decided? That means that providers have to state clearly, and be accountable for, their definition of any criteria that could justify them filtering out and disturbing the flow of democratic information. It is all about transparency and accountability in that sense.

Finally, in relation to Amendment 183, I am worried about the notion of filtering out content from unverified users for a range of reasons. It indicates somehow that there is a direct link between being unverified or anonymous and harm or being dodgy, which I think is illegitimate. It has already been explained that there will be a detrimental impact on certain organisations—we have talked about Reddit, but I like to remember Mumsnet. There are quite a lot of organisations with community-centred models, where the structure is that influencers broadcast to their followers and where there are pseudonymous users. Is the requirement to filter out those contributors likely to lead to those models collapsing? I need to be reassured on this because I am not convinced at all. As has been pointed out, there will be a two-tier internet because those who are unable or unwilling to disclose their identity online or to be verified by someone would be or could be shut out from public discussions. That is a very dangerous place to have ended up, even though I am sure it is not what the Government intend.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, I am grateful for the broad, if not universal, support for the amendments that we have brought forward following the points raised in Committee. I apologise for anticipating noble Lords’ arguments, but I am happy to expand on my remarks in light of what they have said.

My noble friend Lord Moylan raised the question of non-verified user duties and crowdsourced platforms. The Government recognise concerns about how the non-verified user duties will work with different functionalities and platforms, and we have engaged extensively on this issue. These duties apply only to category 1 platforms—those with the largest reach and influence over public discourse. It is therefore right that such platforms have additional duties to empower their adult users. We anticipate that these features will be used in circumstances where vulnerable adults wish to shield themselves from anonymous abuse. If users decide that these features are restricting their experience on a particular platform, they can simply choose not to use them. In addition, before these duties come into force, Ofcom will be required to consult affected providers regarding the codes of practice, at which point it will consider how these duties might interact with various functionalities.

My noble friend and the noble Lord, Lord Allan of Hallam, raised the potential for being bombarded with pop-ups because of the forced-choice approach that we have taken. These amendments have been carefully drafted to minimise unnecessary prompts or pop-ups. That is why we have specified that the requirement to proactively ask users how they want these tools to be applied is applicable only to registered users. This approach ensures that users will be prompted to make a decision only once, unless they choose to ignore it. After a decision has been made, the provider should save this preference and the user should not be prompted to make the choice again.

The noble Lord, Lord Clement-Jones, talked further about his amendments on the cost of user empowerment tools as a core safety duty in the Bill. Category 1 providers will not be able to put the user empowerment tools in Clause 12 behind a paywall and still be compliant with their duties. That is because they will need to offer them to users at the first possible opportunity, which they will be unable to do if they are behind a paywall. The wording of Clause 12(2) makes it clear that providers have a duty to include user empowerment features that an adult user may use or apply.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

The Minister may not have the information today, but I would be happy to get it in writing. Can he clarify exactly what will be expected of a service that already prohibits all the Clause 12 bad stuff in its terms of service?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay
- Hansard - -

I will happily write to the noble Lord on that.

Clause 12(4) further sets out that all such user empowerment content tools must be made available to all adult users and be easy to access.

The noble Lord, Lord Clement-Jones, on behalf of the noble Baroness, Lady Finlay, talked about people who will seek out suicide, self-harm or eating-disorder content. While the Bill will not prevent adults from seeking out legal content, it will introduce significant protections for adults from some of the most harmful content. The duties relating to category 1 services’ terms of service are expected hugely to improve companies’ own policing of their sites. Where this content is legal and in breach of the company’s terms of service, the Bill will force the company to take it down.

We are going even further by introducing a new user empowerment content-assessment duty. This will mean that where content relates to eating disorders, for instance, but is not illegal, category 1 providers will need fully to assess the incidence of this content on their service. They will need clearly to publish this information in accessible terms of service, so users will be able to find out what they can expect on a particular service. Alternatively, if they choose to allow suicide, self-harm or eating disorder content which falls within the definition set out in Clause 12, they will need proactively to ask users how they would like the user empowerment content features to be applied.

My noble friend Lady Morgan was right to raise the impact on vulnerable people or people with disabilities. While we anticipate that the changes we have made will benefit all adult users, we expect them particularly to benefit those who may otherwise have found it difficult to find and use the user empowerment content features independently—for instance, some users with disabilities. That is because the onus will now be on category 1 providers proactively to ask their registered adult users whether they would like these tools to be applied at the first possible opportunity. The requirement also remains to ensure that the tools are easy to access and to set out clearly what tools are on offer and how users can take advantage of them.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, does the Minister have any more to say on identity verification?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I am being encouraged to be brief so, if I may, I will write to the noble Lord on that point.

Amendment 32 agreed.
--- Later in debate ---
Moved by
33: Clause 6, page 5, line 37, leave out “duty about record-keeping set out in section 19(9)” and insert “duties about record-keeping set out in section 19(8A) and (9)”
Member’s explanatory statement
This amendment ensures that the new duties in Clause 19 proposed by amendments in my name to that clause are imposed on providers of Category 1 services.
--- Later in debate ---
Moved by
34: Clause 10, page 9, line 13, after “8” insert “and, in the case of services likely to be accessed by children which are Category 1 services, the duties about assessments set out in section (Assessment duties: user empowerment)”
Member’s explanatory statement
This amendment inserts a signpost to the new duties imposed on providers of Category 1 services by the new Clause proposed after Clause 11 in my name.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, I will speak to the government amendments now but not anticipate the non-government amendments in this group.

As noble Lords know, protecting children is a key priority for this Bill. We have listened to concerns raised across your Lordships’ House about ensuring that it includes the most robust protections for children, particularly from harmful content such as pornography. We also recognise the strength of feeling about ensuring the effective use of age-assurance measures, by which we mean age verification and age estimation, given the important role they will have in keeping children safe online.

I thank the noble Baroness, Lady Kidron, and my noble friends Lady Harding of Winscombe and Lord Bethell in particular for their continued collaboration over the past few months on these issues. I am very glad to have tabled a significant package of amendments on age assurance. These are designed to ensure that children are prevented from accessing pornography, whether it is published by providers in scope of the Part 5 duties or allowed by user-to-user services that are subject to Part 3 duties. The Bill will be explicit that services will need to use highly effective age verification or age estimation to meet these new duties.

These amendments will also ensure that there is a clear, privacy-preserving and future-proof framework governing the use of age assurance, which will be overseen by Ofcom. Our amendments will, for the first time, explicitly require relevant providers to use age verification or age estimation to protect children from pornography. Publishers of pornographic content, which are regulated in Part 5, will need to use age verification or age estimation to ensure that children are not normally able to encounter content which is regulated provider pornographic content on their service.

Further amendments will ensure that, where such tools are proactive technology, Ofcom may also require their use for Part 5 providers to ensure compliance. Amendments 279 and 280 make further definitional changes to proactive technology to ensure that it can be recommended or required for this purpose. To ensure parity across all regulated pornographic content in the Bill, user-to-user providers which allow pornography under their terms of service will also need to use age verification or age estimation to prevent children encountering pornography where they identify such content on their service. Providers covered by the new duties will also need to ensure that their use of these measures meets a clear, objective and high bar for effectiveness. The measures will need to be highly effective at correctly determining whether a particular user is a child. This new bar will achieve the intended outcome behind the amendments which we looked at in Committee, seeking to introduce a standard of “beyond reasonable doubt” for age assurance for pornography, while avoiding the risk of legal challenge or inadvertent loopholes.

To ensure that providers are using measures which meet this new bar, the amendments will also require Ofcom to set out, in its guidance for Part 5 providers, examples of age-verification and age-estimation measures which are highly effective in determining whether a particular user is a child. Similarly, in codes of practice for Part 3 providers, Ofcom will need to recommend age-verification or age-estimation measures which can be used to meet the new duty to use highly effective age assurance. This will meet the intent of amendments tabled in Committee seeking to require providers to use measures in a manner approved by Ofcom.

I confirm that the new requirement for Part 3 providers will apply to all categories of primary priority content that is harmful to children, not just pornography. This will mean that providers which allow content promoting or glorifying suicide, self-harm and eating disorders will also be required to use age verification or age estimation to protect children where they identify such content on their service.

Further amendments clarify that a provider can conclude that children cannot access a service—and therefore that the service is not subject to the relevant children’s safety duty—only if it uses age verification or age estimation to ensure that children are not normally able to access the service. This will ensure consistency with the new duties on Part 3 providers to use these measures to prevent children’s access to primary priority content. Amendment 34 inserts a reference to the new user empowerment duties imposed on category 1 providers in the child safety duties.

Amendment 214 will require Part 5 providers to publish a publicly available summary of the age-verification or age-estimation measures that they are using to ensure that children are not normally able to encounter content that is regulated provider pornographic content on their service. This will increase transparency for users on the measures that providers are using to protect children. It also aligns the duties on Part 5 providers with the existing duties on Part 3 providers to include clear information in terms of service on child protection measures or, for search engines, a publicly available statement on such measures.

I thank the noble Baroness, Lady Kidron, for her tireless work relating to Amendment 124, which sets out a list of age-assurance principles. This amendment clearly sets out the important considerations around the use of age-assurance technologies, which Ofcom must have regard to when producing its codes of practice. Amendment 216 sets out the subset of principles which apply to Part 5 guidance. Together, these amendments ensure that providers are deploying age-assurance technologies in an appropriate manner. These principles appear as a full list in Schedule 4. This ensures that the principles can be found together in one place in the Bill. The wider duties set out in the Bill ensure that the same high standards apply to both Part 3 and Part 5 providers. These principles have been carefully drafted to avoid restating existing duties in the Bill. In accordance with good legislative drafting practice, the principles also do not include reference to other legislation which already directly applies to providers. In its relevant guidance and codes, however, Ofcom may include such references as it deems appropriate.

Finally, I highlight the critical importance of ensuring that users’ privacy is protected throughout the age-assurance processes. I make it clear that privacy has been represented in these principles to the furthest degree possible, by referring to the strong safeguards for user privacy already set out in the Bill.

In recognition of these new principles and to avoid duplication, Amendment 127 requires Ofcom to refer to the age-assurance principles, rather than to the proactive technology principles, when recommending age-assurance technologies that are also proactive technology.

We have listened to the points raised by noble Lords about the importance of having clear and robust definitions in the Bill for age assurance, age verification and age estimation. Amendment 277 brings forward those definitions. We have also made it clear that self-declared age, without additional, more robust measures, is not to be regarded as age verification or age estimation for compliance with duties set out in the Bill. Amendment 278 aligns the definition of proactive technology with these new definitions.

The Government are clear that the Bill’s protections must be implemented as quickly as is feasible. This entails a complex programme of work for the Government and Ofcom, as well as robust parliamentary scrutiny of many parts of the regime. All of this will take time to deliver. It is right, however, that we set clear expectations for when the most pressing parts of the regulation—those targeting illegal content and protecting children—should be in place. These amendments create an 18-month statutory deadline from the day the Bill is passed for Ofcom’s implementation of those areas. By this point, Ofcom must submit draft codes of practice to the Secretary of State to be laid in Parliament and publish its final guidance relating to illegal content duties, duties about content harmful to children and duties about pornography content in Part 5. This also includes relevant cross-cutting duties, such as content reporting procedures, which are relevant to illegal content and content harmful to children.

In line with convention, most of the Bill’s substantive provisions will be commenced two months after Royal Assent. These amendments ensure that a set of specific clauses will commence earlier—on the day of Royal Assent—allowing Ofcom to begin vital implementation work sooner than it otherwise would have done. Commencing these clauses early will enable Ofcom to launch its consultation on draft codes of practice for illegal content duties shortly after Royal Assent.

Amendment 271 introduces a new duty on Ofcom to produce and publish a report on in-scope providers’ use of age-assurance technologies, and for this to be done within 18 months of the first date on which both Clauses 11 and 72(2), on pornography duties, are in force. I thank the noble Lord, Lord Allan of Hallam, for the amendment he proposed in Committee, to which this amendment responds. We believe that this amendment will improve transparency in how age-assurance solutions are being deployed by providers, and the effectiveness of those solutions.

Finally, we are also making a number of consequential and technical amendments to the Bill to split Clauses 11 and 25 into two parts. This is to ensure these do not become unwieldy and that the duties are clear for providers and for Ofcom. I beg to move.

Debate on Amendment 34 adjourned.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Debate on Amendment 34 resumed.
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - -

We began this group on the previous day on Report, and I concluded my remarks, so it is now for other noble Lords to contribute on the amendments that I spoke to on Thursday.

Lord Bethell Portrait Lord Bethell (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I rise emphatically to welcome the government amendments in this group. They are a thoughtful and fulsome answer to the serious concerns expressed from the four corners of the Chamber by a great many noble Lords at Second Reading and in Committee about the treatment of age verification for pornography and online harms. For this, I express my profound thanks to my noble friend the Minister, the Secretary of State, the Bill team, the Ofcom officials and all those who have worked so hard to refine this important Bill. This is a moment when the legislative team has clearly listened and done everything it possibly can to close the gap. It is very much the House of Lords at its best.

It is worth mentioning the exceptionally broad alliance of noble Lords who have worked so hard on this issue, particularly my compadres, my noble friend Lady Harding, the noble Baroness, Lady Kidron, and the right reverend Prelate the Bishop of Oxford, who all signed many of the draft amendments. There are the Front-Benchers, including the noble Lords, Lord Stevenson, Lord Knight, Lord Clement-Jones and Lord Allan of Hallam, and the noble Baroness, Lady Merron. There are the Back-Benchers behind me, including my noble friends Lady Jenkin and Lord Farmer, the noble Lords, Lord Morrow, Lord Browne and Lord Dodds, and the noble Baroness, Lady Foster. Of those in front of me, there are the noble Baronesses, Lady Benjamin and Lady Ritchie, and there is also a number too large for me to mention, from all across the House.

I very much welcome the sense of pragmatism and proportionality at the heart of the Online Safety Bill. I welcome the central use of risk assessment as a vital tool for policy implementation and the recognition that some harms are worse than others, that some children need more protection than others, that we are legislating for future technologies that we do not know much about and that we must engage industry to achieve effective implementation. As a veteran of the Communications Act 2003, I strongly support the need for enabling legislation that has agility and a broad amount of support to stand the test of time.

--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, this has been a good debate, perhaps unfairly curtailed in terms of the range of voices we have heard, but I am sure the points we wanted to have on the table are there and we can use them in summarising the debate we have had so far.

I welcome the Government’s amendments in this group. They have gone a long way to resolving a number of the difficulties that were left after the Digital Economy Act. As the noble Lord, Lord Clement-Jones, has said, we now have Part 3 and Part 5 hooked together in a consistent and effective way and definitions of “age verification” and “age estimation”. The noble Lord, Lord Grade, is sadly not in his place today—I normally judge the quality of the debate by the angle at which he resides in that top corner there. He is not here to judge it, but I am sure he would be upright and very excited by what we have been hearing so far. His point about the need for companies to be clearly responsible for what they serve up through their services is really important in what we are saying here today.

However, despite the welcome links across to the ICO age-appropriate design code, with the concerns we have been expressing on privacy there are still a number of questions which I think the Minister will want to deal with, either today or in writing. Several noble Lords have raised the question of what “proportionate” means in this area. I have mentioned it in other speeches in other groups. We all want the overall system to be proportionate in the way in which it allocates the powers, duties and responsibilities on the companies providing us with the services they do. But there is an exception for the question of whether children should have access to material which they should not get because of legal constraints, and I hope that “proportionate” is not being used in any sense to evade that.

I say that particularly because the concern has been raised in other debates—and I would be grateful if the Minister could make sure when he comes to respond that this issue is addressed—that smaller companies with less robust track records in terms of their income and expenditures might be able to plead that some of the responsibilities outlined in this section of the Bill do not apply to them because otherwise it would bear on their ability to continue. That would be a complete travesty of where we are trying to get to here, which is an absolute bar on children having access to material that is illegal or in the lists now in the Bill in terms of priority content.

The second worry that people have raised is: will the system that is set up here actually work in practice, particularly if it does not apply to all companies? That relates perhaps to the other half of the coin that I have just mentioned.

The third point, raised by a number of Peers, is: where does all this sit in relation to the review of pornography which was announced recently? A number of questions have been asked about issues which the Minister may be unable to respond to, but I suspect he may also want to write to us on the wider issue of timing and the terms of reference once they are settled.

I think we need to know this as we reach the end of the progress on this Bill, because you cannot expect a system being set up with the powers that are being given to Ofcom to work happily and well if Ofcom knows it is being reviewed at the same time. I hope that some consideration will be given to how we get the system up and running, even if the timescale is now tighter than it was, if at the same time a review rightly positioned to try to look at the wider range of pornography is going to impact on its work.

I want to end on the question raised by a large number of noble Lords: how does all this work sit with privacy? Where information and data are being shared on the basis of assuring access to services, there will be a worry if privacy is not ensured. The amendments tabled by the noble Baroness, Lady Kidron, are very salient to this. I look forward to the Minister’s response to them.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, I am sorry that the noble Baroness, Lady Benjamin, was unable to be here for the start of the debate on Thursday and therefore that we have not had the benefit of hearing from her today. I am very glad that she was here to hear the richly deserved plaudits from across the House for her years of campaigning on this issue.

I am very glad to have had the opportunity to discuss matters directly with her including, when it was first announced, the review that we have launched. I am pleased that she gave it a conditional thumbs up. Many of her points have been picked up by other noble Lords today. I did not expect anything more than a conditional thumbs up from her, given her commitment to getting this absolutely right. I am glad that she is here to hear some of the answers that I am able to set out, but I know that our discussions would have continued even if she had been able to speak today and that her campaigns on this important issue will not cease; she has been tireless in them. I am very grateful to her, my noble friends Lord Bethell and Lady Harding, the noble Baroness, Lady Kidron, and many others who have been working hard on this.

Let me pick up on their questions and those of the noble Baroness, Lady Ritchie of Downpatrick, and others on the review we announced last week. It will focus on the current regulatory landscape and how to achieve better alignment of online and offline regulation of commercial pornography. It will also look at the effectiveness of the criminal law and the response of the criminal justice system relating to pornography. This would focus primarily on the approach taken by law enforcement agencies and the Crown Prosecution Service, including considering whether changes to the criminal law would address the challenges identified.

The review will be informed by significant expert input from government departments across Whitehall, the Crown Prosecution Service and law enforcement agencies, as well as through consultation with the industry and with civil society organisations and regulators including, as the noble Baroness, Lady Ritchie, rightly says, some of the many NGOs that do important work in this area. It will be a cross-government effort. It will include but not be limited to input from the Ministry of Justice, the Home Office, the Department for Science, Innovation and Technology and my own Department for Culture, Media and Sport. I assure my noble friend Lord Farmer that other government departments will of course be invited to give their thoughts. It is not an exhaustive list.

I detected the enthusiasm for further details from noble Lords across the House. I am very happy to write as soon as I have more details on the review, to keep noble Lords fully informed. I can be clear that we expect the review to be complete within 12 months. The Government are committed to undertaking it in a timely fashion so that any additional safeguards for protecting UK users of online services can be put in place as swiftly as possible.

My noble friend Lord Bethell asked about international alignment and protecting Britain for investment. We continue to lead global discussions and engagement with our international partners to develop common approaches to online safety while delivering on our ambition to make the UK the safest place in the world to be online.

The noble Baroness, Lady Kidron, asked about the new requirements. They apply only to Part 3 providers, which allow pornography or other types of primary priority content on their service. Providers that prohibit this content under their terms of service for all users will not be required to use age verification or age estimation. In practice, we expect services that prohibit this content to use other measures to meet their duties, such as effective content moderation and user reporting. This would protect children from this content instead of requiring measures that would restrict children from seeing content that is not allowed on the service in the first place.

These providers can still use age verification and age estimation to comply with the existing duty to prevent children encountering primary priority content. Ofcom can still recommend age-verification and age-estimation measures in codes of practice for these providers where proportionate. On the noble Baroness’s second amendment, relating to Schedule 4, Ofcom may refer to the age-assurance principles set out in Schedule 4 in its children’s codes of practice.

On the 18-month timetable, I can confirm that 18 months is a backstop and not a target. Our aim is to have the regime in force as quickly as possible while making sure that services understand their new duties. Ofcom has set out in its implementation road map that it intends to publish draft guidance under Part 5 this autumn and draft children’s codes next spring.

The noble Baroness, Lady Ritchie, also asked about implementation timetables. I can confirm that Part 3 and Part 5 duties will be implemented at the same time. Ofcom will publish draft guidance shortly after Royal Assent for Part 5 duties and codes for the illegal content duties in Part 3. Draft codes for Part 3 children’s duties will follow in spring next year. Some Part 3 duties relating to category 1 services will be implemented later, after the categorisation thresholds have been set in secondary legislation.

The noble Lord, Lord Allan of Hallam, asked about interoperability. We have been careful to ensure that the Bill is technology neutral and to allow for innovation across the age-assurance market. We have also included a principle on interoperability in the new list of age-assurance principles in Schedule 4 and the Part 5 guidance.

At the beginning of the debate, on the previous day on Report, I outlined the government amendments in this group. There are some others, which noble Lords have spoken to. Amendments 125 and 217, from the noble Baroness, Lady Kidron, seek to add additional principles on user privacy to the new lists of age-assurance principles for both Part 3 and 5, which are brought in by Amendments 124 and 216. There are already strong safeguards for user privacy in the Bill. Part 3 and 5 providers will need to have regard to the importance of protecting users’ privacy when putting in place measures such as age verification or estimation. Ofcom will be required to set out, in codes of practice for Part 3 providers and in guidance for Part 5 providers, how they can meet these duties relating to privacy. Furthermore, companies that use age-verification or age-estimation solutions will need to comply with the UK’s robust data protection laws or face enforcement action.

Adding the proposed new principles would, we fear, introduce confusion about the nature of the privacy duties set out in the Bill. Courts are likely to assume that the additions are intended to mean something different from the provisions already in the Bill relating to privacy. The new amendments before your Lordships imply that privacy rights are unqualified and that data can never be used for more than one purpose, which is not the case. That would introduce confusion about the nature of—

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I apologise to the Minister. Can he write giving chapter and verse for that particular passage by reference to the contents of the Bill?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I am very happy to do that. That would probably be better than me trying to do so at length from the Dispatch Box.

Government Amendment 124 also reinforces the importance of protecting children’s privacy, including data protection, by ensuring that Ofcom will need to have regard to standards set out under Section 123 of the Data Protection Act 2018 in the age-appropriate design code. I hope that explains why we cannot accept Amendments 125 or 217.

The noble Baroness, Lady Fox, has Amendment 184 in this group and was unable to speak to it, but I am very happy to respond to it and the way she set it out on the Marshalled List. It seeks to place a new duty on Ofcom to evaluate whether internet service providers, internet-connected devices or individual websites should undertake user-identification and age-assurance checks. This duty would mean that such an evaluation would be needed before Ofcom produces guidance for regulated services to meet their duties under Clauses 16 and 72.

Following this evaluation, Ofcom would need to produce guidance on age-verification and age-assurance systems, which consider cybersecurity and a range of privacy considerations, to be laid before and approved by Parliament. The obligation for Ofcom to evaluate age assurance, included in the noble Baroness’s amendment, is already dealt with by Amendment 271, which the Government have tabled to place a new duty on Ofcom to publish a report on the effectiveness of age-assurance solutions. That will specifically include consideration of cost to business, and privacy, including the processing of personal data.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I just realised that I forgot to thank the Government for Amendment 271, which reflected something I raised in Committee. I will reflect back to the Minister that, as is reinforced by his response now, it goes precisely where I wanted it to. That is to make sure—I have raised this many times—that we are not implementing another cookie banner, but are implementing something and then going back to ask, “Did it work as we intended? Were the costs proportionate to what we achieved?” I want to put on the record that I appreciate Amendment 271.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I appreciate the noble Lord’s interjection and, indeed, his engagement on this issue, which has informed the amendments that we have tabled.

In relation to the amendment of the noble Baroness, Lady Fox, as I set out, there are already robust safeguards for user privacy in the Bill. I have already mentioned Amendment 124, which puts age-assurance principles in the Bill. These require Ofcom to have regard, when producing its codes of practice on the use of age assurance, to the principle of protecting the privacy of users, including data protection. We think that the noble Baroness’s amendment is also unnecessary. I hope that she and the noble Baroness, Lady Kidron, will be willing to not move their amendments and to support the government amendments in the group.

Amendment 34 agreed.
--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

There is always a simple question. We are in a bit of a mess—again. When I said at Second Reading that I thought we should try to work together, as was picked up by the noble Baroness in her powerful speech, to get the best Bill possible out of what we had before us, I really did not know what I was saying. Emotion caught me and I ripped up a brilliant speech which will never see the light of day and decided to wing it. I ended up by saying that I thought we should do the unthinkable in this House—the unthinkable in politics, possibly—and try to work together to get the Bill to come right. As the noble Lord, Lord Clement-Jones, pointed out, I do not think I have ever seen, in my time in this House, so many government amendments setting out a huge number of what we used to call concessions. I am not going to call them concessions—they are improvements to the Bill. We should pay tribute to the Minister, who has guided his extensive team, who are listening anxiously as we speak, in the good work they have been doing for some time, getting questioned quite seriously about where it is taking us.

The noble Lord, Lord Clement-Jones, is quite right to pick up what the pre-legislative scrutiny committee said about this aspect of the work we are doing today and what is in the Bill. We have not really nailed the two big things that social media companies ask: this amplification effect, where a single tweet—or thread, let us call it now—can go spinning around the world and gather support, comment, criticism, complaint, anger and all sorts of things that we probably do not really understand in the short period of time it takes to be read and reacted to. That amplification is not something we see in the real world; we do not really understand it and I am not quite sure we have got to the bottom of where we should be going at this stage.

The second most important point—the point we are stuck on at the moment; this rock, as it were, in the ocean—is the commercial pressure which, of course, drives the way in which companies operate. They are in it for the money, not the social purpose. They did not create public spaces for people to discuss the world because they think it is a good thing. There is no public service in this—this is a commercial decision to get as much money as possible from as many people as possible and, boy, are they successful.

But commercial pressures can have harms; they create harms in ways that we have discussed, and the Bill reflects many of those. This narrow difference between the way the Bill describes content, which is meant to include many of the things we have been talking about today—the four Cs that have been brought into the debate helpfully in recent months—does not really deal with the commercial pressures under which people are placed because of the way in which they deal with social media. We do not think the Bill is as clear as it could be; nor does it achieve as much as it should in trying to deal with that issue.

That is in part to do with the structure. It is almost beyond doubt that the sensibility of what we are trying to achieve here is in the Bill, but it is there at such a level of opacity that it does not have the clarity of the messages we have heard today from those who have spoken about individuals—Milly and that sort of story—and the impact on people. Even the noble Lord, Lord Bethell, whose swimming exploits we must admire, is an unwitting victim of the drive of commercial pressures that sees him in his underwear at inappropriate moments in order that they should seek the profits from that. I think it is great, but I wonder why.

I want to set the Minister a task: to convince us, now that we are at the bar, that when he says that this matter is still in play, he realises what that must imply and will give us a guarantee that we will be able to gain from the additional time that he seeks to get this to settle. There is a case, which I hope he will agree to, for having in the Bill an overarching statement about the need to separate out the harms that arise from content and the harms that arise from the system discussions and debates we have been having today where content is absent. I suggest that, in going back to Clause 1, the overarching objectives clause, it might well be worth seeing whether that might be strengthened so that it covers this impact, so that the first thing to read in the Bill is a sense that we embrace, understand and will act to improve this question of harm arising absent content. There is a case for putting into Clauses 10, 11, 25 and 82 the wording in Amendments 35, 36, 37A and 240, in the name of the noble Baroness, Lady Kidron, and to use those as a way of making sure that every aspect of the journey through which social media companies must go to fulfil the duties set out in the Bill by Ofcom reflects both the content that is received and the design choices made by those companies in bringing forward those proposals for material content harms and the harms that arise from the design choices. Clauses 208 and 209 also have to provide a better consideration of how one describes harms so that they are not always apparently linked to content.

That is a very high hurdle, particularly because my favourite topic of how this House works will be engaged. We have, technically, already passed Clause 1; an amendment was debated and approved, and now appears in versions of the Bill. We are about to finish with Clauses 10 and 11 today, so we are effectively saying to the Minister that he must accept that there are deficiencies in the amendments that have already been passed or would be, if we were to pass Amendments 35, 36, 37A, 85 and 240 in the name of the noble Baroness, Lady Kidron, and others. It is not impossible, and I understand that it would be perfectly reasonable, for the Government to bring back a series of amendments on Third Reading reflecting on the way in which the previous provisions do not fulfil the aspirations expressed all around the House, and therefore there is a need to change them. Given the series of conversations throughout this debate—my phone is red hot with the exchanges taking place, and we do not have a clear signal as to where that will end up—it is entirely up to the Minister to convince the House whether these discussions are worth it.

To vote on this when we are so close seems ridiculous, because I am sure that if there is time, we can make this work. But time is not always available, and it will be up to the Minister to convince us that we should not vote and up to the noble Baroness to decide whether she wishes to test the opinion of the House. We have a three-line Whip on, and we will support her. I do not think that it is necessary to vote, however—we can make this work. I appeal to the Minister to get over the bar and tell us how we are to do it.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, I am very grateful for the discussion we have had today and the parallel discussions that have accompanied it, as well as the many conversations we have had, not just over the months we have been debating the Bill but over the past few days.

I will turn in a moment to the amendments which have been the focus of the debate, but let me first say a bit about the amendments in this group that stand in my name. As noble Lords have kindly noted, we have brought forward a number of changes, informed by the discussions we have had in Committee and directly with noble Lords who have taken an interest in the Bill for a long time.

Government Amendments 281C, 281D, 281E and 281G relate to the Bill’s interpretation of “harm”, which is set out in Clause 209. We touched on that briefly in our debate on Thursday. The amendments respond to concerns which I have discussed with many across your Lordships’ House that the Bill does not clearly acknowledge that harm and risk can be cumulative. The amendments change the Bill to make that point explicit. Government Amendment 281D makes it clear that harm may be compounded in instances where content is repeatedly encountered by an individual user. That includes, but is not limited to, instances where content is repeatedly encountered as a result of algorithms or functionalities on a service. Government Amendment 281E addresses instances in which the combination of multiple functionalities on a service cumulatively drives up the risk of harm.

Those amendments go hand in hand with other changes that the Government have made on Report to strengthen protections for children. Government Amendment 1, for instance, which we discussed at the beginning of Report, makes it clear that services must be safe by design and that providers must tackle harms which arise from the design and operation of their service. Government Amendments 171 and 172 set out on the face of the Bill the categories of “primary priority” and “priority” content which is harmful to children to allow the protections for children to be implemented as swiftly as possible following Royal Assent. As these amendments demonstrate, the Government have indeed listened to concerns which have been raised from all corners of your Lordships’ House and made significant changes to strengthen the Bill’s protections for children. I agree that it has been a model of the way in which your Lordships’ House operates, and the Bill has benefited from it.

Let me turn to the amendments in the name of the noble Baroness, Lady Kidron. I am very grateful for her many hours of discussion on these specific points, as well as her years of campaigning which led to them. We have come a long way and made a lot of progress on this issue since the discussion at the start of Committee. The nature of online risk versus harm is one which we have gone over extensively. I certainly accept the points that the noble Baroness makes; I know how heartfelt they are and how they are informed by her experience sitting in courtrooms and in coroners’ inquests and talking to people who have had to be there because of the harms they or their families have encountered online. The Government are firmly of the view that it is indisputable that a platform’s functionalities, features or wider design are often the single biggest factor in determining whether a child will suffer harm. The Bill makes it clear that functions, features and design play a key role in the risk of harm occurring to a child online; I draw noble Lords’ attention to Clause 11(5), which makes it clear that the child safety duties apply across all areas of a service, including the way it is designed, operated and used, as well as content present on the service. That makes a distinction between the design, operation and use, and the content.

In addition, the Bill’s online safety objectives include that regulated services should be designed and operated so as to protect from harm people in the United Kingdom who are users of the service, including with regard to algorithms used by the service, functionalities of the services and other features relating to the operation of the service. There is no reference to content in this section, again underlining that the Bill draws a distinction.

This ensures that the role of functionalities is properly accounted for in the obligations on providers and the regulator, but I accept that noble Lords want this to be set out more clearly. Our primary aim must be to ensure that the regulatory framework can operate as intended, so that it can protect children in the way that they deserve and which we all want to see. Therefore, we cannot accept solutions that, however well meaning, may inadvertently weaken the Bill’s framework or allow providers to exploit legal uncertainty to evade their duties. We have come back to that point repeatedly in our discussions.

--- Later in debate ---
Moved by
37: Clause 11, page 10, line 42, leave out “(for example, by using age verification)”
Member’s explanatory statement
This amendment is consequential on the next amendment of Clause 11 in my name.
--- Later in debate ---
Moved by
38: Clause 11, page 10, line 46, at end insert—
“(3A) The duty set out in subsection (3)(a) requires a provider to use age verification or age estimation (or both) to prevent children of any age from encountering primary priority content that is harmful to children which the provider identifies on the service.
(3B) That requirement applies to a provider in relation to a particular kind of primary priority content that is harmful to children in every case except where—
(a) a term of service indicates (in whatever words) that the presence of that kind of primary priority content that is harmful to children is prohibited on the service, and
(b) that policy applies in relation to all users of the service.
(3C) If a provider is required by subsection (3A) to use age verification or age estimation for the purpose of compliance with the duty set out in subsection (3)(a), the age verification or age estimation must be of such a kind, and used in such a way, that it is highly effective at correctly determining whether or not a particular user is a child.”
Member’s explanatory statement
This amendment requires providers of user-to-user services to use age verification or age estimation to prevent children from encountering identified primary priority content that is harmful to children, unless the terms of service indicate that that kind of content is prohibited; and where that requirement applies, new subsection (3C) provides that the age verification or age estimation must be highly effective.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I beg to move.

Amendment 39 (to Amendment 38)

Moved by
--- Later in debate ---
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, as we have heard, this is a small group of amendments concerned with preventing size and lack of capacity being used as a reasonable excuse for allowing children to be unsafe. Part of the problem is the complexity of the Bill and the way it has been put together.

For example, Clause 11, around user-to-user services, is the pertinent clause and it is headed “Safety duties protecting children”. Clause 11(2) is preceded in italics with the wording “All services” so anyone reading it would think that what follows applies to all user-to-user services regardless of size. Clause 11(3) imposes a duty on providers

“to operate a service using proportionate systems and processes”

to protect children from harm. That implies that there will be judgment around what different providers can be expected to do to protect children; for example, by not having to use a particular unaffordable technical solution on age assurance if they can show the right outcome by doing things differently. That starts to fudge things a little.

The noble Lord, Lord Bethell, who introduced this debate so well with Amendment 39, supported by my noble friend Lady Ritchie, wants to be really sure that the size of the provider can never be used to argue that preventing all children from accessing porn is disproportionate and that a few children slipping through the net might just be okay.

The clarity of Clause 11 unravels even further at the end of the clause, where in subsection (12)(b) it reads that

“the size and capacity of the provider of a service”

is relevant

“in determining what is proportionate”.

The clause starts to fall apart at that point quite thoroughly in terms of anyone reading it being clear about what is supposed to happen.

Amendment 43 seeks to take that paragraph out, as we have heard from the noble Lord, Lord Russell, and would do the same for search in Amendment 87. I have added my name to these amendments because I fear that the ambiguity in the wording of this clause will give small and niche platforms an easy get out from ensuring that children are safe by design.

I use the phrase “by design” deliberately. We need to make a choice with this Bill even at this late stage. Is the starting point in the Bill children’s safety by design? Or is the starting point one where we do not want to overly disrupt the way providers operate their business first—which is to an extent how the speech from the noble Lord, Lord Allan, may have been heard—and then overlay children’s safety on top of that?

Yesterday, I was reading about how children access inappropriate and pornographic content, not just on Twitter, Instagram, Snapchat, TikTok and Pinterest but on Spotify and “Grand Theft Auto”—the latter being a game with an age advisory of “over 17” but which is routinely played by teenaged children. Wherever we tolerate children being online, there are dangers which must be tackled. Listening to the noble Baroness, Lady Harding, took me to where a big chunk of my day job in education goes—children’s safeguarding. I regularly have to take training in safeguarding because of the governance responsibilities that I have. Individual childminders looking after one or two children have an assessment and an inspection around their safeguarding. In the real world we do not tolerate a lack of safety for children in this context. We should not tolerate it in the online world either.

The speech from the noble Lord, Lord Russell, reminded me of the breadcrumbing from big platforms into niche platforms that is part of that incel insight that he referenced. Content that is harmful to children can also be what some children are looking for, which keeps them engaged. Small, emergent services aggressively seeking growth could set algorithms accordingly. They must not be allowed to believe that engaging but harmful content is acceptable until they reach the size at which they can afford the age-assurance technology which we might envisage in the Bill. I hope that the Minister shares our concerns and can help us with this problem.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, short debates can be helpful and useful. I am grateful to noble Lords who have spoken on this group.

I will start with Amendment 39, tabled by my noble friend Lord Bethell. Under the new duty at Clause 11(3)(a), providers which allow pornography or other forms of primary priority content under their terms of service will need to use highly effective age verification or age estimation to prevent children encountering it where they identify such content on their service, regardless of their size or capacity. While the size and capacity of providers is included as part of a consideration of proportionality, this does not mean that smaller providers or those with less capacity can evade the strengthened new duty to protect children from online pornography. In response to the questions raised by the noble Baronesses, Lady Ritchie of Downpatrick and Lady Kidron, and others, no matter how much pornographic content is on a service, where providers do not prohibit this content they would still need to meet the strengthened duty to use age verification or age estimation.

Proportionality remains relevant for the purposes of providers in scope of the new duty at Clause 11(3)(a) only in terms of the age-verification or age-estimation measures that they choose to use. A smaller provider with less capacity may choose to go for a less costly but still highly effective measure. For instance, a smaller provider with less capacity might seek a third-party solution, whereas a larger provider with greater capacity might develop their own solution. Any measures that providers use will need to meet the new high bar of being “highly effective”. If a provider does not comply with the new duties and fails to use measures which are highly effective at correctly determining whether or not a particular user is a child, Ofcom can take tough enforcement action.

The other amendments in this group seek to remove references to the size and capacity of providers in provisions relating to proportionality. The principle of proportionate, risk-based regulation is fundamental to the Bill’s regulatory framework, and we consider that the Bill as drafted already strikes the correct balance. The Bill ultimately will regulate a large number of services, ranging from some of the biggest companies in the world to smaller, voluntary organisations, as we discussed in our earlier debate on exemptions for public interest services.

The provisions regarding size and capacity recognise that what it is proportionate to require of companies of various sizes and business models will be different. Removing this provision would risk setting a lowest common denominator standard which does not create incentives for larger technology companies to do more to protect their users than smaller organisations. For example, it would not be proportionate for a large multinational company which employs thousands of content moderators and which invests in significant safety technologies to argue that it is required to take only the same steps to comply as a smaller provider which might have only a handful of employees and a few thousand UK users.

While the size and capacity of providers is included as part of a consideration of proportionality, let me be clear that this does not mean that smaller providers or those with less capacity do not need to meet the child safety duties and other duties in the Bill, such as the illegal content safety duties. These duties set out clear requirements for providers. If providers do not meet these duties, they will face enforcement action.

I hope that is reassuring to my noble friend Lord Bethell and to the other noble Lords with amendments in this group. I urge my noble friend to withdraw his amendment.

Lord Bethell Portrait Lord Bethell (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I thank my noble friend the Minister for that reassurance. He put the points extremely well. I very much welcome his words from the Dispatch Box, which go a long way towards clarifying and reassuring.

This was a short and perfectly formed debate. I will not go on a tour d’horizon of everyone who has spoken but I will mention the noble Lord, Lord Allan of Hallam. He is entirely right that no one wants gratuitously to hound out businesses from the UK that contribute to the economy and to our life here. There are good regulatory principles that should be applied by all regulators. The five regulatory principles of accountability, transparency, targeting, consistency and proportionality are all in the Legislative and Regulatory Reform Act 2006. Ofcom will embrace them and abide by them. That kind of reassurance is important to businesses as they approach the new regulatory regime.

I take on board what my noble friend the Minister said in terms of the application of regulations regardless of size or capacity, and the application of these strengthened duties, such as “highly effective”, regardless of any economic or financial capacity. I feel enormously reassured by what he has said. I beg leave to withdraw my amendment.

--- Later in debate ---
Moved by
41: Clause 11, page 11, line 1, leave out from beginning to “may” in line 2 and insert “Age verification or age estimation to identify who is or is not a child user or which age group a child user is in are examples of measures which (if not required by subsection (3A))”
Member’s explanatory statement
This amendment refers to age verification and age estimation as mentioned in the preceding amendment in my name, and clarifies the relationship between Clause 11(4) and new subsection (3A) of Clause 11 inserted by that amendment.
--- Later in debate ---
Moved by
44: Clause 11, page 12, line 12, leave out “this section” and insert “section 11”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 11 into two Clauses.
--- Later in debate ---
Moved by
47: Clause 11, page 12, line 21, leave out “subsections (3)” and insert “section 11(3)”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 11 into two Clauses.
--- Later in debate ---
Moved by
53: After Clause 11, insert the following new Clause—
“Assessment duties: user empowerment
(1) This section sets out the duties about assessments related to adult user empowerment which apply in relation to Category 1 services (in addition to the duties about risk assessments set out in section 8 and, in the case of Category 1 services likely to be accessed by children, section 10).
(2) A duty to carry out a suitable and sufficient assessment for the purposes of section 12(2) at a time set out in, or as provided by, Schedule 3.
(3) A duty to take appropriate steps to keep such an assessment up to date.
(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient assessment for the purposes of section 12(2) relating to the impacts of that proposed change.
(5) An assessment of a service “for the purposes of section 12(2)” means an assessment of the following matters—
(a) the user base;
(b) the incidence of relevant content on the service;
(c) the likelihood of adult users of the service encountering, by means of the service, each kind of relevant content (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;
(d) the likelihood of adult users with a certain characteristic or who are members of a certain group encountering relevant content which particularly affects them;
(e) the likelihood of functionalities of the service facilitating the presence or dissemination of relevant content, identifying and assessing those functionalities more likely to do so;
(f) the different ways in which the service is used, and the impact of such use on the likelihood of adult users encountering relevant content;
(g) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to strengthen adult users’ control over their interaction with user-generated content, and other systems and processes) may reduce or increase the likelihood of adult users encountering relevant content.
(6) In this section “relevant content” means content to which section 12(2) applies (content to which user empowerment duties set out in that provision apply).
(7) See also—
(a) section 19(8A) and (9) (records of assessments), and
(b) Schedule 3 (timing of providers’ assessments).”
Member’s explanatory statement
This amendment requires providers of Category 1 services to carry out and update as necessary an assessment about how likely it is that adult users will encounter content to which Clause 12(2) applies (suicide and self-harm content and so on - see Clause 12(10), (11) and (12)).
--- Later in debate ---
Moved by
57: Clause 12, page 13, line 9, after “(2)” insert “(“control features”)”
Member’s explanatory statement
This amendment is a technical drafting change related to the next amendment in my name.
--- Later in debate ---
Moved by
60: Clause 12, page 13, line 10, at end insert—
“(4A) A duty to operate a service using a system or process which seeks to ensure that all registered adult users are offered the earliest possible opportunity, in relation to each control feature included in the service, to take a step indicating to the provider that—
(a) the user wishes to retain the default setting for the feature (whether that is that the feature is in use or applied, or is not in use or applied), or
(b) the user wishes to change the default setting for the feature.
(4B) The duty set out in subsection (4A)—
(a) continues to apply in relation to a user and a control feature for so long as the user has not yet taken a step mentioned in that subsection in relation to the feature;
(b) no longer applies in relation to a user once the user has taken such a step in relation to every control feature included in the service.”
Member’s explanatory statement
This amendment imposes a new duty on providers of Category 1 services to proactively ask all registered adult users whether they wish to opt in or opt out of any features offered in compliance with the duty in subsection (2), until a choice is made.
--- Later in debate ---
Moved by
65: Clause 12, page 13, line 24, leave out “subsection (2)” and insert “section 12(2)”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 12 into two Clauses.
--- Later in debate ---
Moved by
74: Clause 16, page 19, line 26, leave out from “if” to “the” in line 28 and insert “age verification or age estimation is used on the service with”
Member’s explanatory statement
This amendment provides that a provider can only conclude that children cannot access a service if age verification or age estimation is used on the service with the result that children are not normally able to access it.
--- Later in debate ---
Moved by
75: Clause 17, page 21, line 2, leave out “11(3)” and insert “11(2) or (3)”
Member’s explanatory statement
This amendment is about complaints of content being blocked because of an incorrect assessment of a user’s age. A reference to Clause 11(2) is inserted, as the duty in that provision can also be complied with by using age verification or age estimation.
--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

It is always nice to be nice to the Minister.

I will reference, briefly, the introduction of the amendments in the name of the noble Baroness, Lady Fraser of Craigmaddie, which I signed. They were introduced extremely competently, as you would expect, by my noble and learned kinsman Lord Hope. It is important to get the right words in the right place in Bills such as this. He is absolutely right to point out the need to be sure that we are talking about the right thing when we say “freedom of expression”—that we do mean that and not “freedom of speech”; we should not get them mixed up—and, also, to have a consistent definition that can be referred to, because so much depends on it. Indeed, this group might have run better and more fluently if we had started with this amendment, which would have then led into the speeches from those who had the other amendments in the group.

The noble Baroness is not present today, not for bad news but for good: her daughter is graduating and she wanted to be present at that; it is only right that she should do so. She will be back to pick up other aspects of the devolution issues she has been following very closely, and I will support her at that time.

The debate on freedom of expression was extremely interesting. It raised issues that, perhaps, could have featured more fully had this been timetabled differently, as both noble Lords who introduced amendments on this subject said. I will get my retaliation in first: a lot of what has been asked for will have been done. I am sure that the Minister will say that, if you look at the amendment to Clause 1, the requirement there is that freedom of expression is given priority in the overall approach to the Bill, and therefore, to a large extent, the requirement to replace that at various parts of the Bill may not be necessary. But I will leave him to expand on that; I am sure that he will.

Other than that, the tension I referred to in an earlier discussion, in relation to what we are made to believe about the internet and the social media companies, is that we are seeing a true public square, in which expressions and opinions can be exchanged as freely and openly as they would be in a public space in the real world. But, of course, neither of those places really exists, and no one can take the analogy further than has been done already.

The change, which was picked up by the noble Baroness, Lady Stowell, in relation to losing “legal but harmful”, has precipitated an issue which will be left to social media companies to organise and police—I should have put “policing” in quotation marks. As the noble Baroness, Lady Kidron, said, the remedy for much of this will be an appeals mechanism that works both at the company level and for the issues that need rebalancing in relation to complexity or because they are not being dealt with properly. We will not know that for a couple of years, but at least that has been provided for and we can look forward to it. I look forward to the Minister’s response.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, I hope that the noble Baroness, Lady Fox, and my noble friend Lord Moylan do feel that they have been listened to. It was striking, in this debate, that they had support from all corners of your Lordships’ House. I know that, at various points in Committee, they may have felt that they were in a minority, but they have been a very useful and welcome one. This debate shows that many of the arguments that they have made throughout the passage of the Bill have resonated with noble Lords from across the House.

Although I have not signed amendments in the names of the noble Baroness and my noble friend Lord Moylan, in many cases it is not because I disagree with them but because I think that what they do is already covered in the Bill. I hope to reassure them of that in what I say now.

Amendments 77 to 81 from the noble Baroness, Lady Fox, would require services to have particular regard to freedom of expression and privacy when deciding on their terms of service. Services will already need to have particular regard to users’ rights when deciding on safety systems to fulfil their duties. These requirements will be reflected in providers’ terms of service, as a result of providers’ duties to set out their safety measures in their terms of service. The framework will also include a range of measures to allow scrutiny of the formulation, clarity and implementation of category 1 providers’ own terms of service.

However, there are some points on which we disagree. For instance, we do not think that it would be appropriate for all providers to have a general duty to have a particular regard to freedom of expression when deciding on their own terms of service about content. We believe that the Bill achieves the right balance. It requires providers to have regard to freedom of expression when carrying out their safety duties, and it enables public scrutiny of terms of service, while recognising providers’ own freedom of expression rights as private entities to set the terms of service that they want. It is of course up to adults to decide which services to use based on the way those services are drawn up and the way the terms of service set out what is permissible in them.

Nothing in the Bill restricts service providers’ ability to set their own terms and conditions for legal content accessed by adults—that is worth stressing. Ofcom will not set platforms’ terms and conditions, nor will it take decisions on whether individual pieces of content should, or should not, be on a platform. Rather, it will ensure that platforms set clear terms and conditions, so that adults know what to expect online, and ensure that platforms have systems and processes in place to enforce those terms and conditions themselves.

Amendment 226 from the noble Baroness, Lady Fox, would require providers to use all relevant information that is reasonably available to them whenever they make judgments about content under their terms of service—that is, where they have included or drafted those terms of service in compliance with duties in the Bill. Her amendment would amend an existing requirement in Clause 173, which already requires providers to take this approach whenever they implement a system or process to comply and that system makes judgments about certain content. For example, Clause 173 already covers content judgments made via systems and processes that a category 1 provider implements to fulfil its Clause 65 duties to enforce its own terms of service consistently. So we feel that Clause 173 is already broad enough to achieve the objectives that the noble Baroness, Lady Fox, seeks.

My noble friend Lord Moylan’s amendments seek to require Ofcom to have special regard to the importance of protecting freedom of expression when exercising its enforcement duties and when drafting codes or guidance. As we discussed in Committee, Ofcom has existing obligations to protect freedom of expression, and the Bill will include additional measures in this regard. We are also making additional amendments to underline the importance of freedom of expression. I am grateful to the noble and learned Lord, Lord Hope of Craighead, and my noble friend Lady Fraser of Craigmaddie for their work to define “freedom of expression” in the Bill. The Bill’s new overarching statement at Clause 1, as the noble Lord, Lord Stevenson, rightly pointed out, lists “freedom of expression”, signalling that it is a fundamental part of the Bill. That is a helpful addition.

Amendment 188 in the name of the noble Baroness, Lady Fox, seeks to disapply platforms’ Clause 65 duties when platforms’ terms of service restrict lawful expression, or expression otherwise protected by Article 10 of the European Convention on Human Rights. Her amendment would mean that category 1 providers’ Clause 65 duties to enforce clear, accessible terms of service in a consistent manner would not apply to any of their terms of service, where they are making their own decisions restricting legal content. That would greatly undermine the application of these provisions in the Bill.

Article 10 of the European Convention on Human Rights concerns individuals’ and entities’ rights to receive and impart ideas without undue interference by public authorities, not private entities. As such, it is not clear how a service provider deciding not to allow a certain type of content on its platform would engage the Article 10 rights of a user.

Beyond the legal obligations regarding the treatment of certain kinds of user-generated content imposed by this Bill and by other legislation, platforms are free to decide what content they wish, or do not wish, to have on their services. Provisions in the Bill will set out important duties to ensure that providers’ contractual terms on such matters are clear, accessible and consistently enforced.

--- Later in debate ---
Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

My Lords, before my noble friend sits down, perhaps I could seek a point of clarification. I think I heard him say, at the beginning of his response to this short debate, that providers will be required to have terms of service which respect users’ rights. May I ask him a very straightforward question: do those rights include the rights conferred by Article 10 of the European Convention on Human Rights? Put another way, is it possible for a provider operating in the United Kingdom to have terms and conditions that abridge the rights conferred by Article 10? If it is possible, what is the Government’s defence of that? If it is not possible, what is the mechanism by which the Bill achieves that?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

As I set out, I think my noble friend and the noble Baroness, Lady Fox, are not right to point to the European Convention on Human Rights here. That concerns individuals’ and entities’ rights

“to receive and impart ideas without undue interference”

by public authorities, not private entities. We do not see how a service provider deciding not to allow certain types of content on its platform would engage the Article 10 rights of the user, but I would be very happy to discuss this further with my noble friend and the noble Baroness in case we are talking at cross-purposes.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

On that point specifically, having worked inside one of the companies, they fear legal action under all sorts of laws, but not under the European Convention on Human Rights. As the Minister explained, it is for public bodies; if people are going to take a case on Article 10 grounds, they will be taking it against a public body. There are lots of other grounds to go after a private company but not ECHR compliance.

--- Later in debate ---
Moved by
82: Clause 19, page 23, line 30, at end insert—
“(8A) A duty to make and keep a written record, in an easily understandable form, of all aspects of every assessment under section (Assessment duties: user empowerment) (assessments related to the adult user empowerment duty set out in section 12(2)), including details about how the assessment was carried out and its findings.”
Member’s explanatory statement
This amendment requires providers of Category 1 services to keep full records of their assessments under the new Clause proposed after Clause 11 in my name.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Moved by
84: Clause 19, page 24, line 4, at end insert “, and (Disclosure of information about use of service by deceased child users) (deceased child users).”
Member’s explanatory statement
This amendment has the effect that OFCOM have a duty to review compliance by user-to-user service providers with the new duties imposed by the Clause proposed after Clause 67 in my name.
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - -

My Lords, as I set out in Committee, the Government are bringing forward a package of amendments to address the challenges that bereaved parents and coroners have faced when seeking to access data after the death of a child.

These amendments have been developed after consultation with those who, so sadly, have first-hand experience of these challenges. I thank in particular the families of Breck Bednar, Sophie Parkinson, Molly Russell, Olly Stephens and Frankie Thomas for raising awareness of the challenges they have faced when seeking access to information following the heartbreaking cases involving their children. I am also grateful to the noble Baroness, Lady Kidron, for championing this issue in Parliament and more widely. I am very happy to say that she is supporting the government amendments in this group.

The loss of any life is heartbreaking, but especially so when it involves a child. These amendments will create a more straightforward and humane process for accessing data and will help to ensure that parents and coroners receive the answers they need in cases where a child’s death may be related to online harms. We know that coroners have faced challenges in accessing relevant data from online service providers, including information about a specific child’s online activity, where that might be relevant to an investigation or inquest. It is important that coroners can access such information.

As such, I turn first to Amendments 246, 247, 249, 250, 282, 283 and 287, which give Ofcom an express power to require information from regulated services about a deceased child’s online activity following a request from a coroner. This includes the content the child had viewed or with which he or she had engaged, how the content came to be encountered by the child, the role that algorithms and other functionalities played, and the method of interaction. It also covers any content that the child generated, uploaded or shared on the service.

Crucially, this power is backed up by Ofcom’s existing enforcement powers, so that, where a company refuses to provide information requested by Ofcom, companies may be subject to enforcement action, including senior management liability. To ensure that there are no barriers to Ofcom sharing information with coroners, first, Amendment 254 enables Ofcom to share information with a coroner without the prior consent of a business to disclose such information. This will ensure that Ofcom is free to provide information it collects under its existing online safety functions to coroners, as well as information requested specifically on behalf of a coroner, where that might be useful in determining whether social media played a part in a child’s death.

Secondly, coroners must have access to online safety expertise, given the technical and fast-moving nature of the industry. As such, Amendment 273 gives Ofcom a power to produce a report dealing with matters relevant to an investigation or inquest, following a request from a coroner. This may include, for example, information about a company’s systems and processes, including how algorithms have promoted specific content to a child. To this end, the Chief Coroner’s office will consider issuing non-statutory guidance and training for coroners about social media as appropriate, subject to the prioritisation of resources. We are confident that this well-established framework provides an effective means to provide coroners with training on online safety issues.

It is also important that we address the lack of transparency from large social media services about their approach to data disclosure. Currently, there is no common approach to this issue, with some services offering memorialisation or contact-nomination processes, while others seemingly lack any formal policy. To tackle this, a number of amendments in this group will require the largest services—category 1, 2A and 2B services—to set out policies relating to the disclosure of data regarding the online activities of a deceased child in a clear, accessible and sufficiently detailed format in their terms of service. These companies will also be required to provide a written response to data requests in a timely manner and must provide a dedicated helpline, or similar means, for parents to communicate with the company, in order to streamline the process. This will address the painful radio silence experienced by many bereaved parents. The companies must also offer options so that parents can complain when they consider that a platform is not meeting its obligations. These must be easy to access, easy to use and transparent.

The package of amendments will apply not only to coroners in England and Wales but also to coroners in Northern Ireland and to equivalent investigations in Scotland, where similar sad events have occurred.

The Government will also address other barriers which are beyond the scope of this Bill. For example, we will explore measures to introduce data rights for bereaved parents who wish to request information about their deceased children through the Data Protection and Digital Information Bill. We are also working, as I said in Committee, with our American counterparts to clarify and, where necessary, address unintended barriers to information sharing created by the United States Stored Communications Act. I beg to move.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I thank the Minister and indeed the Secretary of State for bringing forward these amendments in the fulsome manner that they have. I appreciate it, but I know that Bereaved Families for Online Safety also appreciates it. The Government committed to bringing forward these amendments on the last day in Committee, so they have been pre-emptively welcomed and discussed at some length. One need only read through Hansard of 22 June to understand the strength of feeling about the pain that has been caused to families and the urgent need to prevent others experiencing the horror faced by families already dealing with the loss of their child.

I will speak briefly on three matters only. First, I must once again thank bereaved families and colleagues in this House and in the other place for their tireless work in pressing this issue. This is one of those issues that does not allow for celebration. As I walked from the Chamber on 22 June, I asked one of the parents how they felt. They said: “It is too late for me”. It was not said in bitterness but in acknowledgement of their profound hurt and the failure of companies voluntarily to do what is obvious, moral and humane. I ask the Government to see the sense in the other amendments that noble Lords brought forward on Report to make children safer, and to find the same pragmatic, thoughtful solution to those as they have found for this group of amendments. It makes a huge difference.

Secondly, I need to highlight just one gap; I have written to the Secretary of State and the Minister on this. I find it disappointing that the Government did not find a way to require senior management to attend an inquest to give evidence. Given that the Government have agreed that senior managers should be subject to criminal liability under some circumstances, I do not understand their objections to summoning them to co-operate with legal proceedings. If a company submits information in response to Ofcom and at the coroner’s request the company’s senior management is invited to attend the inquest, it makes sense that someone should be required to appear to answer and follow up those questions. Again, on behalf of the bereaved families and specifically their legal representatives, who are very clear on the importance of this part of the regime, I ask the Government to reconsider this point and ask the Minister to undertake to speak to the department and the MoJ, if necessary, to make sure that, if senior managers are asked to attend court, they are mandated to do so.

Thirdly, I will touch on the additional commitments the Minister made beyond the Bill, the first of which is the upcoming Data Protection and Digital Information Bill. I am glad to report that some of the officials working on the Bill have already reached out, so I am grateful to the Minister that this is in train, but I expect it to include guidance for companies that will, at a minimum, cover data preservation orders and guidance about the privacy of other users in cases where a child has died. I think that privacy for other users is central to this being a good outcome for everybody, and I hope we are able to include that.

I am pleased to hear about the undertaking with the US regarding potential barriers, and I believe—and I would love to hear from the Minister—that the objective is to make a bilateral agreement that would allow data to be shared between the two countries in the case of a child’s death. It is a very specific requirement, not a wide-ranging one. I believe that, if we can do it on a bilateral basis, it would be easier than a broad attempt to change the data storage Act.

I turn finally to training for coroners. I was delighted that the Chief Coroner made a commitment to consider issuing non-legislative guidance and training on social media for coroners, and by the offer of consultation with experts, including Ofcom, the ICO and bereaved families and their representatives, but this commitment was made subject to funding. I ask the Minister to agree to discuss routes to funding from the levy via Ofcom’s digital literacy duty. I have proposed an amendment to the government amendment that would make that happen, but I would welcome the opportunity to discuss it with the Minister. Coroners must feel confident in their understanding of the digital world, and I am concerned that giving coroners this new route to regulated companies via Ofcom, without giving them training in how to use it, may create a spectre of failure or further frustration and distress for bereaved families. I know there is not a person in the House who would want that to be the outcome of these welcome government amendments.

--- Later in debate ---
Again, I repeat my thanks to all across the House who have worked so hard to get substantial progress on this key issue.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, I am grateful for the recognition of the work that has been done here, led by the noble Baroness, Lady Kidron, but involving many others, including officials who have worked to bring this package forward.

Noble Lords took the opportunity to ask a number of questions. The noble Baroness, Lady Kidron, asked about senior management liability. Ofcom will have extensive enforcement powers at its disposal if service providers do not comply with its information requests issued on behalf of a coroner. The powers will include the ability to hold senior managers criminally liable for non-compliance. Those powers are in line with Ofcom’s existing information-gathering powers in the Bill. Where Ofcom has issued an information request to a company, that company may be required to name a senior manager who is responsible for ensuring compliance with the requirements of the notice. If the named senior manager is found to have failed to comply with that information notice, or has failed to take all reasonable steps to prevent a failure to comply with the notice, that individual will be held personally liable and could be subject to imprisonment.

On the point about them not appearing in court, coroners have well-established powers to require senior managers to attend court. The enforcement powers available to Ofcom are in line with Ofcom’s existing information-gathering powers in the Bill. They do not extend to Ofcom requiring senior managers to appear in court as part of a coronial investigation. We do not think that would be appropriate for Ofcom, given that the coroner’s existing remit already covers this. The noble Baroness raised many specific instances that had come to her attention, and if she has specific examples of people not attending court that she would like to share with us and the Ministry of Justice, of course we would gladly follow those up.

The noble Lord, Lord Knight, rightly mentioned my noble friend Lady Newlove. I can reassure him that I have discussed this package of amendments with her, and had the benefit of her experience as a former Victims’ Commissioner.

On the training for coroners, which is an issue she raised, as did the noble Baroness, Lady Kidron, in her remarks just now, the Chief Coroner for England and Wales has statutory responsibility for maintaining appropriate arrangements for the training of coroners. That is of course independent of government, and exercised through the Judicial College, but the training is mandatory and the Chief Coroner is aware of the issues we are debating now.

The noble Lords, Lord Allan of Hallam and Lord Knight of Weymouth, raised the helpline for parents. Yes, we expect that our approach of requiring a dedicated helpline or similar means will involve a human. As we say, we want a more humane process for those who need to use it, and we think it would be more effective than requiring a company to provide a named individual contact. We touched on this briefly in Committee, where the point was understandably raised about staff turnover and people being absent on leave—that a requirement for a named individual could hinder the contact which families need.

The noble Lord, Lord Allan, also asked some questions about deaths of people other than a child. First, Ofcom’s report in connection with investigations into a death covers any coronial inquest, not just children. More broadly, of course, social media companies may have their own terms and conditions or policies in place setting out when they will share information after somebody has passed away. Companies based outside the UK may have to follow the laws of the jurisdiction in which they are based, which may limit the sharing of data without a court order. While we recognise the difficulty that refusing to disclose data may cause for bereaved relatives in other circumstances, the right to access must, of course, be balanced with the right to privacy. Some adult social media users may be concerned, for instance, about the thought of family members having access to information about their private life after their deaths, so there is a complexity here, as I know the noble Lord understands.

The noble Baroness, Lady Kidron, asked about data preservation orders. I am very glad that officials from another Bill team are already in touch with her, as they should be. As we set out in Committee, we are aware of the importance of data preservation to coroners and bereaved parents, and the Government agree with the principle of ensuring that those data are preserved. We will work towards a solution through the Data Protection and Digital Information Bill. My noble friend Lord Camrose—who is unable to be with us today, also for graduation reasons—and I will be happy to keep the House and all interested parties updated about our progress in resolving the issue of data preservation as we work through this complex problem.

The noble Lord, Lord Clement-Jones, asked about the Information Commissioner’s Office. We expect Ofcom to consult the ICO on all the guidance where its expertise will be relevant, including on providers’ new duties under these amendments. I am grateful, as I say, for the support that they have had and the recognition that this has been a long process since these issues were first raised in the pre-legislative committee. We believe that it is of the utmost importance that coroners and families can access information about a child’s internet use following a bereavement, and that companies’ responses are made in a humane and transparent way.

This group of amendments should be seen alongside the wider protections for children in the Bill, and I hope they will help bereaved parents to get the closure that they deserve. The noble Lord, Lord Allan, was right to pay tribute to how these parents, who have campaigned so bravely, have turned their grief and frustration into a determination to make sure that no other parents go through the sorts of ordeals that they have. That is both humbling and inspiring, and I am glad that the Bill can help to be a part of the change that they are seeking. I share my noble friend Lady Harding’s wish that it may bring them a modicum of calm. I beg to move.

Amendment 84 agreed.
--- Later in debate ---
Moved by
86: Clause 25, page 29, line 28, leave out “this section” and insert “section 25”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 25 into two Clauses.
--- Later in debate ---
Moved by
88: Clause 25, page 29, line 34, leave out “this section” and insert “section 25”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 25 into two Clauses.
--- Later in debate ---
Moved by
91: Clause 25, page 29, line 42, leave out “subsection (3)” and insert “section 25(3)”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 25 into two Clauses.
--- Later in debate ---
Moved by
97: Clause 27, page 32, line 2, leave out “25(3)” and insert “25(2) or (3)”
Member’s explanatory statement
This amendment is about complaints of content being blocked because of an incorrect assessment of a user’s age. A reference to Clause 25(2) is inserted, as the duty in that provision can also be complied with by using age verification or age estimation.
--- Later in debate ---
Moved by
98: Clause 29, page 33, line 41, at end insert “,
and for the purposes of subsection (6), also includes the duties set out in section (Disclosure of information about use of service by deceased child users) (deceased child users).”
Member’s explanatory statement
This amendment has the effect that OFCOM have a duty to review compliance by search service providers with the new duties imposed by the Clause proposed after Clause 67 in my name.
--- Later in debate ---
Moved by
99: Clause 30, page 34, line 12, leave out from “if” to “the” in line 13 and insert “age verification or age estimation is used on the service with”
Member’s explanatory statement
This amendment provides that a provider can only conclude that children cannot access a service if age verification or age estimation is used on the service with the result that children are not normally able to access it.
--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, very briefly, I commend these two amendments. Again, the provenance is very clear; the Joint Committee said:

“This regulatory alignment would simplify compliance for businesses, whilst giving greater clarity to people who use the service, and greater protection to children.”


It suggested that the Information Commissioner’s Office and Ofcom should issue a joint statement on how these two regulatory systems will interact once the Online Safety Bill has been enacted. That still sounds eminently sensible, a year and a half later.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, Amendments 100 and 101 seek further to define the meaning of “significant” in the children’s access assessment, with the intention of aligning this with the meaning of “significant” in the Information Commissioner’s draft guidance on the age-appropriate design code.

I am grateful to the noble Baroness, Lady Kidron, for the way in which she has set out the amendments and the swiftness with which we have considered them. The test in the access assessment in the Bill is already aligned with the test in the code, which determines whether a service is likely to be accessed by children, in order to ensure consistency for all providers. The Information Commissioner’s Office has liaised with Ofcom on its new guidance on the “likely to be accessed” test for the code, with the intention of aligning the two regulatory regimes while reflecting that they seek to do different things. In turn, the Bill will require Ofcom to consult the ICO on its guidance to providers, which will further support alignment between the tests. So while we agree about the importance of alignment, we think that it is already catered for.

With regard to Amendment 100, Clause 30(4)(a) already states that

“the reference to a ‘significant’ number includes a reference to a number which is significant in proportion to the total number of United Kingdom users of a service”.

There is, therefore, already provision in the Bill for this being a significant number in and of itself.

On Amendment 101, the meaning of “significant” must already be more than insignificant by its very definition. The amendment also seeks to define “significant” with reference to the number of children using a service rather than seeking to define what is a significant number.

I hope that that provides some reassurance to the noble Baroness, Lady Kidron, and that she will be content to withdraw the amendment.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

I am not sure that, at this late hour, I completely understood what the Minister said. On the basis that we are seeking to align, I will withdraw my amendment, but can we check that we are aligned, as my speech came directly from a note from officials that showed a difference? On that basis, I am happy to withdraw.

--- Later in debate ---
Moved by
102: Clause 31, page 35, line 1, leave out from “of” to “as” in line 2 and insert “age verification or age estimation that is used on the service”
Member’s explanatory statement
This amendment is consequential on the amendment of clause 30 in my name.
--- Later in debate ---
Moved by
103: Schedule 3, page 195, line 34, at end insert—
“5A (1) In this paragraph “the relevant day”, in relation to a regulated user-to-user service, means—
(a) the first day on which the service is a Category 1 service, or
(b) the first day on which the service again becomes a Category 1 service (following a period during which the service was not a Category 1 service).
(2) If, on the relevant day, section 12(2) guidance is available, a section 12(2) assessment of the service must be completed within the period of three months beginning with that day.
(3) Sub-paragraph (4) applies if—
(a) on the relevant day, the first section 12(2) guidance has not yet been published, and
(b) immediately before the publication of that guidance, the service is still a Category 1 service.
(4) The first section 12(2) assessment of the service must be completed within the period of three months beginning with the day on which the first section 12(2) guidance is published.”
Member’s explanatory statement
This amendment and the rest of the amendments of Schedule 3 in my name provide for the timing of the first assessments under the new Clause proposed after Clause 11 in my name.
--- Later in debate ---
Moved by
124: Schedule 4, page 203, line 23, at end insert—
“Content of codes of practice: age assurance
11A (1) This paragraph is about the inclusion of age assurance in a code of practice as a measure recommended for the purpose of compliance with any of the duties set out in section 11(2) or (3) or 25(2) or (3), and sub-paragraph (2) sets out some further principles, in addition to those in paragraphs 1 and 2 (general principles) and 10(2) (freedom of expression and privacy), which are particularly relevant.
(2) In deciding whether to recommend the use of age assurance, or which kinds of age assurance to recommend, OFCOM must have regard to the following—
(a) the principle that age assurance should be effective at correctly identifying the age or age-range of users;
(b) relevant standards set out in the latest version of the code of practice under section 123 of the Data Protection Act 2018 (age-appropriate design code);
(c) the need to strike the right balance between—
(i) the levels of risk and the nature, and severity, of potential harm to children which the age assurance is designed to guard against, and
(ii) protecting the right of users and (in the case of search services or the search engine of combined services) interested persons to freedom of expression within the law;
(d) the principle that more effective kinds of age assurance should be used to deal with higher levels of risk of harm to children;
(e) the principle that age assurance should be easy to use, including by children of different ages and with different needs;
(f) the principle that age assurance should work effectively for all users regardless of their characteristics or whether they are members of a certain group;
(g) the principle of interoperability between different kinds of age assurance.
(3) In a code of practice that describes measures for the purpose of compliance with the duty set out in section 11(3)(a), OFCOM must recommend (among other things) age verification or age estimation which is of such a kind, and which is to be used in such a way, that it is highly effective at correctly determining whether or not a particular user is a child (see section 11(3C)).
(4) In deciding which kinds and uses of age verification or age estimation to recommend for the purpose of compliance with the duty set out in section 11(3)(a), OFCOM must have regard to their guidance under section 73 that gives examples of kinds and uses of age verification and age estimation that are, or are not, highly effective at correctly determining whether or not a particular user is a child.
(5) Nothing in sub-paragraph (2) is to be read as allowing OFCOM to recommend, for the purpose of compliance with the duty set out in section 11(3)(a) by providers subject to the requirement in section 11(3A), a kind or use of age verification or age estimation which does not meet the requirement to be highly effective as mentioned in section 11(3C).
(6) A code of practice that recommends the use of age assurance for the purpose of compliance with the duties set out in section 11(2) or (3) must also describe measures recommended for the purpose of compliance with the duties set out in—
(a) section 11(6), (8) and (10) (inclusion of clear information in terms of service), and
(b) section 17(2) and (3)
(see, in particular, section 17(5)(e) (complaints about age assurance)).
(7) A code of practice that recommends the use of age assurance for the purpose of compliance with the duties set out in section 25(2) or (3) must also describe measures recommended for the purpose of compliance with the duties set out in—
(a) section 25(5) and (8) (inclusion of clear information in publicly available statement), and
(b) section 27(2) and (3)
(see, in particular, section 27(5)(d) (complaints about age assurance)).
(8) A code of practice may—
(a) refer to industry or technical standards for age assurance (where they exist);
(b) elaborate on the principles mentioned in paragraphs (a) and (c) to (g) of sub-paragraph (2).
(9) In this paragraph “age assurance” means age verification or age estimation, and see in particular section (“Age verification” and “age estimation”) (4) (self-declaration of age not to be regarded as age verification or age estimation).”
Member’s explanatory statement
This amendment contains provisions which relate to OFCOM’s recommendation of age assurance in codes of practice for the purposes of Part 3 of the Bill. It includes some relevant principles and makes it clear that OFCOM must recommend highly effective age assurance in connection with the duty in Clause 11(3)(a) (preventing children from encountering primary priority content that is harmful to children).
--- Later in debate ---
Moved by
126: Schedule 4, page 204, line 10, leave out “existing”
Member’s explanatory statement
This amendment is a minor drafting change to omit a superfluous word.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Moved by
129: Clause 38, page 40, line 29, after “39” insert “(A1), (B1) or”
Member’s explanatory statement
This amendment is consequential on the amendments made to Clause 39 in my name.
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- Hansard - -

My Lords, the amendments in this group consider regulatory accountability and the roles of Ofcom, the Government and Parliament in overseeing the new framework. The proposals include altering the powers of the Secretary of State to direct Ofcom, issue guidance to Ofcom and set strategic priorities. Ofcom’s operational independence is key to the success of this framework, but the regime must ensure that there is an appropriate level of accountability to government. Parliament will also have important functions, in particular scrutinising and approving the codes of practice which set out how platforms can comply with their duties and providing oversight of the Government’s powers.

I heard the strength of feeling expressed in Committee that the Bill’s existing provisions did not get this balance quite right and have tabled amendments to address this. Amendments 129, 134 to 138, 142, 143, 146 and 147 make three important changes to the power for the Secretary of State to direct Ofcom to modify a draft code of practice. First, these amendments replace the public policy wording in Clause 39(1)(a) with a more defined list of reasons for which the Secretary of State can make a direction. This list comprises: national security, public safety, public health and the UK’s international obligations. This is similar to the list set out in a Written Ministerial Statement made last July but omits “economic policy” and “burden to business”.

This closely aligns the reasons in the Bill with the existing power in Section 5 of the Communications Act 2003. The power is limited to those areas genuinely beyond Ofcom’s remit as a regulator and where the Secretary of State might have access to information or expertise that the regulator does not. Secondly, the amendments clarify that the power will be used only for exceptional reasons. As noble Lords know, this has always been our intent and the changes we are tabling today put this beyond doubt. Thirdly, the amendments increase the transparency regarding the use of the power by requiring the Secretary of State to publish details of a direction at the time the power is used. This will ensure that Parliament has advance sight of modifications to a code and I hope will address concerns that several directions could be made on a single code before Parliament became aware.

This group also considers Amendments 131 to 133, which create an 18-month statutory deadline for Ofcom to submit draft codes of practice to the Secretary of State to be laid in Parliament relating to illegal content, safety duties protecting children and other cross-cutting duties. These amendments sit alongside Amendment 230, which we debated on Monday and which introduced the same deadline for Ofcom’s guidance on Part 5 of the regime.

I am particularly grateful to my noble friend Lady Stowell of Beeston, with whom I have had the opportunity to discuss these amendments in some detail as they follow up points that she and the members of her committee gave particular attention to. I beg to move.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I will speak to the amendments in this group in my name: Amendments 139, 140, 144 and 145. I thank the noble Lords, Lord Stevenson and Lord Clement-Jones, and the noble Viscount, Lord Colville, for signing those amendments and for their continued support on this group. I am also grateful to my noble friend the Minister and his team for engaging with me on the issue of Secretary of State powers. He has devoted a lot of time and energy to this, which is reflected in the wide-ranging group of amendments tabled by him.

Before I go any further, it is worth emphasising that the underlying concern here is making sure that we have confidence, through this new regulation regime, that the Bill strikes the right balance of power between government, Parliament, the regulator and big tech firms. The committee that I chair—the Communications and Digital Select Committee of your Lordships’ House—has most focused on that in our consideration of the Bill. I should say also that the amendments I have brought forward in my name very much have the support of the committee as well.

These amendments relate to Clause 39, which is where the main issue lies in the context of Secretary of State powers, and we have three broad concerns. First, as it stood, the Bill handed the Secretary of State unprecedented powers to direct the regulator on pretty much anything. Secondly, these powers allowed the Government to conduct an infinite form of ping-pong with the regulator, enabling the Government to prevail in a dispute. Thirdly, this ping-pong could take place in private with no possibility of parliamentary oversight or being able to intervene, as would be appropriate in the event of a breakdown in the relationship between executive and regulator.

This matters because the Online Safety Bill creates a novel form for regulating the internet and what we can or cannot see online, in particular political speech, and it applies to the future. It is one thing for the current Government, who I support, to say that they would never use the powers in this way. That is great but, as we know, current Governments cannot speak for whoever is in power in the generations to come, so it is important that we get this right.

As my noble friend said, he has brought forward amendments to Clause 39 that help to address this. I support him in and commend him for that. The original laundry list of powers to direct Ofcom has been shortened and now follows the precedent set out in the Communications Act 2003. The government amendments also say that the Secretary of State must now publish their directions to Ofcom, which will improve transparency, and once the code is agreed Ofcom will publish changes so that Parliament can see what changes have been made and why. These are all very welcome and, as I say, they go a long way to addressing some of our concerns, but two critical issues remain.

First, the Government retain an opt-out, which means that they do not have to publish their directions if the Secretary of State believes that doing so would risk

“national security or public safety”,

or international relations. However, those points are now the precise grounds on which the Secretary of State may issue a direction and, if history is any guide, there is a real risk that we will never hear about the directions because the Government have decided that they are a security issue.

My Amendments 139 and 140 would require the Secretary of State to at least notify Parliament of the fact that a direction has been issued and what broad topic it relates to. That would not require any details to be published, so it does not compromise security, but it does give assurance that infinite, secretive ping-pong is not happening behind the scenes. My noble friend spoke so quickly at the beginning that I was not quite sure whether he signalled anything, but I hope that he may be able to respond enthusiastically to Amendments 139 and 140.

Secondly, the Government still have powers for infinite ping-pong. I appreciate that the Government have reservations about capping the number of exchanges between the Secretary of State and Ofcom, but they must also recognise the concern that they appear to be preparing the ground for any future Government to reject infinitely the regulator’s proposals and therefore prevail in a dispute about a politically contentious topic. My Amendments 144 and 145 would clarify that the Government will have a legally binding expectation that they will use no more than the bare minimum number of directions to achieve the intent set out in their first direction.

The Government might think that adding this to the Bill is superfluous, but it is necessary in order to give Parliament and the public confidence about the balance of power in this regime. If Parliament felt that the Secretary of State was acting inappropriately, we would have sufficient grounds to intervene. As I said, the Government acknowledged in our discussions the policy substance of these concerns, and as we heard from my noble friend the Minister in introducing this group, there is an understanding on this. For his part, there is perhaps a belief that what they have done goes far enough. I urge him to reconsider Amendments 144 and 145, and I hope that, when he responds to the debate on this group, he can say something about not only Amendments 139 and 140 but the other two amendments that will give me some grounds for comfort.

--- Later in debate ---
Baroness Merron Portrait Baroness Merron (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, first, I have to say that, having read Hansard from last Thursday, I feel I should have drawn attention to my interests in the register that relate to the Jewish community. I apologise for not doing so at the time and am pleased to now put this on the record.

I will be brief, as noble Lords have already raised a number of very pertinent points, to which I know the Minister will want to respond. In this group of amendments, there is a very welcome focus on transparency, accountability and the role of Parliament, all of which are absolutely crucial to the success of the Bill. I am grateful to the Minister for his introduction and explanation of the impact of the proposed changes to the role of the Secretary of State and Ofcom, whose codes of practice will be, as the noble Viscount, Lord Colville, said, vitally important to the Bill. We very much welcome the amendments in the name of the noble Baroness, Lady Stowell, which identify the requirements of the Secretary of State. We also welcome the government amendments, which, along with the amendments by the noble Baroness, have been signed by my noble friend Lord Stevenson.

The amendments tabled in the name of the noble Lord, Lord Moylan, raise interesting points about the requirement to use the affirmative procedure, among other points. I look forward to the Minister’s response to that and other amendments. It would be helpful to hear from the Minister his thoughts on arrangements for post-legislative scrutiny. It would also be helpful to deliberations to understand whether there have been discussions on this between the usual channels.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, this is indeed an apposite day to be discussing ongoing ping-pong. I am very happy to speak enthusiastically and more slowly about my noble friend Lady Stowell of Beeston’s Amendments 139 and 140. We are happy to support those, subject to some tidying up at Third Reading. We agree with the points that she has made and are keen to bring something forward which would mean broadly that a statement would be laid before Parliament when the power to direct had been used. My noble friend Lady Harding characterised them as the infinite ping-pong question and the secretive ping-pong question; I hope that deals with the secretive ping-pong point.

My noble friend Lady Stowell’s other amendments focus on the infinite ping-pong question, and the power to direct Ofcom to modify a code. Her Amendments 139, 140, 144 and 145 seek to address those concerns: that the Secretary of State could enter into a private form of ping-pong with Ofcom, making an unlimited number of directions on a code to prevent it from ever coming before Parliament. Let me first be clear that we do not foresee that happening. As the amendments I have spoken to today show, the power can be used only when specific exceptional reasons apply. In that sense, we agree with the intent of the amendments tabled by my noble friend Lady Stowell. However, we cannot accept them as drafted because they rely on concepts—such as the “objective” of a direction—which are not consistent with the procedure for making a direction set out in the Bill.

The amendments I have brought forward mean that private ping-pong between the Secretary of State and Ofcom on a code is very unlikely to happen. Let me set out for my noble friend and other noble Lords why that is. The Secretary of State would need exceptional reasons for making any direction, and the Bill then requires that the code be laid before Parliament as soon as is reasonably practicable once the Secretary of State is satisfied that no further modifications to the draft are required. That does not leave room for the power to be used inappropriately. A code could be delayed in this way and in the way that noble Lords have set out only if the Secretary of State could show that there remained exceptional reasons once a code had been modified. This test, which is a very high bar, would need to be met each time. Under the amendments in my name, Parliament would also be made aware straightaway each time a direction was made, and when the modified code came before Parliament, it would now come under greater scrutiny using the affirmative procedure.

I certainly agree with the points that the noble Lord, Lord Allan, and others made that any directions should be made in as transparent a way as possible, which is why we have tabled these amendments. There may be some circumstances where the Secretary of State has access to information—for example, from the security services—the disclosure of which would have an adverse effect on national security. In our amendments, we have sought to retain the existing provisions in the Bill to make sure that we strike the right balance between transparency and protecting national security.

As the noble Lord mentioned, the Freedom of Information Act provides an additional route to transparency while also containing existing safeguards in relation to national security and other important areas. He asked me to think of an example of something that would be exceptional but not require that level of secrecy. By dropping economic policy and burden to business, I would point him to an example in those areas, but a concrete example evades me this afternoon. Those are the areas to which I would turn his attention.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

Can the Minister confirm that the fact that a direction has been made will always be known to the public, even if the substance of it is not because it is withheld under the secrecy provision? In other words, will the public always have a before and after knowledge of the fact of the direction, even if its substance is absent?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Yes; that is right.

I hope noble Lords will agree that the changes we have made and that I have outlined today as a package mean that we have reached the right balance in this area. I am very grateful to my noble friend Lady Stowell—who I see wants to come in—for the time that she too has given this issue, along with members of her committee.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- Hansard - - - Excerpts

I am grateful to my noble friend for his constructive response to my Amendments 139 and 140. I am sure he will do me the honour of allowing me to see the Government’s reversioning of my amendments before they are laid so that we can be confident at Third Reading that they are absolutely in line with expectations.

Could I press my noble friend a little further on Amendments 144 and 145? As I understood what he said, the objection from within government is to the language in the amendments I have tabled—although as my noble friend Lady Harding said, they are incredibly modest in their nature.

I was not sure whether my noble friend was saying in his defence against accepting them that issuing a direction would have to be exceptional, and that that led to a need to clarify that this would be ongoing. Would each time there is a ping or a pong be exceptional? Forgive me, because it starts to sound a bit ridiculous when we get into this amount of detail, but it seems to me that the “exceptional” issue kicks in at the point where you issue the direction. Once you engage in a dialogue, “exceptional” is no longer really the issue. It is an odd defence against trying to limit the number of times you allow that dialogue to continue. Bearing in mind that he is willing to look again at Amendments 139 and 140, I wonder whether, between now and Third Reading, he would at least ask parliamentary counsel to look again at the language in my original amendment.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I am certainly happy to commit to showing my noble friend the tidying up we think necessary of the two amendments I said we are happy to accept ahead of Third Reading. On the others, as I said, the code could be delayed repeatedly only if the Secretary of State showed that there remained exceptional reasons once it had been modified, and that high bar would need to be met each time. So we do not agree with her Amendments 144 and 145 because of concerns about the drafting of my noble friend’s current amendment and because the government amendments we have brought forward cater for the scenario about which she is concerned. Her amendments would place a constraint on the Secretary of State not to give more directions than are necessary to achieve the objectives set out in the original direction, but they would not achieve the intent I think my noble friend has. The Bill does not require the direction to have a particular objective. Directions are made because the Secretary of State believes that modifications are necessary for exceptional reasons, and the direction must set out the reasons why the Secretary of State believes that a draft should be modified.

Through the amendments the Government have laid today, the direction would have to be for exceptional reasons relating to a narrower list and Parliament would be made aware each time a direction was made. Parliament would also have increased scrutiny in cases where a direction had been made under Clause 39(1)(a), because of the affirmative procedure. However, I am very happy to keep talking to my noble friend, as we will be on the other amendments, so we can carry on our conversation then if she wishes.

Let me say a bit about the amendments tabled by my noble friend Lord Moylan. His Amendment 218 would require the draft statement of strategic priorities laid before Parliament to be approved by resolution of each House. As we discussed in Committee, the statement of strategic priorities is necessary because future technological changes are likely to shape harms online, and the Government must have an avenue through which to state their strategic priorities in relation to these emerging technologies.

The Bill already requires the Secretary of State to consult Ofcom and other appropriate persons when preparing a statement. This provides an opportunity for consideration and scrutiny of a draft statement, including, for example, by committees of Parliament. This process, combined with the negative procedure, provides an appropriate level of scrutiny and is in line with comparable existing arrangements in the Communications Act in relation to telecommunications, the management of radio spectrum and postal services.

My noble friend’s other amendments would place additional requirements on the Secretary of State’s power to issue non-binding guidance to Ofcom about the exercise of its online safety functions. The guidance document itself does not create any statutory requirements—Ofcom is required only to have regard to the guidance—and on that basis, we do not agree that it is necessary to subject it to parliamentary approval as a piece of secondary legislation. As my noble friend Lady Harding of Winscombe pointed out, we do not require that in numerous other areas of the economy, and we do not think it necessary here.

Let me reassure my noble friend Lord Moylan on the many ways in which Parliament will be able to scrutinise the work of Ofcom. Like most other regulators, it is accountable to Parliament in how it exercises its functions. The Secretary of State is required to present its annual report and accounts before both Houses. Ministers from the devolved Administrations must also lay a copy of the report before their respective Parliament or Assembly. Ofcom’s officers can be required to appear before Select Committees to answer questions about its work; indeed, its chairman and chief executive appeared before your Lordships’ Communications and Digital Committee just yesterday. Parliament will also have a role in approving a number of aspects of the regulatory framework through its scrutiny of both primary and secondary legislation.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, the key question is this: why have these powers over social media when the Secretary of State does not have them over broadcast?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

If I may, I will write to the noble Lord having reflected on that question further. We are talking here about the provisions set up in the Bill to deal with online harms; clearly, that is the focus here, which is why this Bill deals with that. I will speak to colleagues who look at other areas and respond further to the noble Lord’s question.

Let me reassure the noble Baroness, Lady Fox, that, through this Bill, both Ofcom and providers are being asked to have regard to freedom of expression. Ofcom already has obligations under the Human Rights Act to be bound by the European Convention on Human Rights, including Article 10 rights relating to freedom of expression. Through this Bill, user-to-user and search services will have to consider and implement safeguards for freedom of expression when fulfilling their duties. Those points are uppermost in our minds.

I am grateful for the support expressed by noble Lords for the government amendments in this group. Given the mixed messages of support and the continued work with my noble friend Lady Stowell of Beeston, I urge her not to move her amendments.

Amendment 129 agreed.
--- Later in debate ---
Moved by
131: Clause 38, page 41, line 4, leave out “This section applies” and insert “Subsections (1) to (6) apply”
Member’s explanatory statement
This amendment is consequential on the amendment inserting new subsections (9) to (13) into this Clause in my name.
--- Later in debate ---
Moved by
134: Clause 39, page 41, line 8, at end insert—
“(A1) The Secretary of State may direct OFCOM to modify a draft of a code of practice submitted under section 38(1) if the Secretary of State believes that modifications are required for the purpose of securing compliance with an international obligation of the United Kingdom.
(B1) The Secretary of State may direct OFCOM to modify a draft of a code of practice, other than a terrorism or CSEA code of practice, submitted under section 38(1) if the Secretary of State believes that modifications are required for exceptional reasons relating to—
(a) national security,
(b) public safety,
(c) public health, or
(d) relations with the government of a country outside the United Kingdom.”
Member’s explanatory statement
This amendment (together with other amendments to this Clause in my name) sets out the circumstances in which the Secretary of State can direct OFCOM to modify a draft of a code of practice.
--- Later in debate ---
Moved by
138: Clause 39, page 41, line 37, at end insert “, and
(c) must be published, except where the Secretary of State considers that doing so would have the effect mentioned in paragraph (b).”
Member’s explanatory statement
This amendment requires a direction given under Clause 39 to be published except in cases where the Secretary of State considers that to do so would be against the interests of national security, public safety or relations with the government of a country outside the United Kingdom.
Amendment 139 (to Amendment 138) not moved.
--- Later in debate ---
Moved by
142: Clause 39, page 42, line 2, at end insert—
“(ca) publish the document, and”
Member’s explanatory statement
This amendment requires OFCOM to publish a document submitted to the Secretary of State in response to the Secretary of State giving a direction under this Clause.
--- Later in debate ---
Moved by
146: Clause 40, page 42, line 34, leave out “(1)(a)” and insert “(A1), (B1) or (1)(b)”
Member’s explanatory statement
This amendment is consequential on the amendments made to Clause 39 in my name.
--- Later in debate ---
Moved by
149: Clause 47, page 48, line 11, at end insert—
“(A1) OFCOM must produce guidance for providers of Category 1 services to assist them in complying with their duties set out in section (Assessment duties: user empowerment) (assessments related to the adult user empowerment duty set out in section 12(2)).”
Member’s explanatory statement
This amendment requires OFCOM to produce guidance to assist providers of Category 1 services in carrying out their assessments as required by the new Clause proposed after Clause 11 in my name.
--- Later in debate ---
Moved by
151: Clause 48, page 48, line 33, leave out “12(9)” and insert “(User empowerment duties: interpretation)”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 12 into two Clauses.
--- Later in debate ---
Moved by
152: After Clause 48, insert the following new Clause—
“OFCOM’s guidance about protecting women and girls
(1) OFCOM must produce guidance for providers of Part 3 services which focuses on content and activity—
(a) in relation to which such providers have duties set out in this Part or Part 4, and
(b) which disproportionately affects women and girls.
(2) The guidance may, among other things—
(a) contain advice and examples of best practice for assessing risks of harm to women and girls from content and activity mentioned in subsection (1), and for reducing such risks;
(b) refer to provisions contained in a code of practice under section 36 which are particularly relevant to the protection of women and girls from such content and activity.
(3) Before producing the guidance (including revised or replacement guidance), OFCOM must consult—
(a) the Commissioner for Victims and Witnesses,
(b) the Domestic Abuse Commissioner, and
(c) such other persons as OFCOM consider appropriate.
(4) OFCOM must publish the guidance (and any revised or replacement guidance).”
Member’s explanatory statement
This new Clause requires OFCOM to produce and publish a guidance document focusing on online content and activity which disproportionately affects women and girls.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, as we discussed in Committee, the Bill contains strong protection for women and girls and places duties on services to tackle and limit the kinds of offences and online abuse that we know disproportionately affect them. His Majesty’s Government are committed to ensuring that women and girls are protected online as well as offline. I am particularly grateful to my noble friend Lady Morgan of Cotes for the thoughtful and constructive way in which she has approached ensuring that the provisions in the Bill are as robust as possible.

It is with my noble friend’s support that I am therefore pleased to move government Amendment 152. This will create a new clause requiring Ofcom to produce guidance that summarises, in one clear place, measures that can be taken to tackle the abuse that women and girls disproportionately face online. This guidance will relate to regulated user-to-user and search services and will cover content regulated under the Bill’s framework. Crucially, it will summarise the measures in the Clause 36 codes for Part 3 duties, namely the illegal and child safety duties. It will also include a summary of platforms’ relevant Part 4 duties—for example, relevant terms of service and reporting provisions. This will provide a one-stop shop for providers.

Providers that adhere to the codes of practice will continue to be compliant with the duties. However, this guidance will ensure that it is easy and clear for platforms to implement holistic and effective protections for women and girls across their various duties. Any company that says it is serious about protecting women and girls online will, I am sure, refer to this guidance when implementing protections for its users.

Ofcom will have the flexibility to shape the guidance in a way it deems most effective in protecting women and girls online. However, as outlined in this amendment, we expect that it will include examples of best practice for assessing risks of harm to women and girls from content and activity, and how providers can reduce these risks and emphasise provisions in the codes of practice that are particularly relevant to the protection of women and girls.

To ensure that this guidance is effective and makes a difference, the amendment creates a requirement on Ofcom to consult the Domestic Abuse Commissioner and the Victims’ Commissioner, among other people or organisations it considers appropriate, when it creates this guidance. Much like the codes of practice, this will ensure that the views and voices of experts on the issue, and of women, girls and victims, are reflected. This amendment will also require Ofcom to publish this guidance.

I am grateful to all the organisations that have worked with us and with my noble friend Lady Morgan to get to this point. I hope your Lordships will accept the amendment. I beg to move.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I will speak very briefly to this amendment; I know that the House is keen to get on to other business today. I very much welcome the amendment that the Government have tabled. My noble friend the Minister has always said that they want to keep women and girls safe online. As has been referred to elsewhere, the importance of making our digital streets safer cannot be overestimated.

As my noble friend said, women and girls experience a disproportionate level of abuse online. That is now recognised in this amendment, although this is only the start, not the end, of the matter. I thank my noble friend and the Secretary of State for their engagement on this issue. I thank the chief executive and the chair of Ofcom. I also thank the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Gloucester, who I know cannot be here today, and the noble Lord, Lord Knight, who signed the original amendment that we discussed in Committee.

My noble friend has already talked about the campaigners outside the Chamber who wanted there to be specific mention of women and girls in the Bill. I thank Refuge, the 100,000 people who signed the End Violence Against Women coalition’s petition, BT, Glitch, Carnegie UK, Professor Lorna Woods, the NSPCC and many others who made the case for this amendment.

As my noble friend said, this is Ofcom guidance. It is not necessarily a code of practice, but it is still very welcome because it is broader than just the specific offences that the Government have legislated on, which I also welcome. As he said, this puts all the things that companies, platforms and search engines should be doing to protect women and girls online in one specific place. My noble friend mentioned holistic protection, which is very important.

There is no offline/online distinction these days. Women and girls should feel safe everywhere. I also want to say, because I know that my noble friend has had a letter, that this is not about saying that men and boys should not be safe online; it is about recognising the disproportionate levels of abuse that women and girls suffer.

I welcome the fact that, in producing this guidance, Ofcom will have to consult with the Domestic Abuse Commissioner and the Victims’ Commissioner and more widely. I look forward, as I am sure do all the organisations I just mentioned, to working with Ofcom on the first set of guidance that it will produce. It gives me great pleasure to have signed the amendment and to support its introduction.

--- Later in debate ---
Baroness Merron Portrait Baroness Merron (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, this very positive government amendment acknowledges that there is not equality when it comes to online abuse. We know that women are 27 times more likely than men to be harassed online, that two-thirds of women who report abuse to internet companies do not feel heard, and that three out of four women change their behaviour after receiving online abuse.

Like others, I am very glad to have added my name to support this amendment. I thank the Minister for bringing it before your Lordships’ House and for his introduction. It will place a requirement on Ofcom to produce and publish guidance for providers of Part 3 services in order to make online spaces safer for women and girls. As the noble Baroness, Lady Morgan, has said, while this is not a code of practice—and I will be interested in the distinction between the code of practice that was being called for and what we are expecting now—it would be helpful perhaps to know when we might expect to see it. As the noble Baroness, Lady Burt, just asked, what kind of timescale is applicable?

This is very much a significant step for women and girls, who deserve and seek specific protections because of the disproportionate amount of abuse received. It is crucial that the guidance take a holistic approach which focuses on prevention and tech accountability, and that it is as robust as possible. Can the Minister say whether he will be looking to the model of the Violence against Women and Girls Code of Practice, which has been jointly developed by a number of groups and individuals including Glitch, the NSPCC, 5Rights and Refuge? It is important that this be got right, that we see it as soon as possible and that all the benefits can be felt and seen.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

I am very grateful to everyone for the support they have expressed for this amendment both in the debate now and by adding their names to it. As I said, I am particularly grateful to my noble friend Lady Morgan, with whom we have worked closely on it. I am also grateful for her recognition that men and boys also face harm online, as she rightly points out. As we discussed in Committee, this Bill seeks to address harms for all users but we recognise that women and girls disproportionately face harm online. As we have discussed with the noble Baroness, Lady Merron, women and girls with other characteristics such as women of colour, disabled women, Jewish women and many others face further disproportionate harm and abuse. I hope that Amendment 152 demonstrates our commitment to giving them the protection they need, making it easy and clear for platforms to implement protections for them across all the wide-ranging duties they have.

The noble Baroness, Lady Burt of Solihull, asked why it was guidance and not a code of practice. Ofcom’s codes of practice will set out how companies can comply with the duties and will cover how companies should tackle the systemic risks facing women and girls online. Stipulating that Ofcom must produce specific codes for multiple different issues could, as we discussed in Committee, create duplication between the codes, causing confusion for companies and for Ofcom.

As Ofcom said in its letter to your Lordships ahead of Report, it has already started the preparatory work on the draft illegal content and child sexual abuse and exploitation codes. If it were required to create a separate code relating to violence against women and girls, this preparatory work would need to be revised, so there would be the unintended—and, I think, across the House, undesired—consequence of slowing down the implementation of these vital protections. I am grateful for the recognition that we and Ofcom have had on that point.

Instead, government Amendment 152 will consolidate all the relevant measures across codes of practice, such as on illegal content, child safety and user empowerment, in one place, assisting platforms to reduce the risk of harm that women and girls disproportionately face.

On timing, at present Ofcom expects that this guidance will be published in phase 3 of the implementation of the Bill, which was set out in Ofcom’s implementation plan of 15 June. This is when the duties in Part 4 of the Bill, relating to terms of service and so on, will be implemented. The guidance covers the duties in Part 4, so for guidance to be comprehensive and have the most impact in protecting women and girls, it is appropriate for it to be published during phase 3 of the Bill’s implementation.

The noble Baroness, Lady Fox, mentioned the rights of trans people and the rights of people to express their views. As she knows, gender reassignment and religious or philosophical belief are both protected characteristics under the Equality Act 2010. Sometimes those are in tension, but they are both protected in the law.

With gratitude to all the noble Lords who have expressed their support for it, I commend the amendment to the House.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

The Minister did not quite grasp what I said but I will not keep the House. Would he be prepared to accept recommendations for a broader consultation—or who do I address them to? It is important that groups such as the Women’s Rights Network and others, which suffer abuse because they say “I know what a woman is”, are talked to in a discussion on women and abuse, because that would be appropriate.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I am sorry—yes, the noble Baroness made a further point on consultation. I want to reassure her and other noble Lords that Ofcom has the discretion to consult whatever body it considers appropriate, alongside the Victims’ Commissioner, the Domestic Abuse Commissioner and others who I mentioned. Those consultees may not all agree. It is important that Ofcom takes a range of views but is able to consult whomever. As I mentioned previously, Ofcom and its officers can be scrutinised in Parliament through Select Committees and in other ways. The noble Baroness could take it up directly with them but could avail herself of those routes for parliamentary scrutiny if she felt that her pleas were falling on deaf ears.

Amendment 152 agreed.
--- Later in debate ---
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I am grateful to the noble Lord, Lord Clement-Jones, for raising this; it is important. Clause 49(3)(a)(i) mentions content

“generated directly on the service by a user”,

which, to me, implies that it would include the actions of another user in the metaverse. Sub-paragraph (ii) mentions content

“uploaded to or shared on the service by a user”,

which covers bots or other quasi-autonomous virtual characters in the metaverse. As we heard, a question remains about whether any characters or objects provided by the service itself are covered.

A scenario—in my imagination anyway—would be walking into an empty virtual bar at the start of a metaverse service. This would be unlikely to be engaging: the attractions of indulging in a lonely, morose drink at that virtual bar are limited. The provider may therefore reasonably configure the algorithm to generate characters and objects that are engaging until enough users then populate the service to make it interesting.

Of course, there is the much more straightforward question of gaming platforms. On Monday, I mentioned “Grand Theft Auto”, a game with an advisory age of 17—they are still children at that age—but one that is routinely accessed by younger children. Shockingly, an article that I read claimed that it can evolve into a pornographic experience, where the player becomes the character from a first-person angle and receives services from virtual sex workers, as part of the game design. So my question to the Minister is: does the Bill protect the user from these virtual characters interacting with users in virtual worlds?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

I will begin with that. The metaverse is in scope of the Bill, which, as noble Lords know, has been designed to be technology neutral and future-proofed to ensure that it keeps pace with emerging technologies—we have indeed come a long way since the noble Lord, Lord Clement-Jones, the noble Lords opposite and many others sat on the pre-legislative scrutiny committee for the Bill. Even as we debate, we envisage future technologies that may come. But the metaverse is in scope.

The Bill will apply to companies that enable users to share content online or to interact with each other, as well as search services. That includes a broad range of services, such as websites, applications, social media services, video games and virtual reality spaces, including the metaverse.

Any service that enables users to interact, as the metaverse does, will need to conduct a child access test and, if it is likely to be accessed by children, comply with the child safety duties. Content is broadly defined in the Bill as,

“anything communicated by means of an internet service”.

Where this is uploaded, shared or directly generated on a service by a user and able to be encountered by other users, it will be classed as user-generated content. In the metaverse, this could therefore include things like objects or avatars created by users. It would also include interactions between users in the metaverse such as chat—both text and audio—as well as images, uploaded or created by a user.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I hope I am not interrupting the Minister in full flow. He has talked about users entirely. He has not yet got to talking about what happens where the provider is providing that environment—in exactly the way in which the noble Lord, Lord Knight, illustrated.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

We talked about bots controlled by service providers before the noble Lord, Lord Knight, asked questions on this. The Bill is designed to make online service providers responsible for the safety of their users in light of harmful activities that their platforms might facilitate. Providers of a user-to-user service will need to adhere to their duties of care, which apply to all user-generated content present on their service. The Bill does not, however, regulate content published by user-to-user providers themselves. That is because the providers are liable for the content they publish on the service themselves. The one exception to this—as the noble Baroness, Lady Kidron, alluded to in her contribution—is pornography, which poses a particular risk to children and is regulated by Part 5 of the Bill.

I am pleased to reassure the noble Lord, Lord Clement-Jones, that the Bill—

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I thank the noble Lord for giving way. The Minister just said that private providers will be responsible for their content. I would love to understand what mechanism makes a provider responsible for their content?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I will write to noble Lords with further information and will make sure that I have picked up correctly the questions that they have asked.

On Amendment 152A, which the noble Lord, Lord Clement-Jones, has tabled, I am pleased to assure him that the Bill already achieves the intention of the amendment, which seeks to add characters and objects that might interact with users in the virtual world to the Bill’s definition of user-generated content. Let me be clear again: the Bill already captures any service that facilitates online user-to-user interaction, including in the metaverse or other augmented reality or immersive online worlds.

The Bill broadly defines “content” as

“anything communicated by means of an internet service”,

so it already captures the various ways in which users may encounter content. Clause 211 makes clear that “encounter” in relation to content for the purposes of the Bill means to,

“read, view, hear or otherwise experience”

content. That definition extends to the virtual worlds which noble Lords have envisaged in their contributions. It is broad enough to encompass any way of encountering content, whether that be audio-visually or through online avatars or objects.

In addition, under the Bill’s definition of “functionality”,

“any feature that enables interactions of any description between users of the service”

will be captured. That could include interaction between avatars or interaction by means of an object in a virtual world. All in-scope services must therefore consider a range of functionalities as part of their risk assessment and must put in place any necessary measures to mitigate and manage any risks that they identify.

I hope that that provides some assurance to the noble Lord that the concerns that he has raised are covered, but I shall happily write on his further questions before we reach the amendment that the noble Baroness, Lady Finlay, rightly flagged in her contribution.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I thank the Minister. I feel that we have been slightly unfair because we have been asking questions about an amendment that we have not been able to table. The Minister has perfectly well answered the actual amendment itself and has given a very positive reply—and in a sense I expected him to say what he said about the actual amendment. But, of course, the real question is about an amendment that I was unable to table.

--- Later in debate ---
Moved by
153: Clause 49, page 49, line 27, after “bot” insert “or other automated tool”
Member’s explanatory statement
This amendment, and the next two amendments in my name, make it clear that an automated tool which is not a bot - as well as a bot - may be regarded as a user for the purposes of the definition of “user-generated content”.
--- Later in debate ---
Moved by
158: Clause 49, page 50, line 17, leave out sub-paragraphs (ii) and (iii) and insert—
“(ii) is video or audio content that was originally published or broadcast by a recognised news publisher, and is not a clipped or edited form of such content (unless it is the recognised news publisher who has clipped or edited it), or
(iii) is a link to an article or item within sub-paragraph (i) or to content within sub-paragraph (ii).”
Member’s explanatory statement
This amendment revises the definition of “news publisher content” so that, in particular, online content published by a recognised news publisher that has not first been broadcast is covered by the definition.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, as noble Lords know, His Majesty’s Government are committed to defending the invaluable role of a free media, and our online safety legislation must protect the vital role of the press in providing people with reliable and accurate information online. That is why we have included strong protections for recognised news publishers in the Bill.

Clause 49(9) and (10) set out what is considered “news publisher content” in relation to a regulated user-to-user service, while Clause 52 sets out that news publishers’ content is exempt from search services’ duties. The government amendments clarify minor elements of these exemptions and definitions. Given the evolving consumption habits for news, recognised news publishers might clip or edit content from their published or broadcast versions to cater to different audiences and platforms. We want to ensure that recognised news publisher content is protected in all its forms, as long as that content is created or generated by the news publishers themselves.

First, our amendments clarify that any video or audio content published or broadcast by recognised news publishers will be exempt from the Bill’s safety duties and will benefit from the news publisher appeals process, when shared on platforms in scope of the Bill. These amendments ensure that old terminology works effectively in the internet age. The amendments now also make it clear that any news publisher content that is clipped or edited by the publisher itself will qualify for the Bill’s protections when shared by third parties on social media. However, these protections will not apply when a third-party user modifies that content itself. This will ensure that the protections do not apply to news publisher content that has been edited by a user in a potentially harmful way.

The amendments make it clear that the Bill’s protections apply to links to any article, video or audio content generated by recognised news publishers, clipped or edited, and regardless of the form in which that content was first published or broadcast. Taken together, these amendments ensure that our online safety legislation protects recognised news publishers’ content as intended. I hope noble Lords will support them. I beg to move.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

I reassure the noble Lord, Lord Stevenson, that he was right to sign the amendments; I am grateful that he did. I do not know whether it is possible to have a sense of déjà vu about debates that took place before one entered your Lordships’ House, but if so, I feel I have had it over the past hour. I am, however, glad to see the noble Lords, Lord Lipsey and Lord McNally, back in their places and that they have had the chance to express their views, which they were unable to do fully in Committee. I am grateful to noble Lords who have joined in that debate again.

At present, Amendment 159 would enable news publishers that are members of Impress, the sole UK regulator which has sought approval by the Press Recognition Panel, to benefit from the Bill’s protections for news publishers, without meeting the criteria set out in Clause 50(2). This would introduce a legislative advantage for Impress members over other news publishers. The amendment would, in effect, create strong incentives for publishers to join a specific press regulator. We do not consider that to be compatible with our commitment to a free press. To that end, as noble Lords know, we will repeal existing legislation that could have that effect, specifically Section 40 of the Crime and Courts Act 2013, through the media Bill, which was published recently.

Not only is creating an incentive for a publisher to join a specific regulator incompatible with protecting press freedom in the United Kingdom but it would undermine the aforementioned criteria. These have been drafted to be as robust as possible, with requirements including that organisations have publication of news as their principal purpose, that they are subject to a standards code and that their content is created by different persons. Membership of Impress, or indeed any other press regulator, does not and should not automatically ensure that these criteria are met.

Amendment 160 goes further by amending one of these criteria—specifically, the requirement for entities to be subject to a standards code. It would add the requirement that these standards codes be drawn up by a regulator, such as Impress. This amendment would create further incentives for news publishers to join a press regulator if they are to benefit from the exclusion for recognised news publishers. This is similarly not compatible with our commitment to press freedom.

We believe the criteria set out in Clause 50 of the Bill are already sufficiently strong, and we have taken significant care to ensure that only established news publishers are captured, while limiting the opportunity for bad actors to benefit.

The noble Lord, Lord Allan, asked about protections against that abuse by bad actors. The Bill includes protections for journalism and news publishers, given the importance of a free press in a democratic society. However, it also includes safeguards to prevent the abuse of these protections by bad actors. Platforms will still be able to remove recognised news publisher content that breaches their terms and conditions as long as they notify recognised news publishers and offer a right of appeal first. This means that content will remain online while the appeal is considered, unless it constitutes a relevant offence under the Bill or the platform would incur criminal or civil liability by hosting it. This marks a significant improvement on the status quo whereby social media companies can remove journalistic content with no accountability and little recourse for journalists to appeal.

We are clear that sanctioned news outlets such as RT must not benefit from these protections. We are amending the criteria for determining which entities qualify as recognised news publishers explicitly to exclude entities that are subject to sanctions. The criteria also exclude any entity that is a proscribed organisation under the Terrorism Act 2000 or whose purpose is to support an organisation that is proscribed under that Act. To require Ofcom or another party to assess standards would be to introduce press regulation by the back door.

The noble Baroness, Lady Fox of Buckley, asked about protecting clipped or edited content. Given evolving news consumption habits, recognised news publishers may clip or edit content from their published or broadcast versions to cater to different audiences and to be used on different platforms. We want to ensure recognised news publisher content is protected in all its forms as long as that content is still created or generated by the news publisher. For example, if a broadcaster shares a link to its shorter, online-only version of a long-form TV news programme or documentary on an in-scope platform, this should still benefit from the protections that the Bill affords. The amendment that we have brought forward ensures that this content and those scenarios remain protected but removes the risk of platforms being forced to carry news publisher content that has been edited by a third party potentially to cause harm. I hope that clarifies that.

I am grateful to the noble Lord, Lord Lipsey, for making it clear that he does not intend to press his amendments to a Division, so I look forward to that. I am also grateful for the support for the Government’s amendments in this group.

Amendment 158 agreed.
--- Later in debate ---
Moved by
161: Clause 51, page 52, line 14, leave out sub-paragraphs (ii) and (iii) and insert—
“(ii) is video or audio content that was originally published or broadcast by a recognised news publisher, and is not a clipped or edited form of such content (unless it is the recognised news publisher who has clipped or edited it), or
(iii) is a link to an article or item within sub-paragraph (i) or to content within sub-paragraph (ii).”
Member’s explanatory statement
This amendment ensures that, in particular, online content published by a recognised news publisher that has not first been broadcast is included in the list of content which does not count as search content for the purposes of the Bill.
--- Later in debate ---
Moved by
163: Clause 54, page 54, line 44, leave out “applies” and insert “and sections (“Primary priority content that is harmful to children”) and (“Priority content that is harmful to children”) apply”
Member’s explanatory statement
This technical amendment ensures that the new Clauses proposed to be inserted after Clause 54 in my name setting out which kinds of content count as primary priority content and priority content harmful to children apply for the purposes of Part 3 of the Bill.
--- Later in debate ---
Moved by
171: After Clause 54, insert the following new Clause—
““Primary priority content that is harmful to children”
(1) “Primary priority content that is harmful to children” means content of any of the following kinds.
(2) Pornographic content, other than content within subsection (6).
(3) Content which encourages, promotes or provides instructions for suicide.
(4) Content which encourages, promotes or provides instructions for an act of deliberate self-injury.
(5) Content which encourages, promotes or provides instructions for an eating disorder or behaviours associated with an eating disorder.
(6) Content is within this subsection if it—
(a) consists only of text, or
(b) consists only of text accompanied by—
(i) identifying content which consists only of text,
(ii) other identifying content which is not itself pornographic content,
(iii) a GIF which is not itself pornographic content,
(iv) an emoji or other symbol, or
(v) any combination of content mentioned in sub-paragraphs (i) to (iv).
(7) In this section and section (“Priority content that is harmful to children”) “injury” includes poisoning.”
Member’s explanatory statement
This amendment describes which kinds of content count as primary priority content harmful to children for the purposes of Part 3 of the Bill.
--- Later in debate ---
Moved by
172: After Clause 54, insert the following new Clause—
““Priority content that is harmful to children”
(1) “Priority content that is harmful to children” means content of any of the following kinds.
(2) Content which is abusive and which targets any of the following characteristics—
(a) race,
(b) religion,
(c) sex,
(d) sexual orientation,
(e) disability, or
(f) gender reassignment.
(3) Content which incites hatred against people—
(a) of a particular race, religion, sex or sexual orientation,
(b) who have a disability, or
(c) who have the characteristic of gender reassignment.
(4) Content which encourages, promotes or provides instructions for an act of serious violence against a person.
(5) Bullying content.
(6) Content which—
(a) depicts real or realistic serious violence against a person;
(b) depicts the real or realistic serious injury of a person in graphic detail.
(7) Content which—
(a) depicts real or realistic serious violence against an animal;
(b) depicts the real or realistic serious injury of an animal in graphic detail;
(c) realistically depicts serious violence against a fictional creature or the serious injury of a fictional creature in graphic detail.
(8) Content which encourages, promotes or provides instructions for a challenge or stunt highly likely to result in serious injury to the person who does it or to someone else.
(9) Content which encourages a person to ingest, inject, inhale or in any other way self-administer—
(a) a physically harmful substance;
(b) a substance in such a quantity as to be physically harmful.
(10) In subsections (2) and (3)—
(a) “disability” means any physical or mental impairment;
(b) “race” includes colour, nationality, and ethnic or national origins;
(c) references to religion include references to a lack of religion.
(11) For the purposes of subsection (3), a person has the characteristic of gender reassignment if the person is proposing to undergo, is undergoing or has undergone a process (or part of a process) for the purpose of reassigning the person’s sex by changing physiological or other attributes of sex, and the reference to gender reassignment in subsection (2) is to be construed accordingly.
(12) For the purposes of subsection (5) content may, in particular, be “bullying content” if it is content targeted against a person which—
(a) conveys a serious threat;
(b) is humiliating or degrading;
(c) forms part of a campaign of mistreatment.
(13) In subsection (6) “person” is not limited to a real person.
(14) In subsection (7) “animal” is not limited to a real animal.”
Member’s explanatory statement
This amendment describes which kinds of content count as priority content harmful to children for the purposes of Part 3 of the Bill.
--- Later in debate ---
Moved by
175: Clause 55, leave out Clause 55
Member’s explanatory statement
This amendment omits Clause 55 (regulations describing kinds of content harmful to children), as the kinds of content are now set out in the Bill - see the new Clauses proposed to be inserted after Clause 54 in my name.
--- Later in debate ---
Moved by
176: Clause 56, page 56, line 22, leave out subsection (1)
Member’s explanatory statement
This amendment and the next two amendments in my name omit references to regulations which are no longer needed, as primary priority content and priority content harmful to children are now set out in the new Clauses proposed to be inserted after Clause 54 in my name, not in regulations.
--- Later in debate ---
Moved by
185: Clause 60, page 59, line 15, at end insert—
“(2A) The regulations may also—
(a) require providers to retain, for a specified period, data of a specified description associated with a report, and
(b) impose restrictions or requirements in relation to the retention of such data (including how the data is to be secured or stored or who may access the data).
(2B) The power to require the retention of data associated with a report includes power to require the retention of—
(a) content generated, uploaded or shared by any user mentioned in the report (or metadata relating to such content), and
(b) user data relating to any such person (or metadata relating to such data).
“User data” here has the meaning given by section 206.”
Member’s explanatory statement
This amendment provides that regulations under this Clause may require a provider to retain data associated with a report sent to the NCA and impose restrictions or requirements in relation to the retention of the data.
Lord Parkinson of Whitley Bay (Con)

My Lords, child sexual exploitation or abuse is an abhorrent crime. Reporting allows victims to be identified and offenders apprehended. It is vital that in-scope companies retain the data included in reports made to the National Crime Agency. This will enable effective prosecutions and ensure that children can be protected.

The amendments in my name in this group will enable the Secretary of State to include in the regulations about the reporting of child sexual exploitation or abuse content a requirement for providers to retain data. This requirement will be triggered only by a provider making a report of suspected child sexual exploitation or abuse to the National Crime Agency. The provider will need to retain the data included in the report, along with any associated account data. This is vital to enabling prosecutions and to ensuring that children can be protected, because the data included in reports cannot itself be relied on as evidence. Law enforcement agencies will request this data only when they have determined that the content is in fact illegal and that it is necessary to progress investigations.

Details such as the types of data and the period of time for which providers must retain this data will be specified in regulations. This will ensure that the requirement is future-proofed against new types of data and will prevent companies retaining types of data that may have become obsolete. The amendments will also enable regulations to include any necessary safeguards in relation to data protection. However, providers will be expected to store, process and share this personal data within the UK GDPR framework.

Regulations about child sexual exploitation or abuse reporting will undergo a robust consultation with relevant parties and will be subject to parliamentary scrutiny. This process will ensure that the regulations about retaining data will be well-informed, effective and fit for purpose. These amendments bring the child sexual exploitation and abuse reporting requirements into line with international standards. I beg to move.

Lord Allan of Hallam (LD)

My Lords, these seem very sensible amendments. I am curious about why they have arrived only at this stage, given this was a known problem and that the Bill has been drafted over a long period. I am genuinely curious as to why this issue has been raised only now.

On the substance of the amendments, it seems entirely sensible that, given that we are now going to have 20,000 to 25,000 regulated entities in scope, some of which will never have encountered child sexual exploitation or abuse material or understood that they have a legal duty in relation to it, it will be helpful for them to have a clear set of regulations that tell them how to treat their material.

Child sexual exploitation or abuse material is toxic in both a moral and a legal sense. It needs to be treated almost literally as toxic material inside a company, and sometimes that is not well understood. People feel that they can forward material to someone else, not understanding that in doing so they will break the law. I have had experiences where well-meaning people acting in a vigilante capacity sent material to me, and at that point you have to report them to the police. There are no ifs or buts. They have committed an offence in doing so. If you work inside a company, your computer has to be quarantined, taken away and cleaned, just as it would be for any other toxic material, because we framed the law, quite correctly, to say that we do not want to offer people the defence of saying “I was forwarding this material because I’m a good guy”. Forwarding the material is a strict liability offence, so to have regulations that explain, particularly to organisations that have never dealt with this material, exactly how they have to deal with it in order to be legally compliant will be extremely helpful.

One thing I want to flag is that there are going to be some really fundamental cross-border issues that have to be addressed. In many instances of child sexual exploitation or abuse material, the material has been shared between people in different jurisdictions. The provider may not be in a UK jurisdiction, and we have got to avoid any conflicts of laws. I am sure the Government are thinking about this, but in drafting those regulations, what we cannot do, for example, is order a provider to retain data in a way that would be illegal in the jurisdiction from which it originates or in which it has its headquarters. The same would apply vice versa. We would not expect a foreign Government to order a UK company to act in a way that was against UK law in dealing with child sexual exploitation or abuse material. This all has to be worked out. I hope the Government are conscious of that.

I think the public interest is best served if the United Kingdom, the United States and the European Union, in particular, adopt common standards around this. I do not think there is anything between us in terms of how we would want to approach child sexual exploitation or abuse material, so the extent to which we end up having common legal standards will be extraordinarily helpful.

As a general matter, to have regulations that help companies with their compliance is going to be very helpful. I am curious as to how we have got there with the amendment only at this very late stage.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

My Lords, from this side we certainly welcome these government amendments. I felt it was probably churlish to ask why it had taken until this late stage to comply with international standards, but that point was made very well by the noble Lord, Lord Allan of Hallam, and I look forward to the Minister’s response.

Lord Parkinson of Whitley Bay (Con)

I am grateful to noble Lords for their support for these amendments and for their commitment, as expected, to ensuring that we have the strongest protections in the Bill for children.

The noble Lord, Lord Allan of Hallam, asked: why only now? It became apparent, during the regular engagement that the Government, as he would expect, have with the National Crime Agency on issues such as this, that this requirement would be necessary, so we are happy to bring these amendments forward. They are vital amendments to enable law enforcement partners to prosecute offenders and keep children safe.

Reports received by the National Crime Agency are for intelligence only and so cannot be relied on as evidence. As a result, in some cases law enforcement agencies may be required to request that companies provide data in an evidential format. The submitted report will contain a limited amount of information from which law enforcement agencies will have to decide what action to take. Reporting companies may hold wider data that relate to the individuals featured in the report, which could allow law enforcement agencies to understand the full circumstances of the event or attribute identities to the users of the accounts.

The data retention period will provide law enforcement agencies with the necessary time to decide whether it is appropriate to request data in order to continue their investigations. I hope that explains the context of why we are doing this now and why these amendments are important ones to add to the Bill. I am very grateful for noble Lords’ support for them.

Amendment 185 agreed.
Moved by
186: Clause 60, page 59, line 16, leave out “the regulations” and insert “regulations under this section”
Member’s explanatory statement
This amendment is consequential on the other amendment to Clause 60 in my name.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Finally, I refer to the very good discussion we have had about Amendment 186A, which was introduced by the noble Lord, Lord Moylan. Like many people who received the initial circulation of his draft amendment, I wondered why on earth I had not thought of it myself. It is a good and obvious move that we should think a little more about. It probably needs a lot more thought about the unintended consequences that might arise from it before we move forward on it, and I take the points made by the noble Lord, Lord Allan, about that, but I hope that the Minister will respond positively to it and that it is perhaps something we can pick up in future Bills.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, let me add to this miscellany by speaking to the government amendments that stand in my name as part of this group. The first is Amendment 288A, which we mentioned on the first group of amendments on Report because it relates to the new introductory clause, Clause 1, and responds to the points raised by the noble Lord, Lord Stevenson of Balmacara. I am very happy to say again that the Government recognise that people with multiple and combined characteristics suffer disproportionately online and are often at greater risk of harm. This amendment therefore adds a provision in the new interpretation clause, Clause 1, to put beyond doubt that all the references to people with “a certain characteristic” throughout the Bill include people with a combination of characteristics. We had a good debate about the Interpretation Act 1978, which sets that out, but we are happy to set it out clearly here.

In his Amendment 186A, my noble friend Lord Moylan seeks to clarify a broader issue relating to consumer rights and online platforms. He got some general support—certainly gratitude—for raising this issue, although there was a bit of a Committee-style airing of it and a mixture of views on whether this is the right way or the right place. The amendment seeks to make it clear that certain protections for consumers in the Consumer Rights Act 2015 apply when people use online services and do not pay for them but rather give up their personal data in exchange. The Government are aware that the application of the law in that area is not always clear in relation to free digital services and, like many noble Lords, we are grateful to my noble friend for highlighting the issue through his amendment.

We do not think that the Bill is the right vehicle for attempting to provide clarification on this point, however. We share some of the cautions that the noble Lord, Lord Allan of Hallam, raised and agree with my noble friend Lady Harding of Winscombe that this is part of a broader question about consumer rights online beyond the services with which the Bill is principally concerned. It could be preferable that the principle that my noble friend Lord Moylan seeks to establish through his amendment should apply more widely than merely to category 1 services regulated under the Bill. I assure him that the Bill will create a number of duties on providers which will benefit users and clarify that they have existing rights of action in the courts. We discussed these new protections in depth in Committee and earlier on Report. He drew attention to Clause 65(1), which puts a requirement on all services, not just category 1 services, to include clear and accessible provisions in their terms of service informing users about their right to bring a claim for breach of contract. Therefore, while we are grateful, we agree with noble Lords who suggested that this is a debate for another day and another Bill.

Amendment 191A from the noble Baroness, Lady Kidron, would require Ofcom to issue guidance for coroners and procurators fiscal to aid them in submitting requests to Ofcom to exercise its power to obtain information from providers about the use of a service by a deceased child. While I am sympathetic to her intention, I do not think that her amendment is the right answer. It would be inappropriate for an agency of the Executive to issue guidance to a branch of the judiciary. As I explained in Committee, it is for the Chief Coroner to provide detailed guidance to coroners. This is written to assist coroners with the law and their legal duties and to provide commentary and advice on policy and practice.

The amendment tabled by the noble Baroness cuts across the role of the Chief Coroner and risks compromising the judicial independence of the coroner, as set out in the Constitutional Reform Act 2005. As she is aware, the Chief Coroner has agreed to consider issuing guidance to coroners on social media and to consider the issues covered in the Bill. He has also agreed to explore whether coroners would benefit from additional training, with the offer of consultation with experts including Ofcom and the Information Commissioner’s Office. I suggest that the better approach would be for Ofcom and the Information Commissioner’s Office to support the Chief Coroner in his consideration of these issues where he would find that helpful.

I agree with the noble Lord, Lord Allan, that coroners must have access to online safety expertise given the technical and fast-moving nature of this sector. As we have discussed previously, Amendment 273 gives Ofcom a power to produce a report dealing with matters relevant to an investigation or inquest following a request from a coroner which will provide that expertise. I hope that this reassures the noble Baroness.

Baroness Kidron (CB)

I understand the report on a specific death, which is very welcome and part of the regime as we all see it. The very long list of things that the coroner may not know that they do not know, as I set out in the amendment, is the issue which I and other noble Lords are concerned about. If the Government could find a way to make that possible, I would be very grateful.

Lord Parkinson of Whitley Bay (Con)

We are keen to ensure that coroners have access to the information and expertise that they need, while respecting the independence of the judicial process, under which it is for coroners to decide what they do not know and would like to know more about, and the role of the Chief Coroner in that. It is a point that I have discussed a lot with the noble Baroness and with my noble friend Lady Newlove in her former role as Victims’ Commissioner. I am very happy to continue doing so because it is important that there is access to that expertise.

The noble Lord, Lord Stevenson, spoke to the amendments tabled by the noble Baroness, Lady Merron, about supposedly gendered language in relation to Clauses 141 and 157. As I made clear in Committee, I appreciate the intention—as does Lady Deben—of making clear that a person of either sex can perform the role of chairman, just as they can perform the role of ombudsman. We have discussed in Committee the semantic point there. The Government have used “chairman” here to be consistent with terminology in the Office of Communications Act 2002. I appreciate that this predates the Written Ministerial Statement which the noble Lord cited, but that itself made clear that the Government at the time recognised that in practice, parliamentary counsel would need to adopt a flexible approach to this change—for example, in at least some of the cases where existing legislation originally drafted in the former style is being amended.

The noble Lord may be aware of a further Written Ministerial Statement, made on 23 May last year, following our debates on gendered language on another Bill, when the then Lord President of the Council and Leader of the House of Commons said that the Office of the Parliamentary Counsel would update its drafting guidance in light of that. That guidance is still forthcoming. However, importantly, the term here will have no bearing on Ofcom’s decision-making on who would chair the advisory committees. It must establish that this could indeed be a person of either sex.

Amendment 253 seeks to enable co-operation, particularly via information-sharing, between Ofcom and other regulators within the UK. I reassure noble Lords that Section 393 of the Communications Act 2003 already includes provisions for sharing information between Ofcom and other regulators in the UK.

As has been noted, Ofcom already co-operates effectively with other domestic regulators. That has been strengthened by the establishment of the Digital Regulation Co-operation Forum. By promoting greater coherence, the forum helps to resolve potential tensions, offering clarity for people and the industry. It ensures collaborative work across areas of common interest to address complex problems. Its outputs have already delivered real and wide-ranging impacts, including landmark policy statements clarifying the interactions between digital regulatory regimes, research into cross-cutting issues, and horizon-scanning activities on new regulatory challenges. We will continue to assess how best to support collaboration between digital regulators and to ensure that their approaches are joined up. We therefore do not think that Amendment 253 is necessary.

Lord Clement-Jones (LD)

My Lords, the Minister has not stated that there is a duty to collaborate. Is he saying that that is, in fact, the case in practice?

Lord Parkinson of Whitley Bay (Con)

Yes, there is a duty, and the law should be followed. I am not sure whether the noble Lord is suggesting that it is not—

Lord Clement-Jones (LD)

Is there a duty to collaborate between regulators?

Lord Parkinson of Whitley Bay (Con)

I am not sure that I follow the noble Lord’s question, but perhaps—

Lord Clement-Jones (LD)

My Lords, the Minister is saying that, in practice, there is a kind of collaboration between regulators and that there is a power under the Communications Act, but is he saying that there is any kind of duty on regulators to collaborate?

Lord Parkinson of Whitley Bay (Con)

If I may, I will write to the noble Lord setting that out; he has lost me with his question. We believe, as I think he said, that the forum has added to the collaboration in this important area.

The noble Baroness, Lady Finlay, raised important questions about avatars and virtual characters. The Bill broadly defines “content” as

“anything communicated by means of an internet service”,

meaning that it already captures the various ways through which users may encounter content. In the metaverse, this could therefore include things such as avatars or characters created by users. As part of the user-to-user services’ risk assessments, providers will be required to consider more than the risk in relation to user-generated content, including aspects such as how the design and operation of their services, including functionality and how the service is used, might increase the risk of harm to children and the presence of illegal content. A user-to-user service will need to consider any feature which enables interaction of any description between users of the service when carrying out its risk assessments.

The Bill is focused on user-to-user and search services, as there is significant evidence to support the case for regulation based on the risk of harm to users and the current lack of regulatory and other accountability in this area. Hosting, sharing and the discovery of user-generated content and activity give rise to a range of online harms, which is why we have focused on those services. The Bill does not regulate content published by user-to-user service providers themselves; instead, providers are already liable for the content that they publish on their services themselves, and the criminal law is the most appropriate mechanism for dealing with services which publish illegal provider content.

The noble Baroness’s Amendment 275A seeks to require Ofcom to produce a wide-ranging report of behaviour facilitated by emerging technologies. As we discussed in Committee, the Government of course agree that Ofcom needs continually to assess future risks and the capacity of emerging technologies to cause harm. That is why the Bill already contains provisions which allow it to carry out broad horizon scanning, such as its extensive powers to gather information, to commission skilled persons’ reports and to require providers to produce transparency reports. Ofcom has already indicated that it plans to research emerging technologies, and the Bill will require it to update its risk assessments, risk profiles and codes of practice with the outcomes of this research where relevant.

As we touched on in Committee, Clause 56 requires regular reviews by Ofcom into the incidence of content that is harmful to children, and whether there should be changes to regulations setting out the kinds of content that are harmful to children. In addition, Clause 143 mandates that Ofcom should investigate users’ experience of regulated services, which are likely to cover user interactions in virtual spaces, such as the metaverse and those involving content generated by artificial intelligence.

--- Later in debate ---
Baroness Finlay of Llandaff (CB)

I am most grateful to the Minister; perhaps I could just check something he said. There was a great deal of detail and I was trying to capture it. On the question of harms to children, we all understand that the harms to children are viewed more extensively than harms to others, but I wondered: what counts as unregulated services? The Minister was talking about regulated services. What happens if there is machine-generated content which is not generated by any user but by some random codes that are developed and then randomly incite problematic behaviours?

Lord Parkinson of Whitley Bay (Con)

I am happy to provide further detail in writing and to reiterate the points I have made as it is rather technical. Content that is published by providers of user-to-user services themselves is not regulated by the Bill because providers are liable for the content they publish on the services themselves. Of course, that does not apply to pornography, which we know poses a particular risk to children online and is regulated through Part 5 of the Bill. I will set out in writing, I hope more clearly, for the noble Baroness what is in scope to reassure her about the way the Bill addresses the harms that she has rightly raised.

Lord Clement-Jones (LD)

Will the Minister copy other Members in?

--- Later in debate ---
Lord Moylan (Con)

My Lords, this has indeed been a wide-ranging and miscellaneous debate. I hope that since we are considering the Bill on Report noble Lords will forgive me if I do not endeavour to summarise all the different speeches and confine myself to one or two points.

The first is to thank the noble Baroness, Lady Kidron, for her support for my amendment but also to say that having heard her argument in favour of her Amendment 191A, I think the difference between us is entirely semantic. Had she worded it so as to say that Ofcom should be under a duty to offer advice to the Chief Coroner, as opposed to guidance to coroners, I would have been very much happier with it. Guidance issued under statute has to carry very considerable weight and, as my noble friend the Minister said, there is a real danger in that case of an arm of the Executive, if you like, or a creature of Parliament—however one wants to regard Ofcom—interfering in the independence of the judiciary. Had she said “advice to the Chief Coroner and whoever is the appropriate officer in Scotland”, that would have been something I could have given wholehearted support to. I hope she will forgive me for raising that quibble at the outset, but I think it is a quibble rather than a substantial disagreement.

On my own amendment, I simply say that I am grateful to my noble friend for the brevity and economy with which he disposed of it. He was of course assisted in that by the remarks and arguments made by many other noble Lords in the House as they expressed their support for it in principle.

I think there is a degree of confusion about what the Bill is doing. There seemed to be a sense that somehow the amendment was giving individuals the right to bring actions in the courts against providers, but of course that already happens because that right exists and is enshrined in Clause 65. All the amendment would do is give some balance so that consumers actually had some protections in what is normally, in essence, an unequal contest, which is trying to ensure that a large company enforces the terms and contracts that it has written.

In particular, my amendment would give, as I think noble Lords know, the right to demand repeat performance—that is, in essence, the right to put things right, not monetary compensation—and it would frustrate any attempts by providers, in drafting their own terms and conditions, to limit their own liability. That is of course what they seek to do but the Consumer Rights Act frustrates them in their ability to do so.

We will say no more about that for now. With that, I beg leave to withdraw my amendment.

--- Later in debate ---
Moved by
187: Clause 65, page 62, line 18, leave out from “service” to “down” in line 20 and insert “indicate (in whatever words) that the presence of a particular kind of regulated user-generated content is prohibited on the service, the provider takes”
Member’s explanatory statement
This amendment makes a change to a provision about what the terms of service of a Category 1 service say. The effect of the change is to cover a wider range of ways in which a term of service might indicate that a certain kind of content is not allowed on the service.
Lord Parkinson of Whitley Bay (Con)

My Lords, transparency and accountability are at the heart of the regulatory framework that the Bill seeks to establish. It is vital that Ofcom has the powers it needs to require companies to publish online safety information and to scrutinise their systems and processes, particularly their algorithms. The Government agree about the importance of improving data sharing with independent researchers while recognising the nascent evidence base and the complexities of this issue, which we explored in Committee. We are pleased to be bringing forward a number of amendments to strengthen platforms’ transparency, which confer on Ofcom new powers to assess how providers’ algorithms work, which accelerate the development of the evidence base regarding researchers’ access to information and which require Ofcom to produce guidance on this issue.

Amendment 187 in my name makes changes to Clause 65 on category 1 providers’ duties to create clear and accessible terms of service and apply them consistently and transparently. The amendment tightens the clause to ensure that all the providers’ terms through which they might indicate that a certain kind of content is not allowed on its service are captured by these duties.

Amendment 252G is a drafting change, removing a redundant paragraph from the Bill in relation to exceptions to the legislative definition of an enforceable requirement in Schedule 12.

In relation to transparency, government Amendments 195, 196, 198 and 199 expand the types of information that Ofcom can require category 1, 2A and 2B providers to publish in their transparency reports. With thanks to the noble Lord, Lord Stevenson of Balmacara, for his engagement on this issue, we are pleased to table these amendments, which will allow Ofcom to require providers to publish information relating to the formulation, development and scope of user-to-user service providers’ terms of service and search service providers’ public statements of policies and procedures. This is in addition to the existing transparency provision regarding their application.

Amendments 196 and 199 would enable Ofcom to require providers to publish more information in relation to algorithms, specifically information about the design and operation of algorithms that affect the display, promotion, restriction, discovery or recommendation of content subject to the duties in the Bill. These changes will enable greater public scrutiny of providers’ terms of service and their algorithms, providing valuable information to users about the platforms that they are using.

As well as publicly holding platforms to account, the regulator must be able to get under the bonnet and scrutinise the algorithms’ functionalities and the other systems and processes that they use. Empirical tests are a standard method for understanding the performance of an algorithmic system. They involve taking a test data set, running it through an algorithmic system and observing the output. These tests may be relevant for assessing the efficacy and wider impacts of content moderation technology, age-verification systems and recommender systems.
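
By way of illustration only, the following is a minimal sketch, in Python, of the kind of offline empirical test described above: a labelled test data set is run through a stand-in content moderation classifier and the output is observed. Every name and figure here is an assumption made for illustration; nothing in it reflects Ofcom’s methods, any provider’s systems or anything specified in the Bill.

# Illustrative sketch only: a hypothetical offline test of a content-moderation
# classifier. The dataset, the classify() function and the reported figures are
# assumptions for illustration, not anything specified by the Bill or by Ofcom.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class LabelledItem:
    text: str           # the test content
    is_violating: bool  # ground-truth label assigned by human reviewers


def run_offline_test(items: List[LabelledItem],
                     classify: Callable[[str], bool]) -> dict:
    """Run a labelled test set through a moderation classifier and report
    simple performance figures, without touching the live service."""
    tp = fp = fn = tn = 0
    for item in items:
        predicted = classify(item.text)
        if predicted and item.is_violating:
            tp += 1
        elif predicted and not item.is_violating:
            fp += 1
        elif not predicted and item.is_violating:
            fn += 1
        else:
            tn += 1
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return {"precision": precision, "recall": recall,
            "false_positives": fp, "false_negatives": fn}


if __name__ == "__main__":
    # Toy keyword-based classifier and a two-item test set, purely illustrative.
    test_set = [
        LabelledItem("ordinary holiday photo caption", False),
        LabelledItem("post containing a banned term", True),
    ]
    toy_classifier = lambda text: "banned term" in text
    print(run_offline_test(test_set, toy_classifier))

In practice the test data, the system under test and the metrics observed would be matters for Ofcom and the provider concerned; the sketch simply shows the shape of the exercise.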

Government Amendments 247A, 250A, 252A, 252B, 252C, 252D, 252E and 252F will ensure that Ofcom has the powers to enable it to direct and observe such tests remotely. This will significantly bolster Ofcom’s ability to assess how a provider’s algorithms work, and therefore to assess its compliance with the duties in the Bill. I understand that certain technology companies have voiced some concerns about these powers, but I reassure your Lordships that they are necessary and proportionate.

The powers will be subject to a number of safeguards. First, they are limited to viewing information. Ofcom will be unable to remotely access or interfere with the service for any other purpose when exercising the power. These tests would be performed offline, meaning that they would not affect the services’ provision or the experience of users. Assessing systems, processes, features and functionalities is the focus of the powers. As such, individual user data and content are unlikely to be the focus of any remote access to view information.

Additionally, the power can be used only where its use is proportionate in the exercise of Ofcom’s functions—for example, when investigating whether a regulated service has complied with relevant safety duties. A provider would have a right to bring a legal challenge against Ofcom if it considered that a particular exercise of the power was unlawful. Furthermore, Ofcom will be under a legal obligation to ensure that the information gathered from services is protected from disclosure, unless clearly defined exemptions apply.

The Bill contains no restriction on services making the existence and detail of the information notice public. Should a regulated service wish to challenge an information notice served to it by Ofcom, it would be able to do so through judicial review. In addition, the amendments place no restriction on the use of this power being made known to members of the public through a request, such as one under the Freedom of Information Act—noting that, under Section 393 of the Communications Act, Ofcom will not be able to disclose information it has obtained through its exercise of these powers without the provider’s consent, unless permitted for specific, defined purposes. These powers are necessary and proportionate and will ensure that Ofcom has the tools to understand features and functionalities and the risks associated with them, and therefore the tools to assess companies’ compliance with the Bill.

Finally, I turn to researchers’ access to data. We recognise the valuable work of researchers in improving our collective understanding of the issues we have debated throughout our scrutiny of the Bill. However, we are also aware that we need to develop the evidence base to ensure that any sharing of sensitive information between companies and researchers can be done safely and securely. To this end, we are pleased to table government Amendments 272B, 272C and 272D.

Government Amendment 272B would require Ofcom to publish its report into researcher access to information within 18 months, rather than two years. This report will provide the evidence base for government Amendments 272C and 272D, which would require Ofcom to publish guidance on this issue. This will provide valuable, evidence-based guidance on how to improve access for researchers safely and securely.

That said, we understand the calls for further action in this area. The Government will explore this issue further and report back to your Lordships’ House on whether further measures to support researchers’ access to data are required—and if so, whether they could be implemented through other legislation, such as the Data Protection and Digital Information Bill. I beg to move.

Lord Allan of Hallam (LD)

My Lords, Amendment 247B in my name was triggered by government Amendment 247A, which the Minister just introduced. I want to explain it, because the government amendment is quite late—it has arrived on Report—so we need to look in some detail at what the Government have proposed. The phrasing that has caused so much concern, which the Minister has acknowledged, is that Ofcom will be able to

“remotely access the service provided by the person”.

It is those words—“remotely access”—which are trigger words for anyone who lived through the Snowden disclosures, where everyone was so concerned about remote access by government agencies to precisely the same services we are talking about today: social media services.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

My Lords, I am grateful to noble Lords for their contributions in this group. On the point made by the noble Lord, Lord Knight of Weymouth, on why we are bringing in some of these powers now, I say that the power to direct and observe algorithms was previously implicit within Ofcom’s information powers and, where a provider has UK premises, under powers of entry, inspection and audit under Schedule 12. However, the Digital Markets, Competition and Consumers Bill, which is set to confer similar powers on the Competition and Markets Authority and its digital markets unit, makes these powers explicit. We wanted to ensure that there was no ambiguity over whether Ofcom had equivalent powers in the light of that. Furthermore, the changes we are making ensure that Ofcom can direct and observe algorithmic assessments even if a provider does not have relevant premises or equipment in the UK.

I am grateful to the noble Lord, Lord Allan of Hallam, for inviting me to re-emphasise points and allay the concerns that have been triggered, as his noble friend Lord Clement-Jones put it. I am happy to set out again a bit of what I said in opening this debate. The powers will be subject to a number of safeguards. First, they are limited to “viewing information”. They can be used only where they are proportionate in the exercise of Ofcom’s functions, and a provider would have the right to bring a legal challenge against Ofcom if it considered that a particular exercise of the power was done unlawfully. Furthermore, Ofcom will be under a legal obligation to ensure that the information gathered from services is protected from disclosure, unless clearly defined exemptions apply.

These are not secret powers, as the noble Lord rightly noted. The Bill contains no restriction on services making the existence and detail of the information notice public. If a regulated service wished to challenge an information notice served to it by Ofcom, it would be able to do so through judicial review. I also mentioned the recourse that people have through existing legislation, such as the Freedom of Information Act, to give them safeguards, noting that, under Section 393 of the Communications Act, Ofcom will not be able to disclose information that it has obtained through its exercise of these powers without the provider’s consent unless that is permitted for specific, defined purposes.

The noble Lord’s Amendment 247B seeks to place further safeguards on Ofcom’s use of its new power to access providers’ systems remotely to observe tests. While I largely agree with the intention behind it, there are already a number of safeguards in place for the use of that power, including in relation to data protection, legally privileged material and the disclosure of information, as I have outlined. Ofcom will not be able to gain remote access simply for exploratory or fishing purposes, and indeed Ofcom expects to have conversations with services about how to provide the information requested.

Furthermore, before exercising the power, Ofcom will be required to issue an information notice specifying the information to be provided, setting out the parameters of access and why Ofcom requires the information, among other things. Following the receipt of an information notice, a notice requiring an inspection or an audit notice, if a company has identified that there is an obvious security risk in Ofcom exercising the power as set out in the notice, it may not be proportionate to do so. As set out in Ofcom’s duties, Ofcom must have regard to the principles under which regulatory activities should be proportionate and targeted only at cases where action is needed.

In line with current practice, we anticipate Ofcom will issue information notice requests in draft form to identify and address any issues, including in relation to security, before the information notice is issued formally. Ofcom will have a legal duty to exercise its remote access powers in a way that is proportionate, ensuring that undue burdens are not placed on businesses. In assessing proportionality in line with this requirement, Ofcom would need to consider the size and resource capacity of a service when choosing the most appropriate way of gathering information, and whether there was a less onerous method of obtaining the necessary information to ensure that the use of this power is proportionate. As I said, the remote access power is limited to “viewing information”. Under this power, Ofcom will be unable to interfere or access the service for any other purpose.

In practice, Ofcom will work with services during the process. It is required to specify, among other things, the information to be provided, which will set the parameters of its access, and why it requires the information, which will explain the link between the information it seeks and the online safety function that it is exercising or deciding whether to exercise.

As noble Lords know, Ofcom must comply with the UK’s data protection law. As we have discussed in relation to other issues, it is required to act compatibly with the European Convention on Human Rights, including Article 8 privacy rights. In addition, under Clause 91(7), Ofcom is explicitly prohibited from requiring the provision of legally privileged information. It will also be under a legal obligation to ensure that the information gathered from services is protected from disclosure unless clearly defined exemptions apply, such as those under Section 393(2) of the Communications Act 2003—for example, the carrying out of any of Ofcom’s functions. I hope that provides reassurance to the noble Lord, Lord Allan, and the noble Baroness, Lady Fox, who raised these questions.

Lord Allan of Hallam (LD)

I am grateful to the Minister. That was helpful, particularly the description of the process and the fact that drafts have to be issued early on. However, it still leaves open a couple of questions, one of which was very helpfully raised by the noble Lord, Lord Knight. We have in Schedule 12 this other set of protections that could be applied. There is a genuine question as to why this has been put in this place and not there.

The second question is to dig a little more into the question of what happens when there is a dispute. The noble Lord, Lord Moylan, pointed out that if you have created a backdoor then you have created a backdoor, and it is dangerous. If we end up in a situation where a company believes that what it is being asked to do by Ofcom is fundamentally problematic and would create a security risk, it will not be good enough to open up the backdoor and then have a judicial review. It needs to be able to say no at that stage, yet the Bill says that it could be committing a serious criminal offence by failing to comply with an information notice. We want some more assurances, in some form, about what would happen in a scenario where a company genuinely and sincerely believes that what Ofcom is asking for is inappropriate and/or dangerous and it wants not to have to offer it unless and until its challenge has been looked at, rather than having to offer it and then later judicially review a decision. The damage would already have been done by opening up an inappropriate backdoor.

Lord Parkinson of Whitley Bay (Con)

A provider would have a right to bring a legal challenge against Ofcom if it considered that a particular exercise of the remote access power was unlawful. Given the serious nature of the issues under consideration, I am sure that would be looked at swiftly, but I will write to the noble Lord on the anticipated timelines while such a judicial review was pending.

Lord Parkinson of Whitley Bay (Con)

I will write on Schedule 12 as well.

Baroness Fox of Buckley (Non-Afl)

Before the Minister sits down, to quote the way the Minister has operated throughout Report, there is consensus across the House that there are some concerns. The reason why there are concerns outside and inside the House on this particular amendment is that it is not entirely clear that those protections exist, and there are worries. I ask the Minister whether, rather than just writing, it would be possible to take this back to the department, table a late amendment and say, “Look again”. That has been done before. It is certainly not too late: if it was not too late to have this amendment then it is certainly not too late to take it away again and to adopt another amendment that gives some safeguarding. Seriously, it is worth looking again.

Lord Parkinson of Whitley Bay (Con)

I had not quite finished; the noble Baroness was quick to catch me before I sat down. I still have some way to go, but I will certainly take on board all the points that have been made on this group.

The noble Lord, Lord Knight, asked about Schedule 12. I will happily write with further information on that, but Schedule 12 is about UK premises, so it is probably not the appropriate place to deal with this, as we need to be able to access services in other countries. If there is a serious security risk then it would not necessarily be proportionate. I will write to him with further details.

Lord Knight of Weymouth (Lab)

I am grateful to the Minister for giving way so quickly. I think the House is asking him to indicate now that he will go away and look at this issue, perhaps with some of us, and that, if necessary, he would be willing to look at coming back with something at Third Reading. From my understanding of the Companion, I think he needs to say words to that effect to allow him to do so, if that is what he subsequently wants to do at Third Reading.

Lord Parkinson of Whitley Bay (Con)

I am very happy to discuss this further with noble Lords, but I will reserve the right, pending that discussion, to decide whether we need to return to this at Third Reading.

Amendments 270 and 272, tabled by my noble friend Lady Fraser of Craigmaddie, to whom I am very grateful for her careful scrutiny of the devolved aspects of the Bill, seek to require Ofcom to include separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in the research about users’ experiences of regulated services and in Ofcom’s transparency reports. While I am sympathetic to her intention—we have corresponded on it, for which I am grateful—it is important that Ofcom has and retains the discretion to prioritise information requests that will best shed light on the experience of users across the UK.

My noble friend and other noble Lords should be reassured that Ofcom has a strong track record of using this discretion to produce data which are representative of people across the whole United Kingdom. Ofcom is committed to reflecting the online experiences of users across the UK and intends, wherever possible, to publish data at a national level. When conducting research, Ofcom seeks to gather views from a representative sample of the United Kingdom and seeks to set quotas that ensure an analysable sample within each of the home nations.

It is also worth noting the provisions in the Communications Act 2003 that require Ofcom to operate offices in each of the nations of the UK, to maintain advisory committees for each, and to ensure their representation on its various boards and panels—and, indeed, on the point raised by the noble Baroness, Lady Kidron, to capture the experiences of children and users of all ages. While we must give Ofcom the discretion it needs to ensure that the framework is flexible and remains future-proofed, I hope that I have reassured my noble friend that her point will indeed be captured, reported on and be able to be scrutinised, not just in this House but across the UK.

Baroness Fraser of Craigmaddie (Con)

I am grateful to the Minister for giving way. My premise is that the reason Ofcom reports in a nation-specific way in broadcasting and in communications is that there is a high-level reference in both the Communications Act 2003 and the BBC charter that requires it to do so, because it feeds down into national quotas and so on. There is currently nothing equivalent in the Online Safety Bill. Therefore, we are relying on Ofcom’s discretion, whereas in the broadcasting and communications area we have a high-level reference insisting that there is a breakdown by nation.

Lord Parkinson of Whitley Bay (Con)

We think we can rely on Ofcom’s discretion, and point to its current practice. I hope that will reassure my noble friend that it will set out the information she seeks.

Lord Parkinson of Whitley Bay (Con)

I was about to say that I am very happy to write to the noble Lord, Lord Stevenson, about the manner by which consent is given in Clause 53(5)(c), but I think his question is on something else.

Lord Stevenson of Balmacara (Lab)

I would be grateful if the Minister could repeat that immediately afterwards, when I will listen much harder.

Just to echo what the noble Baroness was saying, may we take it as an expectation that approaches that are signalled in legislation for broadcasting and communications should apply pari passu to the work of Ofcom in relation to the devolved Administrations?

Lord Parkinson of Whitley Bay (Con)

Yes, and we can point to the current actions of Ofcom to show that it is indeed doing this already, even without that legislative stick.

I turn to the amendments in the name of my noble friend Lord Bethell and the noble Lord, Lord Clement-Jones, on researchers’ access to data. Amendment 237ZA would confer on the Secretary of State a power to make provisions about access to information by researchers. As my noble friend knows, we are sympathetic to the importance of this issue, which is why we have tabled our own amendments in relation to it. However, as my noble friend also knows, in such a complex and sensitive area we think it is premature to endow the Secretary of State with such broad powers to introduce a new framework. As we touched on in Committee, this is a complex and still nascent area, which is why it is different from the other areas to which the noble Lord, Lord Clement-Jones, pointed in his contribution.

Lord Clement-Jones (LD)

The noble Baroness, Lady Harding, made the point that in other areas where the Minister has agreed to reviews or reports, there are backstop powers; for instance, on app stores. Of course, that was a negotiated settlement, so to speak, but why can the Minister not accede to that in the case of access for researchers, as he has with app stores? Indeed, there is one other example that escapes me, which the Minister has also agreed to.

Lord Parkinson of Whitley Bay (Con)

We touched on the complexity of defining who and what is a researcher and making sure that we do not give rise to bad actors exploiting that. This is a complex area, as we touched on in Committee. As I say, the evidence base here is nascent. It is important first to focus on developing our understanding of the issues to ensure that any power or legislation is fit to address those challenges. Ofcom’s report will not only highlight how platforms can share data with researchers safely but will provide the evidence base for considering any future policy approaches, which we have committed to doing but which I think the noble Lord will agree are worthy of further debate and reflection in Parliament.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

The benefit of having a period of time between the last day of Report on Wednesday and Third Reading is that that gives the Minister, the Bill team and parliamentary counsel the time to reflect on the kind of power that could be devised. The wording could be devised, and I would have thought that six weeks would be quite adequate for that, perhaps in a general way. After all, this is not a power that is immediately going to be used; it is a general power that could be brought into effect by regulation. Surely it is not beyond the wit to devise something suitable.

--- Later in debate ---
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

Sit down or stand up—I cannot remember.

I wonder whether the department has looked at the DSA and other situations where this is being worked out. I recognise that it takes a period of time, but it is not without some precedent that a pathway should be described.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

We do not think that six weeks is enough time for the evidence base to develop sufficiently, our assessment being that to endow the Secretary of State with that power at this point is premature.

Amendment 262AA would require Ofcom to consider whether it is appropriate to require providers to take steps to comply with Ofcom’s researcher access guidance when including a requirement to take steps in a confirmation decision. This would be inappropriate because the researcher access provisions are not enforceable requirements; as such, compliance with them should not be subject to enforcement by the regulator. Furthermore, enforcement action may relate to a wide variety of very important issues, and the steps needed should be sufficient to address a failure to comply with an enforceable requirement. Singling out compliance with researcher access guidance alone risks implying that this will be adequate to address core failures.

Amendment 272AB would require Ofcom to give consideration to whether greater access to data could be achieved through legal requirements or incentives for regulated services. I reassure noble Lords that the scope of Ofcom’s report will already cover how greater access to data could be achieved, including through enforceable requirements on providers.

Amendment 272E would require Ofcom to take a provider’s compliance with Ofcom’s guidance on researcher access to data into account when assessing risks from regulated services and determining whether to take enforcement action and what enforcement action to take. However, we do not believe that this is a relevant factor for consideration of these issues. I hope noble Lords will agree that whether or not a company has enabled researcher access to its data should not be a mitigating factor against Ofcom requiring companies to deal with terrorism or child sexual exploitation or abuse content, for example.

On my noble friend Lord Bethell’s remaining Amendments 272BA, 273A and 273B, the first of these would require Ofcom to publish its report on researchers’ access to information within six months. While six months would not be deliverable given other priorities and the complexity of this issue, the government amendment to which I have spoken would reduce the timelines from two years to 18 months. That recognises the importance of the issue while ensuring that Ofcom can deliver the key priorities in establishing the core parts of the regulatory framework; for example, the illegal content and child safety duties.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

Just on the timescale, one of the issues that we talked about in Committee was the fact that there needs to be some kind of mechanism created, with a code of practice with reference to data protection law and an approving body to approve researchers as suitable to take information; the noble Baroness, Lady Kidron, referred to the DSA process, which the European Union has been working on. I hope the Minister can confirm that Ofcom might get moving on establishing that. It is not dependent on there being a report in 18 months; in fact, you need to have it in place when you report in 18 months, which means you need to start building it now. I hope the Minister would want Ofcom, within its existing framework, to be encouraging the creation of that researcher approval body and code of practice, not waiting to start that process in 18 months’ time.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I will continue my train of thought on my noble friend’s amendments, which I hope will cover that and more.

My noble friend’s Amendment 273A would allow Ofcom to appoint approved independent researchers to access information. Again, given the nascent evidence base here, it is important to focus on understanding these issues before we commit to a researcher access framework.

Under the skilled persons provisions, Ofcom will already have the powers to appoint a skilled person to assess compliance with the regulatory framework; that includes the ability to leverage the expertise of independent researchers. My noble friend’s Amendment 273B would require Ofcom to produce a code of practice on access to data by researchers. The government amendments I spoke to earlier will require Ofcom to produce guidance on that issue, which will help to promote information sharing in a safe and secure way.

To the question asked by the noble Lord, Lord Allan: yes, Ofcom can start the process and do it quickly. The question here is really about the timeframe in which it does so. As I said in opening, we understand the calls for further action in this area.

I am happy to say to my noble friend Lord Bethell, to whom we are grateful for his work on this and the conversations we have had, that we will explore the issue further and report back on whether further measures to support researchers’ access to data are required and, if so, whether they can be implemented through other legislation, such as the Data Protection and Digital Information (No.2) Bill.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

Before the Minister sits down—he has been extremely generous in taking interventions—I want to put on record my understanding of his slightly ambiguous response to Amendment 247A, so that he can correct it if I have got it wrong. My understanding is that he has agreed to go away and reflect on the amendment and that he will have discussions with us about it. Only if he then believes that it is helpful to bring forward an amendment at Third Reading will he do so.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Yes, but I do not want to raise the hopes of the noble Lord or others, with whom I look forward to discussing this matter. I must manage their expectations about whether we will bring anything forward. With that, I beg to move.

Amendment 187 agreed.
--- Later in debate ---
Moved by
189: Clause 67, page 64, line 15, leave out from “65(9),” to “and” in line 16 and insert “indicates (in whatever words) that the presence of content of that kind is prohibited on the service or that users’ access to content of that kind is restricted,”
Member’s explanatory statement
This amendment makes a change to the definition of “relevant content” which applies for the purposes of Chapter 3 of Part 4 of the Bill (transparency of terms of service etc). The effect of the change is to cover a wider range of ways in which a term of service might indicate that a certain kind of content is not allowed on the service.
--- Later in debate ---
Moved by
190: After Clause 67, insert the following new Clause—
“CHAPTER 3A
DECEASED CHILD USERS
Disclosure of information about use of service by deceased child users
(1) A provider of a relevant service must make it clear in the terms of service what their policy is about dealing with requests from parents of a deceased child for information about the child’s use of the service.
(2) A provider of a relevant service must have a dedicated helpline or section of the service, or some similar means, by which parents can easily find out what they need to do to obtain information and updates in those circumstances, and the terms of service must provide details.
(3) A provider of a relevant service must include clear and accessible provisions in the terms of service—
(a) specifying the procedure for parents of a deceased child to request information about the child’s use of the service,
(b) specifying what evidence (if any) the provider will require about the parent’s identity or relationship to the child, and
(c) giving sufficient detail to enable child users and their parents to be reasonably certain about what kinds of information would be disclosed and how information would be disclosed.
(4) A provider of a relevant service must respond in a timely manner to requests from parents of a deceased child for information about the child’s use of the service or for updates about the progress of such information requests.
(5) A provider of a relevant service must operate a complaints procedure in relation to the service that—
(a) allows for complaints to be made by parents of a deceased child who consider that the provider is not complying with a duty set out in any of subsections (1) to (4),
(b) provides for appropriate action to be taken by the provider of the service in response to such complaints, and
(c) is easy to access, easy to use and transparent.
(6) A provider of a relevant service must include in the terms of service provisions which are easily accessible specifying the policies and processes that govern the handling and resolution of such complaints.
(7) If a person is the provider of more than one relevant service, the duties set out in this section apply in relation to each such service.
(8) The duties set out in this section extend only to the design, operation and use of a service in the United Kingdom, and references in this section to children are to children in the United Kingdom.
(9) A “relevant service” means—
(a) a Category 1 service (see section 86(10)(a));
(b) a Category 2A service (see section 86(10)(b));
(c) a Category 2B service (see section 86(10)(c)).
(10) In this section “parent”, in relation to a child, includes any person who is not the child’s parent but who—
(a) has parental responsibility for the child within the meaning of section 3 of the Children Act 1989 or Article 6 of the Children (Northern Ireland) Order 1995 (S.I. 1995/755 (N.I. 2)), or
(b) has parental responsibilities in relation to the child within the meaning of section 1(3) of the Children (Scotland) Act 1995.
(11) In the application of this section to a Category 2A service, references to the terms of service include references to a publicly available statement.”
Member’s explanatory statement
This amendment imposes new duties on providers of Category 1, 2A and 2B services to have a policy about disclosing information to the parents of deceased child users, and providing details about it in the terms of service or a publicly available statement.
--- Later in debate ---
Moved by
191: After Clause 67, insert the following new Clause—
“OFCOM’s guidance about duties set out in section (Disclosure of information about use of service by deceased child users)
(1) OFCOM must produce guidance for providers of relevant services to assist them in complying with their duties set out in section (Disclosure of information about use of service by deceased child users).
(2) OFCOM must publish the guidance (and any revised or replacement guidance).
(3) In this section “relevant service” has the meaning given by section (Disclosure of information about use of service by deceased child users).”
Member’s explanatory statement
This amendment requires OFCOM to give guidance to providers about the new duties imposed by the other Clause proposed after Clause 67 in my name.
--- Later in debate ---
Amendment 191A (to Amendment 191) not moved.
--- Later in debate ---
Moved by
192: Schedule 8, page 212, line 26, leave out “and relevant content” and insert “, relevant content and content to which section 12(2) applies”
Member’s explanatory statement
This amendment adds a reference to content to which section 12(2) applies (content to which certain user empowerment duties apply) to paragraph 1 of the transparency reporting Schedule, which allows OFCOM to require providers of user-to-user services to include information in their transparency reports about the incidence of content.
--- Later in debate ---
Moved by
205: Clause 70, page 66, line 42, leave out subsection (2)
Member’s explanatory statement
This amendment is consequential on the amendment to Clause 211 in my name adding a definition of “pornographic content” to that Clause.
--- Later in debate ---
Moved by
210: Clause 72, page 68, line 18, leave out subsection (2) and insert—
“(2) A duty to ensure, by the use of age verification or age estimation (or both), that children are not normally able to encounter content that is regulated provider pornographic content in relation to the service.
(2A) The age verification or age estimation must be of such a kind, and used in such a way, that it is highly effective at correctly determining whether or not a particular user is a child.”
Member’s explanatory statement
This amendment requires providers within scope of Part 5 to use highly effective age verification or age estimation (or both) to comply with the duty in Clause 72(2) (preventing children from encountering provider pornographic content).
--- Later in debate ---
Moved by
215: Clause 73, page 68, line 36, leave out from “of” to end of line 37 and insert “kinds and uses of age verification and age estimation that are, or are not, highly effective at correctly determining whether or not a particular user is a child,”
Member’s explanatory statement
This amendment requires OFCOM’s guidance about the duty in Clause 72(2) to give examples of kinds and uses of age verification and age estimation that are, or are not, highly effective at determining whether or not a user is a child.
--- Later in debate ---
Moved by
216: Clause 73, page 68, line 43, at end insert—
“(2A) The guidance may elaborate on the following principles governing the use of age verification or age estimation for the purpose of compliance with the duty set out in section 72(2)—
(a) the principle that age verification or age estimation should be easy to use;
(b) the principle that age verification or age estimation should work effectively for all users regardless of their characteristics or whether they are members of a certain group;
(c) the principle of interoperability between different kinds of age verification or age estimation.
(2B) The guidance may refer to industry or technical standards for age verification or age estimation (where they exist).”
Member’s explanatory statement
This amendment sets out principles about age verification or age estimation, which are relevant to OFCOM’s guidance to providers about their duty in Clause 72(2).
Amendment 217 (to Amendment 216) not moved.
--- Later in debate ---
Moved by
218B: Clause 158, page 139, line 5, leave out “duty” and insert “duties”
Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted after Clause 149 in my name expanding OFCOM’s duties to promote media literacy in relation to regulated user-to-user and search services.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, the amendments in this group relate to provisions for media literacy in the Bill and Ofcom’s existing duty on media literacy under Section 11 of the Communications Act 2003. I am grateful to noble Lords from across your Lordships’ House for the views they have shared on this matter, which have been invaluable in helping us draft the amendments.

Media literacy remains a key priority in our work to tackle online harms; it is essential not only to keep people safe online but for them to understand how to make informed decisions which enhance their experience of the internet. Extensive work is currently being undertaken in this area. Under Ofcom’s existing duty, the regulator has initiated pilot work to promote media literacy. It is also developing best practice principles for platform-based media literacy measures and has published guidance on how to evaluate media literacy programmes.

While we believe that the Communications Act provides Ofcom with sufficient powers to undertake an ambitious programme of media literacy activity, we have listened to the concerns raised by noble Lords and understand the desire to ensure that Ofcom is given media literacy objectives which are fit for the digital age. We have therefore tabled the following amendments seeking to update Ofcom’s statutory duty to promote media literacy, in so far as it relates to regulated services.

Amendment 274B provides new objectives for Ofcom to meet in discharging its duty. The first objective requires Ofcom to take steps to increase the public’s awareness and understanding of how they can keep themselves and others safe when using regulated services, including building the public’s understanding of the nature and impact of harmful content online, such as disinformation and misinformation. To meet that objective, Ofcom will need to carry out, commission or encourage the delivery of activities and initiatives which enhance users’ media literacy in these ways.

It is important to note that, when fulfilling this new objective, Ofcom will need to increase the public’s awareness of the ways in which they can protect groups that disproportionately face harm online, such as women and girls. The updated duty will also compel Ofcom to encourage the development and use of technologies and systems that support users of regulated services to protect themselves and others. Ofcom will be required to publish a statement recommending ways in which others, including platforms, can take action to support their users’ media literacy.

Amendment 274C places a new requirement on Ofcom to publish a strategy setting out how it will fulfil its media literacy functions under Section 11, including the new objectives. Ofcom will be required to update this strategy every three years and report on progress made against it annually to provide assurance that it is fulfilling its duty appropriately. These reports will be supported by the post-implementation review of the Bill, which covers Ofcom’s media literacy duty in so far as it relates to regulated services. This will provide a reasonable point at which to establish the impact of Ofcom’s work, having given it time to take effect.

I am confident that, through this updated duty, Ofcom will be empowered to ensure that internet users become more engaged with media literacy and, as a result, are safer online. I hope that these amendments will find support from across your Lordships’ House, and I beg to move.

Baroness Bull Portrait Baroness Bull (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I welcome this proposed new clause on media literacy and support the amendments in the names of the noble Lords, Lord Clement-Jones and Lord Knight of Weymouth. I will briefly press the Minister on two points. First, proposed new subsection (1C) sets out how Ofcom must perform its duty under proposed new subsection (1A), but it does not explicitly require Ofcom to work in partnership with existing bodies already engaged in and expert in provision of these kinds of activities. The potential for Ofcom to commission is explicit, but this implies quite a top-down relationship, not a collaboration that builds on best practice, enables scale-up where appropriate and generally avoids reinventing wheels. It seems like a wasted opportunity to fast-track delivery of effective programmes through partnership.

My second concern is that there is no explicit requirement to consider the distinct needs of specific user communities. In particular, I share the concerns of disability campaigners and charities that media literacy activities and initiatives need to take into account the needs of people with learning disabilities, autism and mental capacity issues, both in how activities are shaped and in how they are communicated. This is a group of people who have a great need to go online and engage, but we also know that they are at greater risk online. Thinking about how media literacy can be promoted, particularly among learning disability communities, is really important.

The Minister might respond by saying that Ofcom is already covered by the public sector equality duty and so is already obliged to consider the needs of people with protected characteristics when designing and implementing policies. But the unfortunate truth is that the concerns of the learning disability community are an afterthought in legislation compared with other disabilities, which are already an afterthought. The Petitions Committee in the other place, in its report on online abuse and the experience of disabled people, noted that there are multiple disabled people around the country with the skills and experience to advise government and its bodies but that there is a general unwillingness to engage directly with them. They are often described as hard to reach, which is kind of ironic because in fact most of these people use multiple services and so are very easy to reach, because they are on lots of databases and in contact with government bodies all the time.

The Minister may also point out that Ofcom’s duties in the Communications Act require it to maintain an advisory committee on elderly and disabled persons that includes

“persons who are familiar with the needs of persons with disabilities”.

But referring to an advisory committee is not the same as consulting people with disabilities, both physical and mental, and it is especially important to consult directly with people who may have difficulty understanding what is being proposed. Talking to people directly, rather than through an advisory committee, is very much the goal.

Unlike the draft Bill, which had media literacy as a stand-alone clause, the intention in this iteration is to deal with the issue by amending the Communications Act. It may be that in the web of interactions between those two pieces of legislation, my concerns can be set to rest. But I would find it very helpful if the Minister could confirm today that the intention is that media literacy programmes will be developed in partnership with—and build on best practice of—those organisations already delivering in this space and that the organisations Ofcom collaborates with will be fully inclusive of all communities, including those with disabilities and learning disabilities. Only in this way can we be confident that media literacy programmes will meet their needs effectively, both in content and in how they are communicated.

Finally, can the Minister confirm whether Ofcom considers people with lived experience of disability as subject matter experts on disability for the purpose of fulfilling its consultation duties? I asked this question during one of the helpful briefing sessions during the Bill’s progress earlier this year, but I did not get an adequate answer. Can the Minister clarify that for the House today?

--- Later in debate ---
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, the Government have moved on this issue, and I very much welcome that. I am grateful to the Minister for listening and for the fact that we now have Section 11 of the Communications Act being brought into the digital age through the Government’s Amendments 274B and 274C. The public can now expect to be informed and educated about content-related harms, reliability and accuracy; technology companies will have to play their part; and Ofcom will have to regularly report on progress, and will commission and partner with others to fulfil those duties. That is great progress.

The importance of this was underscored at a meeting of the United Nations Human Rights Council just two weeks ago. Nada Al-Nashif, the UN Deputy High Commissioner for Human Rights, said in an opening statement that media and digital literacy empowered individuals and

“should be considered an integral part of education efforts”.

Tawfik Jelassi, the assistant director-general of UNESCO, in a statement attached to that meeting, said that

“media and information literacy was essential for individuals to exercise their right to freedom of opinion and expression”—

I put that in to please the noble Baroness, Lady Fox—and

“enabled access to diverse information, cultivated critical thinking, facilitated active engagement in public discourse, combatted misinformation, and safeguarded privacy and security, while respecting the rights of others”.

If only the noble Lord, Lord Moylan, was in his place to hear me use the word privacy. He continued:

“Together, the international community could ensure that media and information literacy became an integral part of everyone’s lives, empowering all to think critically, promote digital well-being, and foster a more inclusive and responsible global digital community”.


I thought those were great words, summarising why we needed to do this.

I am grateful to Members on all sides of the House for the work that they have done on media literacy. Part of my reason for repeating those remarks is that this is so much more about empowerment than it is about loading safety on to individuals, as the noble Baroness, Lady Kidron, rightly said in her comments.

Nevertheless, we want the Minister to reflect on a couple of tweaks. Amendment 269C in my name would require the advisory committee to be set up within six months and, in its first report, to assess the need for a code on misinformation. I have a concern that, as the regime that we are putting in place with this Bill comes into force and causes some of the harmful content that people find engaging to be suppressed, the algorithms will go to something else that is engaging, and that something else is likely to be misinformation and disinformation. I have a fear that that will become a growing problem that the regulator will need to be able to address, which is why it should be looking at this early.

Incidentally, that is why the regulator should also look at provenance, as in Amendment 269AA from the noble Lord, Lord Clement-Jones. It was tempting in listening to him to see whether there was an AI tool that could trawl across all the comments that he has made during the deliberations on this Bill to see whether he has quoted the whole of the joint report—but that is a distraction.

My Amendment 269D goes to the need for media literacy on systems, processes and business models, not just on content. Time and again, we have emphasised the need for this Bill to be as much about systems as content. There are contexts where individual, relatively benign pieces of content can magnify if part of a torrent that then creates harm. The Mental Health Foundation has written to many of us to make this point. In the same way that the noble Baroness, Lady Bull, asked about ensuring that those with disability have their own authentic voice heard as these media literacy responsibilities are played out, so the Mental Health Foundation wanted the same kind of involvement from young people; I agree with both. Please can we have some reassurance that this will be very much part of the literacy duties on Ofcom and the obligations it places on service providers?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, I am grateful to noble Lords for their comments, and for the recognition from the noble Lord, Lord Knight, of the changes that we have made. I am particularly grateful to him for having raised media literacy throughout our scrutiny of this Bill.

His Amendments 269C and 269D seek to set a date by which the establishment of the advisory committee on misinformation and disinformation must take place and to set requirements for its first report. Ofcom recognises the valuable role that the committee will play in providing advice in relation to its duties on misinformation and disinformation, and has assured us that it will aim to establish the committee as soon as is reasonably possible, in recognition of the threats posed by misinformation and disinformation online.

Given the valuable role of the advisory committee, Ofcom has stressed how crucial it will be to have appropriate time to appoint the best possible committee. Seeking to prescribe a timeframe for its implementation risks impeding Ofcom’s ability to run the thorough and transparent recruitment process that I am sure all noble Lords want and to appoint the most appropriate and expert members. It would also not be appropriate for the Bill to be overly prescriptive on the role of the committee, including with regard to its first report, in order for it to maintain the requisite independence and flexibility to give us the advice that we want.

Amendment 269AA from the noble Lord, Lord Clement-Jones, seeks to add advice on content provenance to the duties of the advisory committee. The new media literacy amendments, which update Ofcom’s media literacy duties, already include a requirement for Ofcom to take steps to help users establish the reliability, accuracy and authenticity of content found on regulated services. Ofcom will have duties and mechanisms to be able to advise platforms on how they can help users to understand whether content is authentic; for example, by promoting tools that assist them to establish the provenance of content, where appropriate. The new media literacy duties will require Ofcom to take tangible steps to prioritise the public’s awareness of and resilience to misinformation and disinformation online. That may include enabling users to establish the reliability, accuracy and authenticity of content, but the new duties will not remove content online; I am happy to reassure the noble Baroness, Lady Fox, on that.

The advisory committee is already required under Clause 141(4)(c) to advise Ofcom on its exercise of its media literacy functions, including its new duties relating to content authenticity. The Bill does not stipulate what tools service providers should use to fulfil their duties, but Ofcom will have the ability to recommend in its codes of practice that companies use tools such as provenance technologies to identify manipulated media which constitute illegal content or content that is harmful to children, where appropriate. Ofcom is also required to take steps to encourage the development and use of technologies that provide users with further context about content that they encounter online. That could include technologies that support users to establish content provenance. I am happy to reassure the noble Lord, Lord Clement-Jones, that the advisory committee will already be required to advise on the issues that he has raised in his amendment.

On media literacy more broadly, Ofcom retains its overall statutory duty to promote media literacy, which remains broad and non-prescriptive. The new duties in this Bill, however, are focused specifically on harm; that is because of the nature of the Bill, which seeks to make the UK the safest place in the world to be online and is necessarily focused on tackling harms. To ensure that Ofcom succeeds in the delivery of these new specific duties with regard to regulated services, it is necessary that the regulator has a clearly defined scope. Broadening the duties would risk overburdening Ofcom by making its priorities less clear.

The noble Baroness, Lady Bull—who has been translated to the Woolsack while we have been debating this group—raised media literacy for more vulnerable users. Under Ofcom’s existing media literacy programme, it is already delivering initiatives to support a range of users, including those who are more vulnerable online, such as people with special educational needs and people with disabilities. I am happy to reassure her that, in delivering this work, Ofcom is already working not just with expert groups including Mencap but with people with direct personal experiences of living with disabilities.

The noble Lord, Lord Clement-Jones, raised Ofsted. Effective regulatory co-ordination is essential for addressing the crosscutting opportunities and challenges posed by digital technologies and services. Ofsted will continue to engage with Ofcom through its existing mechanisms, including engagement led by its independent policy team and those held with Ofcom’s online safety policy director. In addition to that, Ofsted is considering mechanisms through which it can work more closely with Ofcom where appropriate. These include sharing insights from inspections in an anonymised form, which could entail reviews of its inspection bases and focus groups with inspectors, on areas of particular concern to Ofcom. Ofsted is committed to working with Ofcom’s policy teams to work these plans up in more detail.

Lord McNally Portrait Lord McNally (LD)
- Hansard - - - Excerpts

My Lords, could I ask the Minister a question? He has put his finger on one of the most important aspects of this Bill: how it will integrate with the Department for Education and all its responsibilities for schools. Again, talking from long experience, one of the worries is the silo mentality in Whitehall, which is quite often strongest in the Department for Education. Some real effort will be needed to make sure there is a crossover from the powers that Ofcom has to what happens in the classroom.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I hope what I have said about the way that Ofsted and Ofcom are working together gives the noble Lord some reassurance. He is right, and it is not just in relation to the Department for Education. In my own department, we have discussed in previous debates on media literacy the importance of critical thinking, equipping people with the sceptical, quizzical, analytic skills they need—which art, history and English literature do as well. The provisions in this Bill focus on reducing harm because the Bill is focused on making the UK the safest place to be online, but he is right that media literacy work more broadly touches on a number of government departments.

Amendment 274BA would require Ofcom to promote an understanding of how regulated services’ business models operate, how they use personal data and the operation of their algorithmic systems and processes. We believe that Ofcom’s existing duty under the Communications Act already ensures that the regulator can cover these aspects in its media literacy activities. The duty requires Ofcom to build public awareness of the processes by which material on regulated services is selected or made available. This enables Ofcom to address the platform features specified in this amendment.

The Government’s amendments include extensive new objectives for Ofcom, which apply to harmful ways in which a service is used as well as harmful content. We believe it important not to add further to this duty when the outcomes can already be achieved through the existing duty. We do not wish to limit, by implication, Ofcom’s media literacy duties in relation to other, non-regulated services.

We also judge that the noble Lord’s amendment carries a risk of confusing the remits of Ofcom and the Information Commissioner’s Office. UK data protection law already confers a right for people to be informed about how their personal data are being used, making this aspect of the amendment superfluous.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

I do not believe that the Minister has dealt with the minimum standards issue.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I do not think that the noble Lord was listening to that point, but I did address it.

Amendment 218B agreed.
--- Later in debate ---
The noble Lord, Lord Moylan, made a very good point in our last session. When I try to assess this, I understand that the Secretary of State is elected and that Ofcom is an unelected regulator, so in many ways it is more democratic that the Secretary of State should be openly politicised, but I am concerned that in this instance the Secretary of State will force the unelected Ofcom to do something that the Government will not do directly but will do behind the scenes. That is the danger. We will not even be able to see it correctly and it will emerge to the public as “media literacy” or something of that nature. That will obfuscate accountability even further. I have a lot of sympathy for the amendment to leave out this clause.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, I am grateful for the opportunity to set out the need for Clauses 158 and 159. The amendments in this group consider the role of government in two specific areas: the power for the Secretary of State to direct Ofcom about its media literacy functions in special circumstances and the power for the Secretary of State to issue non-binding guidance to Ofcom. I will take each in turn.

Amendment 219 relates to Clause 158, on the Secretary of State’s power to direct Ofcom in special circumstances. These include where there is a significant threat to public safety, public health or national security. This is a limited power to enable the Secretary of State to set specific objectives for Ofcom’s media literacy activity in such circumstances. It allows the Secretary of State to direct Ofcom to issue public statement notices to regulated service providers, requiring providers to set out the steps they are taking to address the threat. The regulator and online platforms are thereby compelled to take essential and transparent actions to keep the public sufficiently informed during crises. The powers ensure that the regulatory framework is future-proofed and well equipped to respond in such circumstances.

As the noble Lord, Lord Clement-Jones, outlined, I corresponded with him very shortly before today’s debate and am happy to set out a bit more detail for the benefit of the rest of the House. As I said to him by email, we expect the media literacy powers to be used only in exceptional circumstances, where it is right that the Secretary of State should have the power to direct Ofcom. The Government see the need for an agile response to risk in times of acute crisis, such as we saw during the Covid-19 pandemic or in relation to the war in Ukraine. There may be a situation in which the Government have access to information, through the work of the security services or otherwise, which Ofcom does not. This power enables the Secretary of State to make quick decisions when the public are at risk.

Our expectation is that, in exceptional circumstances, Ofcom would already be taking steps to address harm arising from the provision of regulated services through its existing media literacy functions. However, these powers will allow the Secretary of State to step in if necessary to ensure that the regulator is responding effectively to these sudden threats. It is important to note that, for transparency, the Secretary of State will be required to publish the reasons for issuing a direction to Ofcom in these circumstances. This requirement does not apply should the circumstances relate to national security, to protect sensitive information.

The noble Lord asked why we have the powers under Clause 158 when they do not exist in relation to broadcast media. We believe that these powers are needed with respect to social media because, as we have seen during international crises such as the Covid-19 pandemic, social media platforms can sadly serve as hubs for low-quality, user-generated information that is not required to meet journalistic standards, and that can pose a direct threat to public health. By contrast, Ofcom’s Broadcasting Code ensures that broadcast news, in whatever form, is reported with due accuracy and presented with due impartiality. Ofcom can fine, or ultimately revoke a licence to broadcast in the most extreme cases, if that code is breached. This means that regulated broadcasters can be trusted to strive to communicate credible, authoritative information to their audiences in a way that social media cannot.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

We established in our last debate that the notion of a recognised news publisher will go much broader than a broadcaster. I put it to the Minister that we could end up in an interesting situation where one bit of the Bill says, “You have to protect content from these people because they are recognised news publishers”. Another bit, however, will be a direction to the Secretary of State saying that, to deal with this crisis, we are going to give a media literacy direction that says, “Please get rid of all the content from this same news publisher”. That is an anomaly that we risk setting up with these different provisions.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

On the previous group, I raised the issue of legal speech that was labelled as misinformation and removed in the extreme situation of a public health panic. This was seemingly because the Government were keen that particular public health information was made available. Subsequently, we discovered that those things were not necessarily untrue and should not have been removed. Is the Minister arguing that this power is necessary for the Government to direct that certain things are removed on the basis that they are misinformation—in which case, that is a direct attempt at censorship? After we have had a public health emergency in which “facts” have been contested and shown to not be as black and white or true as the Government claimed, saying that the power will be used only in extreme circumstances does not fill me with great confidence.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I am happy to make it clear, as I did on the last group, that the power does not allow Ofcom to require platforms to remove content; it allows Ofcom only to require platforms to set out what they are doing in response to misinformation and disinformation—to make a public statement about what they are doing to tackle it. In relation to regulating news providers, we have brought the further amendments forward to ensure that those subject to sanctions cannot avail themselves of the special provisions in the Bill. Of course, the Secretary of State will be mindful of the law when issuing directions in the exceptional circumstances that these clauses set out.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

While the Minister is describing that, can he explain exactly which media literacy power would be invoked by the kind of example I gave when I was introducing the amendment and in the circumstances he has talked about? Would he like to refer to the Communications Act?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

It depends on the circumstances. I do not want to give one example for fear of being unnecessarily restrictive. In relation to the health misinformation and disinformation we saw during the pandemic, an example would be the suggestions of injecting oneself with bleach; that sort of unregulated and unhelpful advice is what we have in mind. I will write to the noble Lord, if he wants, to see what provisions of the Communications Act we would want invoked in those circumstances.

In relation to Clause 159, which is dealt with by Amendment 222, it is worth setting out that the Secretary of State guidance and the statement of strategic priorities have distinct purposes and associated requirements. The purpose of the statement of strategic priorities is to enable the Secretary of State to specifically set out priorities in relation to online safety. For example, in the future, it may be that changes in the online experience mean that the Government of the day wish to set out their high-level overarching priorities. In comparison, the guidance allows for clarification of what Parliament and Government intended in passing this legislation—as I hope we will—by providing guidance on specific elements of the Bill in relation to Ofcom’s functions. There are no plans to issue guidance under this power but, for example, we are required to issue guidance to Ofcom in relation to the fee regime.

On the respective requirements, the statement of strategic priorities requires Ofcom to explain in writing what it proposes to do in consequence of the statement and publish an annual review of what it has done. Whereas Ofcom must “have regard” to the guidance, the guidance itself does not create any statutory requirements.

This is a new regime and is different in its nature from other established areas of regulations, such as broadcasting. The power in Clause 159 provides a mechanism to provide more certainty, if that is considered necessary, about how the Secretary of State expects Ofcom to carry out its statutory functions. Ofcom will be consulted before guidance is issued, and there are checks on how often it can be issued and revised. The guidance document itself, as I said, does not create any statutory requirements, so Ofcom is required only to “have regard” to it.

This will be an open and transparent way to put forward guidance appropriately with safeguards in place. The independence of the regulator is not at stake here. The clause includes significant limitations on the power, and the guidance cannot fetter Ofcom’s operational independence. We feel that both clauses are appropriate for inclusion in the Bill, so I hope that the noble Lord will withdraw his amendment.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I thank the Minister for that more extended reply. It is a more reassuring response on Clause 159 than we have had before. On Clause 158, the impression I get is that the media literacy power is being used as a smokescreen for the Government telling social media what it should do, indirectly via Ofcom. That seems extraordinary. If the Government were telling the mainstream media what to do in circumstances like this, we would all be up in arms. However, it seems to be accepted as a part of the Bill and that we should trust the Government. The Minister used the phrase “special circumstances”. That is not the phraseology in the clause; it is that “circumstances exist”, and then it goes on to talk about national security and public health. The bar is very low.

I am sure everyone is getting hungry at this time of day, so I will not continue. However, we still have grave doubts about this clause. It seems an extraordinary indirect form of censorship which I hope is never invoked. In the meantime, I beg leave to withdraw my amendment.

--- Later in debate ---
Moved by
224: Clause 161, page 140, line 27, leave out “or 3” and insert “, 3 or 3A”
Member’s explanatory statement
Clause 161 is about a review by the Secretary of State of the regulatory framework established by this Bill. This amendment inserts a reference to Chapter 3A, which is the new Chapter containing the new duties imposed by the Clause proposed after Clause 67 in my name.
--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, clearly, there is a limited number of speakers in this debate. We should thank the noble Lord, Lord Moylan, for tabling this amendment because it raises a very interesting point about the transparency—or not—of the Counter Disinformation Unit. Of course, it is subject to an Oral Question tomorrow as well, which I am sure the noble Viscount will be answering.

There is some concern about the transparency of the activities of the Counter Disinformation Unit. In its report, Ministry of Truth, which deals at some length with the activities of the Counter Disinformation Unit, Big Brother Watch says:

“Giving officials an unaccountable hotline to flag lawful speech for removal from the digital public square is a worrying threat to free speech”.


Its complaint is not only about oversight; it is about the activities. Others such as Full Fact have stressed the fact that there is little or no parliamentary scrutiny. For instance, freedom of information requests have been turned down and Written Questions which try to probe what the activities of the Counter Disinformation Unit are have had very little response. As it says, when the Government

“lobby internet companies about content on their platforms … this is a threat to freedom of expression”.

We need proper oversight, so I am interested to hear the Minister’s response.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, the Government share the view of my noble friend Lord Moylan about the importance of transparency in protecting freedom of expression. I reassure him and other noble Lords that these principles are central to the Government’s operational response to addressing harmful disinformation and attempts artificially to manipulate our information environment.

My noble friend and others made reference to the operational work of the Counter Disinformation Unit, which is not, as the noble Baroness, Lady Fox, said, the responsibility of my department but of the Department for Science, Innovation and Technology. The Government have always been transparent about the work of the unit; for example, recently publishing a factsheet on GOV.UK which sets out, among other things, how the unit works with social media companies.

I reassure my noble friend that there are existing processes governing government engagements with external parties and emphasise to him that the regulatory framework that will be introduced by the Bill serves to increase transparency and accountability in a way that I hope reassures him. Many teams across government regularly meet industry representatives on a variety of issues from farming and food to telecoms and digital infrastructure. These meetings are conducted within well-established transparency processes and frameworks, which apply in exactly the same way to government meetings with social media companies. The Government have been open about the fact that the Counter Disinformation Unit meets social media companies. Indeed, it would be surprising if it did not. For example, at the beginning of the Russian invasion of Ukraine, the Government worked with social media companies in relation to narratives which were being circulated attempting to deny incidents leading to mass casualties, and to encourage the promotion of authoritative sources of information. That work constituted routine meetings and was necessary in confirming the Government’s confidence in the preparedness and ability of platforms to respond to new misinformation and disinformation threats.

To require additional reporting on a sector-by-sector or department-by-department basis beyond the standardised transparency processes, as proposed in my noble friend’s amendment, would be a disproportionate and unnecessary response to what is routine engagement in an area where the Government have no greater powers or influence than in others. They cannot compel companies to alter their terms of service; nor can or do they seek to mandate any action on specific pieces of content.

I reassure the noble Baroness, Lady Fox, that the Counter Disinformation Unit does not monitor individual people, nor has it ever done so; rather, it tracks narratives and trends using publicly available information online to protect public health, public safety and national security. It has never tracked the activity of individuals, and there is a blanket ban on referring any content from journalists or parliamentarians to social media platforms. The Government have always been clear that the Counter Disinformation Unit refers content for consideration only where an assessment has been made that it is likely to breach the platform’s own terms of service. It has no role in deciding what action, if any, to take in response, which is entirely a matter for the platform concerned.

As I said, the Bill will introduce new transparency, accountability and freedom of expression duties for category 1 services which will make the process for any removal or restriction of user-generated content more transparent by requiring category 1 services to set terms of service which are clear, easy for users to understand and consistently enforced. Category 1 services will be prohibited from removing or restricting user-generated content or suspending or banning users where this does not align with those terms of service. Any referrals from government will not, and indeed cannot, supersede these duties in the Bill.

Although I know it will disappoint my noble friend that another of his amendments has not been accepted, I hope I have been able to reassure him about the Government’s role in these processes. As the noble Lord, Lord Clement-Jones, noted, my noble friend Lord Camrose is answering a Question on this in your Lordships’ House tomorrow, further underlining the openness and parliamentary accountability with which we go about this work. I hope my noble friend will, in a similarly post-prandial mood of generosity, suppress his disappointment and feel able to withdraw his amendment.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

Before the Minister sits down, I think that it is entirely appropriate for him to say—I have heard it before—“Oh no, nothing was taken down. None of this is believable. No individuals were targeted”. However, that is not the evidence I have seen, and it might well be that I have been shown misinformation. But that is why the Minister has to acknowledge that one of the problems here is that indicated by Full Fact—which, as we know, is often endorsed by government Ministers as fact-checkers. It says that because the Government are avoiding any scrutiny for this unit, it cannot know. It becomes a “he said, she said” situation. I am afraid that, because of the broader context, it would make the Minister’s life easier, and be clearer to the public—who are, after all, worried about this—if he accepted the ideas in the amendment of the noble Lord, Lord Moylan. We would then be clear and it would be out in the open. If the FOIs and so on that have been constantly put forward were answered, would that not clear it up?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I have addressed the points made by the noble Baroness and my noble friend already. She asks the same question again and I can give her the same answer. We are operating openly and transparently here, and the Bill sets out further provisions for transparency and accountability.

Lord Moylan Portrait Lord Moylan (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I see what my noble friend did there, and it was very cunning. He gave us a very worthwhile account of the activities of the Counter Disinformation Unit, a body I had not mentioned at all, as if the Counter Disinformation Unit was the sole locus of this sort of activity. I had not restricted it to that. We know, in fact, that other bodies within government have been involved in undertaking this sort of activity, and on those he has given us no answer at all, because he preferred to answer about one particular unit. He referred also to its standardised transparency processes. I can hardly believe that I am reading out words such as those. The standardised transparency process allows us all to know that encounters take place but still refuses to let us know what actually happens in any particular encounter, even though there is a great public interest in doing so. However, I will not press it any further.

My noble friend, who is genuinely a friend, is in danger of putting himself, at the behest of civil servants and his ministerial colleagues, in some danger. We know what happens in these cases. The Minister stands at the Dispatch Box and says “This has never happened; it never normally happens; it will not happen. Individuals are never spoken of, and actions of this character are never taken”. Then of course, a few weeks or months later, out pour the leaked emails showing that all these things have been happening all the time. The Minister then has to resign in disgrace and it is all very sad. His friends, like myself, rally round and buy him a drink, before we never see him again.

Anyway, I think my noble friend must be very careful that he does not put himself in that position. I think he has come close to doing so this evening, through the assurances he has given your Lordships’ House. Although I do not accept those assurances, I will none the less withdraw the amendment, with the leave of the House.

--- Later in debate ---
Moved by
227: Clause 173, page 150, line 23, at end insert “or
(c) an assessment required to be carried out by section (Assessment duties: user empowerment),”
Member’s explanatory statement
This amendment ensures that Clause 173, which is about the approach to be taken by providers to judgements about the status of content, applies to assessments under the new Clause proposed after Clause 11 in my name.
--- Later in debate ---
Lord Stevenson of Balmacara (Lab)

My Lords, this has been a good debate. It is very hard to see where one would want to take it. If it proves anything, it is that the decision to drop the legal but harmful provisions in the Bill was probably taken for the wrong reasons but was the right decision, since this is where we end up—in an impossible moral quandary which no amount of writing, legalistic or otherwise, will get us out of. This should be a systems Bill, not a content Bill.

Lord Parkinson of Whitley Bay (Con)

My Lords, I start by saying that accurate systems and processes for content moderation are crucial to the workability of this Bill and keeping users safe from harm. Amendment 228 from the noble Lord, Lord Allan of Hallam, seeks to remove the requirement for platforms to treat content as illegal or fraudulent content if reasonable grounds for that inference exist. The noble Lord set out his concerns about platforms over-removing content when assessing illegality.

Under Clause 173(5), platforms will need to have reasonable grounds to determine whether content is illegal or a fraudulent advertisement. Only when a provider has reasonable grounds to infer that said content is illegal or a fraudulent advertisement must it then comply with the relevant requirements set out in the Bill. This would mean removing the content or preventing people from encountering it through risk-based and proportionate systems and processes.

--- Later in debate ---
Moved by
230: After Clause 174, insert the following new Clause—
“Time for publishing first guidance under certain provisions of this Act
(1) OFCOM must publish guidance to which this section applies within the period of 18 months beginning with the day on which this Act is passed.
(2) This section applies to—
(a) the first guidance under section 47(2)(a) (record-keeping and review);
(b) the first guidance under section 47(2)(b) (children’s access assessments);
(c) the first guidance under section 48(1) (content harmful to children);
(d) the first guidance under section 73 (provider pornographic content);
(e) the first guidance under section 90(1) (illegal content risk assessments under section 8);
(f) the first guidance under section 90(2) (illegal content risk assessments under section 22);
(g) the first guidance under section 90(3) (children’s risk assessments);
(h) the first guidance under section 140 (enforcement);
(i) the first guidance under section 174 relating to illegal content judgements within the meaning of subsection (2)(a) of that section (illegal content and fraudulent advertisements).
(3) If OFCOM consider that it is necessary to extend the period mentioned in subsection (1) in relation to guidance mentioned in any of paragraphs (a) to (i) of subsection (2), OFCOM may extend the period in relation to that guidance by up to 12 months by making and publishing a statement. But this is subject to subsection (6).
(4) A statement under subsection (3) must set out—
(a) the reasons why OFCOM consider that it is necessary to extend the period mentioned in subsection (1) in relation to the guidance concerned, and
(b) the period of extension.
(5) A statement under subsection (3) may be published at the same time as (or incorporate) a statement under section 38(12) (extension of time to prepare certain codes of practice).
(6) But a statement under subsection (3) may not be made in relation to guidance mentioned in a particular paragraph of subsection (2) if—
(a) a statement has previously been made under subsection (3) (whether in relation to guidance mentioned in the same or a different paragraph of subsection (2)), or
(b) a statement has previously been made under section 38(12).”
Member’s explanatory statement
This amendment provides that OFCOM must prepare the first guidance under certain provisions of the Bill within 18 months of Royal Assent, unless they consider a longer period to be necessary in which case OFCOM may (on one occasion only) extend the period and set out why in a published statement.
--- Later in debate ---
Moved by
231: Clause 176, page 152, line 33, at end insert—
“(ga) Chapter 3A of Part 4 (deceased child users);”
Member’s explanatory statement
Clause 176 is about liability of providers who are individuals. This amendment inserts a reference to Chapter 3A, which is the new Chapter containing the new duties imposed by the Clause proposed after Clause 67 in my name, so that individuals may be jointly and severally liable for the duties imposed by that clause.
--- Later in debate ---
Moved by
231A: Clause 179, page 154, line 8, leave out “is” and insert “has been”
Member’s explanatory statement
This amendment is a minor change to ensure consistency of tenses.
--- Later in debate ---
Moved by
Lord Parkinson of Whitley Bay

232: Schedule 17, page 247, line 35, at end insert—


“(ba) section (Assessment duties: user empowerment) (assessments related to the adult user empowerment duty set out in section 12(2)), and”

Member’s explanatory statement


This amendment ensures that, during the transitional period when video-sharing platform services continue to be regulated by Part 4B of the Communications Act 2003, providers of such services are not exempt from the new duty in the new clause proposed after Clause 11 in my name to carry out assessments for the purposes of the user empowerment duties in Clause 12(2).

233: Schedule 17, page 247, line 36, leave out “and (9) (records of risk assessments)” and insert “, (8A) and (9) (records of assessments)”
Member’s explanatory statement
This amendment ensures that, during the transitional period when video-sharing platform services continue to be regulated by Part 4B of the Communications Act 2003, providers of such services are not exempt from the new duty inserted in Clause 19 (see the amendments of that Clause proposed in my name) to keep records of the new assessments.
234: Schedule 17, page 248, line 20, at end insert—
“(ea) the duties set out in section (Disclosure of information about use of service by deceased child users) (deceased child users);”
Member’s explanatory statement
This amendment ensures that services already regulated under Part 4B of the Communications Act 2003 (video-sharing platform services) are not required to comply with the new duties imposed by the clause proposed after Clause 67 in my name during the transitional period.
--- Later in debate ---
Moved by
Lord Parkinson of Whitley Bay

236A: After Clause 194, insert the following new Clause—


“Power to regulate app stores


(1) Subject to the following provisions of this section and section (Power to regulate app stores: supplementary), the Secretary of State may by regulations amend any provision of this Act to make provision for or in connection with the regulation of internet services that are app stores.

(2) Regulations under this section may not be made before OFCOM have published a report under section (OFCOM’s report about use of app stores by children)(report about use of app stores by children).

(3) Regulations under this section may be made only if the Secretary of State, having considered that report, considers that there is a material risk of significant harm to an appreciable number of children presented by either of the following, or by both taken together—

(a) harmful content present on app stores, or

(b) harmful content encountered by means of regulated apps available in app stores.

(4) Before making regulations under this section the Secretary of State must consult—

(a) persons who appear to the Secretary of State to represent providers of app stores,

(b) persons who appear to the Secretary of State to represent the interests of children (generally or with particular reference to online safety matters),

(c) OFCOM,

(d) the Information Commissioner,

(e) the Children’s Commissioner, and

(f) such other persons as the Secretary of State considers appropriate.

(5) In this section and in section (Power to regulate app stores: supplementary)—

“amend” includes repeal and apply (with or without modifications);

“app” includes an app for use on any kind of device, and “app store” is to be read accordingly;

“content that is harmful to children” has the same meaning as in Part 3 (see section 54);

“harmful content” means—

(a) content that is harmful to children,

(b) search content that is harmful to children, and

(c) regulated provider pornographic content;

“regulated app” means an app for a regulated service;

“regulated provider pornographic content” has the same meaning as in Part 5 (see section 70);

“search content” has the same meaning as in Part 3 (see section 51).

(6) In this section and in section (Power to regulate app stores: supplementary) references to children are to children in the United Kingdom.”

Member’s explanatory statement


This amendment provides that the Secretary of State may make regulations amending this Bill so as to bring app stores within its scope. The regulations may not be made until OFCOM have published their report about the use of app stores by children (see the new Clause proposed to be inserted after Clause 147 in my name).

Lord Parkinson of Whitley Bay (Con)

My Lords, we have had some productive discussions on application stores, commonly known as “app stores”, and their role as a gateway for children accessing online services. I am grateful in particular to my noble friend Lady Harding of Winscombe for her detailed scrutiny of this area and the collaborative approach she has taken in relation to it and to her amendments, to which I will turn in a moment. These share the same goals as the amendments tabled in my name in seeking to add evidence-based duties on app stores to protect children.

The amendments in my name will do two things. First, they will establish an evidence base on the use of app stores by children and the role that app stores play in children encountering harmful content online. Secondly, following consideration of this evidence base, the amendments also confer a power on the Secretary of State to bring app stores into scope of the Bill should there be a material risk of significant harm to children on or through them.

On the evidence base, Amendment 272A places a duty on Ofcom to publish a report on the role of app stores in children accessing harmful content on the applications of regulated services. To help build a greater evidence base about the types of harm available on and through different kinds of app stores, the report will consider a broad range of these stores, which could include those available on various devices, such as smartphones, gaming devices and smart televisions. The report will also assess the use and effectiveness of age assurance on app stores and consider whether the greater use of age assurance or other measures could protect children further.

Publication of the report must be two to three years after the child safety duties come into force so as not to interfere with the Bill’s implementation timelines. This timing will also enable the report to take into account the impact of the regulatory framework that the Bill establishes.

Amendment 274A is a consequential amendment to include this report in the Bill’s broader confidentiality provisions, meaning that Ofcom will need to exclude confidential matters—for example, commercially sensitive information—from the report’s publication.

Government Amendments 236A, 236B and 237D provide the Secretary of State with a delegated power to bring app stores into the scope of regulation following consideration of Ofcom’s report. The power will allow the Secretary of State to make regulations putting duties on app stores to reduce the risks of harm presented to children from harmful content on or via app stores. The specific requirements in these regulations will be informed by the outcome of the Ofcom report I have mentioned.

As well as setting out the rules for app stores, the regulations may also make provisions regarding the duties and functions of Ofcom in regulating app stores. This may include information-gathering and enforcement powers, as well as any obligations to produce guidance or codes of practice for app store providers.

By making these amendments, our intention is to build a robust evidence base on the potential risks of app stores for children without affecting the Bill’s implementation more broadly. Should it be found that duties are required, the Secretary of State will have the ability to make robust and comprehensive duties, which will provide further layers of protection for children. I beg to move.

Baroness Harding of Winscombe (Con)

My Lords, before speaking to my Amendment 239A, I thank my noble friend the Minister, the Secretary of State and the teams in both the department and Ofcom for their collaborative approach in working to bring forward this group of amendments. I also thank my cosignatories. My noble friend Lady Stowell cannot be in her place tonight but she has been hugely helpful in guiding me through the procedure, as have been the noble Lords, Lord Stevenson, Lord Clement-Jones and Lord Knight, not to mention the noble Baroness, Lady Kidron. It has been a proper cross-House team effort. Even the noble Lord, Lord Allan, who started out quite sceptical, has been extremely helpful in shaping the discussion.

I also thank the NSPCC and Barnardo’s for their invaluable advice and support, as well as Snap and Match—two companies which have been willing to stick their heads above the parapet and challenge suppliers and providers on which they are completely dependent in the shape of the current app store owners, Apple and Google.

I reassure my noble friend the Minister—and everyone else—that I have no intention of dividing the House on my amendment, in case noble Lords were worried. I am simply seeking some reassurance on a number of points where my amendments differ from those tabled by the Government—but, first, I will highlight the similarities.

As my noble friend the Minister has referred to, I am delighted that we have two packages of amendments that in both cases recognise that this was a really significant gap in the Bill as drafted. Ignoring the elements of the ecosystem that sell access to regulated services, decide age guidelines and have the ability to do age assurance was a substantial gap in the framing of the Bill. But we have also recognised together that it is very important that this is an “and” not an “or”—it is not instead of regulating user-to-user services or search but in addition to. It is an additional layer that we can bring to protect children online, and it is very important that we recognise that—and both packages do.

--- Later in debate ---
I gather that the Minister’s department has a working group to examine loot boxes. An update on that now, or in writing if he would prefer, would be helpful. The main point of raising this is apparent: app stores are an important pinch point in the digital user journey. We need to ensure that Ofcom has a proper look at whether including them helps it deliver the aims of the Bill. We should include the powers for it to be able to do that, in addition to the other safeguards that we are putting in the Bill to protect children. We strongly support these amendments.
Lord Parkinson of Whitley Bay (Con)

My Lords, I am very grateful for the strength of support and echo the tributes that have been paid to my noble friend Lady Harding—the winsome Baroness from Winscombe—for raising this issue and working with us so collaboratively on it. I am particularly glad that we were able to bring these amendments on Report; as she knows, it involved some speedy work by the Bill team and some speedy drafting by the Office of the Parliamentary Counsel, but I am glad that we were able to do it on Report, so that I can take it off my list of things to do over the summer, which was kindly written for me by the noble Lord, Lord Clement-Jones.

My noble friend’s amendments were laid before the Government’s, so she rightly asked a couple of questions on where they slightly differ. Her amendment seeks to ensure that other websites or online marketplaces that allow users to download apps are also caught by these duties. I reassure her that the Government’s amendments would capture these types of services. We have intentionally not provided detail about what constitutes an app store to ensure that the Bill remains future-proof. I will say a bit more about that in a moment. Regulations made by the Secretary of State under this power will be able to specify thresholds for which app stores are in scope, giving clarity to providers and users about the application of the duties.

On questions of definition, we are intentionally choosing not to define app stores in these amendments. The term is generally understood as meaning a service that makes applications available, which means that the Secretary of State will be able to impose duties on any such service. Any platform that enables apps to be downloaded can therefore be considered an app store for the purpose of this duty, regardless of whether or not it calls itself one. Regulations will clearly set out which providers are in scope of the duties. The ability to set threshold conditions will also ensure that any duties capture only those that pose the greatest risk of children accessing harmful content.

We touched on the long-running debate about content and functionality. We have made our position on that clear; it will be caught by references to content. I am conscious that we will return to this on Wednesday, when we will have a chance to debate it further.

On timing, as I said, I am glad that we were able to bring these amendments forward at this stage. The publication date for Ofcom’s report is to ensure that Ofcom can prioritise the implementation of the child safety duties and put in place the Bill’s vital protections for children before turning to its research on app stores.

That timing also allows the Secretary of State to base his or her decision on commencement on the effectiveness of the existing framework and to use the research of Ofcom’s report to set out a more granular approach to issues such as risk assessment and safety duties. It is necessary to await the findings of Ofcom’s report before those duties are commenced.

To the questions posed by the noble Baroness, Lady Kidron, and others about the consultation for that report by Ofcom, we expect Ofcom to consult widely and with all relevant parties when producing its report. We do not believe that there is a need for a specific list of consultees given Ofcom’s experience and expertise in this area as well as the great experience it will have through its existing enforcement and wider consultation requirements. In addition, the Secretary of State, before making regulations, will be required to consult a range of key parties, such as the Children’s Commissioner and the Information Commissioner, and those who represent the interests of children, as well as providers of app stores. That can include children themselves.

On the questions asked by the noble Lord, Lord Knight, on loot boxes, he is right that this piece of work is being led by my department. We want to see the games industry take the lead in strengthening protections for children and adults to mitigate the risk of harms. We are pursuing that through a DCMS-led technical working group, and we will publish an update on progress in the coming months. I again express my gratitude to my noble friend Lady Harding and other noble Lords who have expressed their support.

Amendment 236A agreed.
Moved by
236B: After Clause 194, insert the following new Clause—
“Power to regulate app stores: supplementary
(1) In this section (except in subsection (4)(c)) “regulations” means regulations under section (Power to regulate app stores)(1).
(2) Provision may be made by regulations only for or in connection with the purposes of minimising or mitigating the risks of harm to children presented by harmful content as mentioned in section (Power to regulate app stores)(3)(a) and (b).
(3) Regulations may not have the effect that any body other than OFCOM is the regulator in relation to app stores.
(4) Regulations may—
(a) make provision exempting specified descriptions of app stores from regulation under this Act;
(b) make provision amending Part 2, section 49 or Schedule 1 in connection with provision mentioned in paragraph (a);
(c) make provision corresponding or similar to provision which may be made by regulations under paragraph 1 of Schedule 11 (“threshold conditions”), with the effect that only app stores which meet specified conditions are regulated by this Act.
(5) Regulations may make provision having the effect that app stores provided from outside the United Kingdom are regulated by this Act (as well as app stores provided from within the United Kingdom), but, if they do so, must contain provision corresponding or similar to section 3(5) and (6) (UK links).
(6) The provision that may be made by regulations includes provision—
(a) imposing on providers of app stores duties corresponding or similar to duties imposed on providers of Part 3 services by—
(i) section 10 or 11 (children’s online safety: user-to-user services) or any of sections 16 to 19 so far as relating to section 10 or 11;
(ii) section 24 or 25 (children’s online safety: search services) or any of sections 26 to 29 so far as relating to section 24 or 25;
(b) imposing on providers of app stores duties corresponding or similar to duties imposed on providers of internet services within section 71(2) by section 72 (duties about regulated provider pornographic content);
(c) imposing on providers of app stores requirements corresponding or similar to requirements imposed on providers of regulated services by, or by OFCOM under, Part 6 (fees);
(d) imposing on OFCOM duties in relation to app stores corresponding or similar to duties imposed in relation to Part 3 services by Chapter 3 of Part 7 (OFCOM’s register of risks, and risk profiles);
(e) conferring on OFCOM functions in relation to app stores corresponding or similar to the functions that OFCOM have in relation to regulated services under—
(i) Chapter 4 of Part 7 (information), or
(ii) Chapter 6 of Part 7 (enforcement), including provisions of that Chapter conferring power for OFCOM to impose monetary penalties;
(f) about OFCOM’s production of guidance or a code of practice relating to any aspect of the regulation of app stores that is included in the regulations.
(7) The provision that may be made by regulations includes provision having the effect that app stores fall within the definition of “Part 3 service” or “regulated service” for the purposes of specified provisions of this Act (with the effect that specified provisions of this Act which apply in relation to Part 3 services or regulated services, or to providers of Part 3 services or regulated services, also apply in relation to app stores or to providers of app stores).
(8) Regulations may not amend or make provision corresponding or similar to—
(a) Chapter 2 of Part 4 (reporting CSEA content),
(b) Chapter 5 of Part 7 (notices to deal with terrorism content and CSEA content), or
(c) Part 10 (communications offences).
(9) Regulations may make different provision with regard to app stores of different kinds.
(10) In this section “specified” means specified in regulations.”
Member’s explanatory statement
This amendment makes provision about the purpose and contents of regulations to regulate app stores which may be made by the Secretary of State under the preceding new Clause proposed to be inserted in my name.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Moved by
236C: After Clause 194, insert the following new Clause—
“Power to impose duty about alternative dispute resolution procedure
(1) The Secretary of State may by regulations amend this Act for or in connection with the imposition on providers of Category 1 services of an ADR duty.
(2) An “ADR duty”—
(a) is a duty requiring providers of Category 1 services to arrange for and engage in an alternative dispute resolution procedure in specified circumstances for the resolution of disputes about their handling of relevant complaints, and
(b) may include a duty requiring such providers to meet the costs incurred by any other person in using a dispute resolution procedure which is so arranged.
(3) Complaints are “relevant” for the purposes of subsection (2)(a) if they—
(a) relate to a Category 1 service,
(b) are of a specified kind, and
(c) are made by persons of a specified kind.
(4) Regulations under this section may not be made before the publication of a statement by the Secretary of State responding to OFCOM’s report under section (OFCOM’s report about reporting and complaints procedures) (report about reporting and complaints procedures in use by providers of Part 3 services: see subsection (10) of that section).
(5) Before making regulations under this section the Secretary of State must consult—
(a) OFCOM,
(b) the Information Commissioner, and
(c) such other persons as the Secretary of State considers appropriate.
(6) If the power conferred by subsection (1) is exercised, the first regulations made under the power must—
(a) require the use of a dispute resolution procedure which is impartial, and
(b) prohibit the use of a dispute resolution procedure which restricts or excludes the availability of civil proceedings.
(7) Provision made by regulations under this section may have the effect that the duties set out in any or all of sections 17, 18 and 19 which apply in relation to duties imposed by other provisions of Chapter 2 of Part 3 are also to apply in relation to the ADR duty, and accordingly the regulations may amend—
(a) section 17(6),
(b) the definition of “safety measures and policies” in section 18(8), or
(c) the definition of “relevant duties” in section 19(10).
(8) The provisions of this Act that may be amended by the regulations in connection with the imposition of the ADR duty include, but are not limited to, the following provisions (in addition to those mentioned in subsection (7))—
(a) section 6(5),
(b) section 94(12)(a), and
(c) section 120(2).
(9) If the power conferred by subsection (1) is exercised, the first regulations made under the power must require OFCOM to—
(a) produce and publish guidance for providers of Category 1 services to assist them in complying with the ADR duty, and
(b) consult the Secretary of State, the Information Commissioner and such other persons as OFCOM consider appropriate before producing the guidance.
(10) Section 184(1) applies for the purposes of the references to Category 1 services in this section.
(11) In this section “specified” means specified in regulations under this section.
(12) For the meaning of “Category 1 service”, see section 86 (register of categories of services).”
Member’s explanatory statement
This amendment provides that the Secretary of State may make regulations amending this Bill so as to impose a new duty on providers of Category 1 services to arrange for and engage in an out of court, impartial dispute resolution procedure. The regulations may not be made until the Secretary of State has responded to OFCOM’s report about content reporting and complaints procedures under the new clause proposed to be inserted after Clause 147 in my name.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, the government amendments in this group relate to content reporting and complaints procedures. The Bill’s existing duties on each of these topics are a major step forward and will provide users with effective methods of redress. There will now be an enforceable duty on Part 3 services to offer accessible, transparent and easy-to-use complaints procedures. This is an important and significant change from which users and others will benefit directly.

Furthermore, Part 3 services’ complaints procedures will be required to provide for appropriate action to be taken in response to complaints. The duties here will fundamentally alter how complaints systems are operated by services, and providers will have to make sure that their systems are up to scratch. If services do not comply with their duties, they will face strong enforcement measures.

However, we have listened to concerns raised by your Lordships and others, and share the desire to ensure that complaints are handled effectively. That is why we have tabled Amendments 272AA and 274AA, to ensure that the Bill’s provisions in this area are the subject of a report to be published by Ofcom within two years of commencement.

Amendment 272AA places a requirement on Ofcom to undertake a report about Part 3 services’ reporting and complaints procedures. The report will assess the measures taken or in use by providers of Part 3 services to enable users and others to report content and make complaints. In assessing the content reporting and complaints measures in place, the report must take into account users’ and others’ experiences of those procedures—including how easy to use and clear they are for reporting content and making complaints, and whether providers are taking appropriate and timely action in response.

In this report, Ofcom must provide advice to the Secretary of State about whether she should use her power set out in Amendment 236C to make regulations imposing an alternative dispute resolution duty on category 1 services. Ofcom may also make wider recommendations about how the complaints and user redress provisions can be strengthened, and how users’ experiences with regard to complaints can be improved more broadly. Amendment 274AA is a consequential amendment ensuring that the usual confidentiality provisions apply to matters contained in that report.

These changes will ensure that the effectiveness of the Bill’s content reporting and complaints provisions can be thoroughly assessed by Ofcom two years after the commencement of the provision, providing time for the relevant reporting and complaints procedures to bed in.

Amendment 236C then provides that the Secretary of State will have a power to make regulations to amend the Act in order to impose an alternative dispute resolution duty on providers of category 1 services. This power can be used after the Secretary of State has published a statement in response to Ofcom’s report. This enables the Secretary of State to impose via regulations a duty on the providers of category 1 services to arrange for and engage in an impartial, out-of-court alternative dispute resolution procedure in respect of complaints. This means that, if the Bill’s existing user redress provisions are found to be insufficient, this requirement can quickly be imposed to strengthen the Bill.

This responds directly to concerns which noble Lords raised about cases where users or parents may feel that they have nowhere to turn if they are dissatisfied with a service’s response to their complaint. We believe that the existing provisions will remedy this, but, if they do not, these new requirements will ensure that there is an impartial, alternative dispute resolution procedure which will work towards the effective resolution of the complaint between the service and the complainant.

At the same time, it will avoid creating a single ombudsman, person or body which may be overwhelmed either through the volume of complaints from multiple services or by the complexity of applying such disparate services’ varying terms of service. Instead, if required, this power will put the onus on the provider to arrange for and engage in an impartial dispute resolution procedure.

Amendment 237D requires that, if regulations are made requiring category 1 services to offer an alternative dispute resolution procedure, such regulation must be subject to the affirmative parliamentary procedure. This ensures that Parliament will continue to have oversight of this process.

I hope that noble Lords are reassured that the Bill not only requires services to provide users and others with effective forms of redress but that these further amendments will ensure that the Bill’s provisions in this area will be thoroughly reviewed and that action can be taken quickly if it is needed. I beg to move.

Lord Stevenson of Balmacara (Lab)

My Lords, I am grateful to hear what the Minister has just announced. The scheme that was originally prefigured in the pre-legislative scrutiny report has now got some chance of being delivered. I think the process and procedures are quite appropriate; it does need review and thought. There needs to be account taken of practice on the ground, how people have found the new system is working, and whether or not there are gaps that can be filled this way. I give my full support to the proposal, and I am very glad to see it.

Having got to the Dispatch Box early, I will just appeal to our small but very important group. We are on the last day on Report. We are reaching a number of issues where lots of debate has taken place in Committee. I think it would be quite a nice surprise for us all if we were to get through this quickly. The only way to do that is by restricting our contributions.

--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, I declare an interest as chair of Trust Alliance Group, which operates the energy and communications ombudsman schemes, so I have a particular interest in the operation of these ADR schemes. I thank the Minister for the flexibility that he has shown in the provision about the report by Ofcom and in having backstop powers for the Secretary of State to introduce such a scheme.

Of course, I understand that the noble Baroness, Lady Newlove, and the UK Safer Internet Centre are very disappointed that this is not going to come into effect immediately, but there are advantages in not setting out the scheme at this very early point before we know what some of the issues arising are. I believe that Ofcom will definitely want to institute such a scheme, but it may be that, in the initial stages, working out the exact architecture is going to be necessary. Of course, I would have preferred to have a mandated scheme, in the sense that the report will look not at the “whether” but the “how”, but I believe that at the end of the day it will be absolutely obvious that there needs to be such an ADR scheme in order to provide the kind of redress the noble Baroness, Lady Harding, was talking about.

I also agree with the noble Baroness, Lady Morgan, that the kinds of complaints that this would cover should include fraudulent adverts. I very much hope that the Minister will be able to answer the questions that both noble Baronesses asked. As my noble friend said, will he reassure us that the department and Ofcom will not take their foot off the pedal, whatever the Bill may say?

Lord Parkinson of Whitley Bay (Con)

I am grateful to noble Lords for their warm support and for heeding the advice of the noble Lord, Lord Stevenson, on brevity. We must finish our Report today. The noble Lord, Lord Allan, is right to mention my noble friend Lady Newlove, who I have spoken to about this issue, as well as the noble Lord, Lord Russell of Liverpool, who has raised some questions here.

Alongside the strong duties on services to offer content reporting and complaints procedures, our amendments will ensure that the effectiveness of these provisions can be reviewed after they have had sufficient time to bed in. The noble Lord, Lord Allan, asked about timing in more detail. Ofcom must publish the report within the two-year period beginning on the day on which the provision comes into force. That will allow time for the regime to bed in before the report takes place, ensuring that its conclusions are informed by how the procedures work in practice. If necessary, our amendments will allow the Secretary of State to impose via regulations a duty on the providers of category 1 services to arrange for and engage in an impartial, out-of-court alternative dispute resolution procedure, providing the further strengthening which I outlined in opening.

I can reassure my noble friend Lady Morgan of Cotes that reporting mechanisms to facilitate providers’ removal of fraudulent advertisements are exactly the kinds of issues that Ofcom’s codes of practice will cover, subject to consultation and due process. As companies have duties to remove fraudulent advertising once they are alerted to it, we expect platforms will need the necessary systems and processes in place to enable users to report fraudulent adverts so that providers can remove them.

My noble friend Lady Harding asked the question which was posed a lot in Committee about where one goes if all avenues are exhausted. We have added further avenues for people to seek redress if they do not get it but, as I said in Committee, the changes that we are bringing in through this Bill will mark a significant change for people. Rather than focusing on the even-further-diminished possibility of their not having their complaints adequately addressed through the additional amendments we are bringing today, I hope she will see the provisions in the Bill and in these amendments as bringing in the change we all want to see to improve users’ safety online.

Amendment 236C agreed.
Moved by
237: After Clause 195, insert the following new Clause—
“Powers to amend sections (“Primary priority content that is harmful to children”) and (“Priority content that is harmful to children”)
(1) The Secretary of State may by regulations amend—
(a) section (“Primary priority content that is harmful to children”) (primary priority content that is harmful to children);
(b) section (“Priority content that is harmful to children”) (priority content that is harmful to children).
But the power to add a kind of content is limited by subsections (2) to (4).
(2) A kind of content may be added to section (“Primary priority content that is harmful to children”) only if the Secretary of State considers that, in relation to Part 3 services—
(a) there is a material risk of significant harm to an appreciable number of children presented by content of that kind that is regulated user-generated content or search content, and
(b) it is appropriate for the duties set out in sections 11(3)(a) and 25(3)(a) (duty in relation to children of all ages) to apply in relation to content of that kind.
(3) A kind of content may be added to section (“Priority content that is harmful to children”) only if the Secretary of State considers that, in relation to Part 3 services, there is a material risk of significant harm to an appreciable number of children presented by content of that kind that is regulated user-generated content or search content.
(4) A kind of content may not be added to section (“Primary priority content that is harmful to children”) or (“Priority content that is harmful to children”) if the risk of harm presented by content of that kind flows from—
(a) the content’s potential financial impact,
(b) the safety or quality of goods featured in the content, or
(c) the way in which a service featured in the content may be performed (for example, in the case of the performance of a service by a person not qualified to perform it).
(5) The Secretary of State must consult OFCOM before making regulations under this section.
(6) In this section references to children are to children in the United Kingdom.
(7) In this section—
“regulated user-generated content” has the same meaning as in Part 3 (see section 49);
“search content” has the same meaning as in Part 3 (see section 51).”
Member’s explanatory statement
This amendment gives power for the Secretary of State to make regulations changing the kinds of content that count as primary priority content and priority content harmful to children, subject to certain constraints set out in the Clause.
--- Later in debate ---
Moved by
237A: Clause 200, page 168, line 5, after “State” insert “or OFCOM”
Member’s explanatory statement
This amendment has the effect that regulations made by OFCOM under the Bill must be made by statutory instrument.
Lord Parkinson of Whitley Bay (Con)

My Lords, Amendments 238A and 238D seek to change the parliamentary process for laying—oh, I am skipping ahead with final day of Report enthusiasm.

As noble Lords know, companies will fund the costs of Ofcom’s online safety functions through annual fees. This means that the regime which the Bill ushers in will be cost neutral to the taxpayer. Once the fee regime is operational, regulated providers with revenue at or above a set threshold will be required to notify Ofcom and to pay a proportionate fee. Ofcom will calculate fees with reference to the provider’s qualifying worldwide revenue.

The Delegated Powers and Regulatory Reform Committee of your Lordships’ House has made two recommendations relating to the fee regime which we have accepted, and the amendments we are discussing in this group reflect this. We are also making a further change to definitions to ensure that Ofcom can collect proportionate fees.

A number of the amendments in my name relate to qualifying worldwide revenue. Presently, the Bill outlines that this should be defined in a published statement laid before Parliament. Your Lordships’ committee advised that it should be defined through regulations subject to the affirmative procedure. We have agreed with this and are proposing changes to Clause 76 so that Ofcom can make provisions about qualifying worldwide revenue by regulations which, as per the committee’s recommendations, will be subject to the affirmative procedure.

Secondly, the committee recommended that we change the method by which the revenue threshold is defined. Presently, as set out in the Bill, it is set by the Secretary of State in a published statement laid before Parliament. The committee recommended that the threshold be set through regulations subject to the negative procedure and we are amending Clause 77 to make the recommended change.

Other amendments seek to make a further change to enable Ofcom to collect proportionate fees from providers. A provider of a regulated service the qualifying worldwide revenue of which is equal to, or greater than, the financial threshold will be required to notify Ofcom and pay an annual fee, calculated by reference to its qualifying worldwide revenue. Currently, this means that that fee calculation can be based only on the revenue of the regulated provider. The structure of some technology companies, however, means that how they accrue revenue is not always straightforward. The entity which meets the definition of a provider may therefore not be the entity which generates revenue referable to the regulated service.

Regulations to be made by Ofcom about the qualifying worldwide revenue will therefore be able to provide that the revenue accruing to certain entities in the same group as a provider of a regulated service can be taken into account for the purposes of determining qualifying worldwide revenue. This will enable Ofcom, when making such regulations, to make provisions, if necessary, to account for instances where a provider has a complex group structure; for example, where the regulated provider might accrue only a portion of the revenue referrable to the regulated service, the rest of which might be accrued by other entities in the group’s structure. These amendments to Clause 76 address these issues by allowing Ofcom to make regulations which provide that the revenue from certain other entities within the provider’s group structure can be taken into account. I beg to move.

Lord Allan of Hallam (LD)

My Lords, we have not talked much about fees in our consideration of the Bill, and I will not talk much about them today, but there are some important questions. We should not skip too lightly over the fact that we will be levying revenues from online providers. That might have a significant impact on the markets. I have some specific questions about this proposed worldwide revenue method but I welcome these amendments and that we will now be getting a better procedure. This will also allow the Minister to say, “All these detailed points can be addressed when these instruments come before Parliament”. That is a good development. However, there are three questions that are worth putting on the record now so that we have time to think about them.

First, what consideration will be given to the impact on services that do not follow a classic revenue model but instead rely on donations and other sorts of support? I know that we will come back to this question in a later group but there are some very large internet service providers that are not the classic advertising-funded model, instead relying on foundations and other things. They will have significant questions about what we would judge their qualifying worldwide revenue to be, given that they operate to these very different models.

The second question concerns the impact on services that may have a very large footprint outside the UK, and significant worldwide revenues, but which do very little business within the UK. The amendment that the Minister has tabled about group revenues is also relevant here. You can imagine an entity which may be part of a very large worldwide group making very significant revenues around the world. It has a relatively small subsidiary that is offering a service in the UK, with relatively low revenues. There are some important questions there around the potential impact of the fees on decision-making within that group. We have discussed how we do not want to end up with less choice for consumers of services in the UK. There is an interesting question there as to whether getting the fee level wrong might lead to worldwide entities saying, “If you’re going to ask me to pay a fee based on my qualifying worldwide revenue, the UK market is just not worth it”. That may be particularly true if, for example, the European Union and other markets are also levying a fee. You can see a rational business choice of, “We’re happy to pay the fee to the EU but not to Ofcom if it is levied at a rate that is disproportionate to the business that we do here”.

The third and very topical question is about the Government’s thinking about services with declining revenues but whose safety needs are not reducing and may even be increasing. I hope as I say this that people have Twitter in mind, which has very publicly told us that its revenue is going down significantly. It has also very publicly fired most of its trust and safety staff. You can imagine a model within which, because its revenue is declining, it is paying less to Ofcom precisely when Ofcom needs to do more supervision of it.

I hope that we can get some clarity around the Government’s intentions in these circumstances. I have referenced three areas where the qualifying worldwide revenue calculation may go a little awry. The first is where the revenue is not classic commercial income but comes from other sources. The second is where the footprint in the UK is very small but it is otherwise a large global company which we might worry will withdraw from the market. The third, and perhaps most important, is what the Government’s intention is where a company’s revenue is declining and it is managing its platform less well and its need for Ofcom supervision increases, and what we would expect to happen to the fee level in those circumstances.

Lord Stevenson of Balmacara (Lab)

My Lords, there is very little to add to that. These are important questions. I simply was struck by the thought that the amount of work, effort and thought that has gone into this should not be kept within this Bill. I wonder whether the noble Lord has thought of offering his services to His Majesty’s Treasury, which has difficulty in raising tax from these companies. It would be nice to see that problem resolved.

Lord Parkinson of Whitley Bay (Con)

I am looking forward to returning to arts and heritage; I will leave that to my noble friend Lady Penn.

The noble Lord, Lord Allan, asked some good questions. He is right: the provisions and the parliamentary scrutiny allow for the flexibility for all these things to be looked at and scrutinised in the way that he set out. I stress that the fee regime is designed to be fair to industry; that is central to the approach we have taken. The Bill stipulates that Ofcom must charge only proportionate and justifiable fees to industry. The provisions that Ofcom can make via regulation about the qualifying worldwide revenue aim to ensure that fees are truly representative of the revenue relating to the regulated service and that they will encourage financial transparency. They also aim to aid companies with complex structures which would otherwise struggle to segregate revenues attributable to the provider and its connected entities.

The revenue of the group undertaking can be considered in scope of a provider’s qualifying worldwide revenue if the entity was a member of the provider’s group during any part of the qualifying period and the entity receives during the qualifying period any amount referrable to a regulated service. The regulations provide Ofcom with a degree of flexibility as to whether or not to make such provisions, because Ofcom will aim to keep the qualifying worldwide revenue simple.

I am grateful for noble Lords’ support for the amendments and believe that they will help Ofcom and the Government to structure a fair and transparent fee regime which charges proportionate fees to fund the cost of the regulatory regime that the Bill brings in.

Amendment 237A agreed.
Moved by
237B: Clause 200, page 168, line 6, at end insert—
“(3A) The Statutory Instruments Act 1946 applies in relation to OFCOM’s powers to make regulations under this Act as if OFCOM were a Minister of the Crown.
(3B) The Documentary Evidence Act 1868 (proof of orders and regulations etc) has effect as if—
(a) OFCOM were included in the first column of the Schedule to that Act;
(b) OFCOM and persons authorised to act on their behalf were mentioned in the second column of that Schedule.”
Member’s explanatory statement
This amendment makes technical provision in relation to regulations made by OFCOM under the Bill.
--- Later in debate ---
Moved by
237C: Clause 201, page 168, line 11, at end insert—
“(aa) regulations under section (“Regulations by OFCOM about qualifying worldwide revenue etc”)(1),”
Member’s explanatory statement
This amendment provides that regulations made by OFCOM under subsection (1) of the new Clause 76 proposed in my name regarding “qualifying worldwide revenue” etc are subject to the affirmative Parliamentary procedure.
--- Later in debate ---
Moved by
237E: Clause 201, page 168, line 23, at end insert—
“(m) regulations under paragraph 5(9) of Schedule 13,”
Member’s explanatory statement
This amendment provides that regulations made by OFCOM under paragraph 5(9) of Schedule 13 regarding “qualifying worldwide revenue” etc for the purposes of that paragraph are subject to the affirmative Parliamentary procedure.
--- Later in debate ---
Moved by
238A: Clause 201, page 169, line 3, at end insert—
“(7A) A statutory instrument containing the first regulations under paragraph 1(1) of Schedule 11 (whether alone or with regulations under paragraph 1(2) or (3) of that Schedule) may not be made unless a draft of the instrument has been laid before, and approved by a resolution of, each House of Parliament.
(7B) Any other statutory instrument containing regulations under paragraph 1(1) of Schedule 11 is subject to annulment in pursuance of a resolution of either House of Parliament.”
Member’s explanatory statement
This amendment provides that the first regulations made under paragraph 1(1) of Schedule 11 (regulations specifying Category 1 threshold conditions) are subject to the affirmative Parliamentary procedure.
Lord Parkinson of Whitley Bay (Con)

My Lords, as I was eagerly anticipating, government Amendments 238A and 238D seek to change the parliamentary process for laying the first regulations specifying the category 1 threshold conditions from the negative to the affirmative procedure. I am pleased to bring forward this change in response to the recommendation of your Lordships’ Delegated Powers and Regulatory Reform Committee.

The change will ensure that there are adequate levels of parliamentary scrutiny of the first regulations specifying the category 1 threshold conditions. This is appropriate given that the categorisation of category 1 services will lead to the most substantial duties on the largest and most influential services. As noble Lords are aware, these include the duties on user empowerment, user identity verification, journalistic and news publisher content, content of democratic importance, and fraudulent advertising.

Category 2A services will have only additional transparency and fraudulent advertising duties, and category 2B services will be subject only to additional transparency reporting duties. The burden of these duties is significantly less than the additional category 1 duties, and we have therefore retained the use of the negative resolution procedure for these regulations, as they require less parliamentary scrutiny.

Future changes to the category 1 threshold conditions will also use the negative procedure. This will ensure that the regime remains agile in responding to change, which I know was of particular concern to noble Lords when we debated the categorisation group in Committee. Keeping the negative procedure for such subsequent uses will avoid the risk of future changes being subject to delays because of parliamentary scheduling. I beg to move.

Baroness Morgan of Cotes (Con)

My Lords, I shall speak to Amendment 245. I would like to thank my noble friend the Minister, and also the Minister on leave, for the conversations that I have had with them about this amendment and related issues. As we have already heard, the platform categorisation is extremely important. So far, much of it is unknown, including which sites are actually going to be in which categories. For example, we have not yet seen any proposed secondary regulations. As my noble friend has just outlined, special duties apply, especially for those sites within category 1—user empowerment in particular, but also other duties relating to content and fraudulent advertisements.

Clause 85 and Schedule 11 set out the thresholds for determining which sites will be in category 1, category 2A or category 2B. I am very mindful of the exhortation of the noble Lord, Lord Stevenson, about being brief, but it is amazing how much you have to say about one word to explain this amendment. This amendment proposes to change an “and” to an “or” in relation to determining which sites would fall within category 1. It would move from a test of size “and” functionality to a test of size “or” functionality. This would give Ofcom more flexibility to decide which platforms really need category 1 designation. Category 1 should not be decided just on size; it should also be possible to determine it on the basis of functionality.

Functionality is defined in the Bill in Clause 208. We will get to those amendments shortly, but there is no doubt from what the Government have already conceded, or agreed with those of us who have been campaigning passionately on the Bill for a number of years, that functionality can make a platform harmful. It is perfectly possible to have small platforms that both carry highly harmful content and themselves become harmful in the way that they are designed. We have heard many examples and I will not detain the House with them, but I draw attention to two particular sites which capture how broad this is. The perpetrators of offline hate crimes are often linked to these small platforms. For example, the perpetrator of the 2018 Tree of Life synagogue mass shooting had an online presence on the right-wing extremist social network Gab. In the UK, Jake Davison, the self-proclaimed incel who killed five people in Plymouth in 2021, frequented smaller incel forums after he was banned from Reddit in the days leading up to the mass shooting.

I also want to share with noble Lords an email that I received just this week from a family who had been to see their Member of Parliament, Matt Rodda MP, and also the noble Baroness, Lady Kidron, who I know is very regretful that she cannot be here today. I thank Victoria and Jean Eustace for sharing the story of their sister and daughter. Victoria wrote: “I am writing to you regarding the Online Safety Bill, as my family and I are concerned it will not sufficiently protect vulnerable adults from harm. My sister, Zoe Lyalle, killed herself on 26 May 2020, having been pointed towards a method using an online forum called Sanctioned Suicide. Zoe was 18 years old at the time of her death and as such technically an adult, but she was autistic, so she was emotionally less mature than many 18 year-olds. She found it difficult to critically analyse written content”. She says that “The forum in question is not large and states on its face that it does not encourage suicide, although its content does just that”. I was even more shocked by the next part: “Since Zoe’s death, we have accessed her email account. The forum continues to email Zoe, providing her with updates on content she may have missed while away from the site, as well as requesting donations. One recent email included a link to a thread on the forum containing tips on how best to use the precise method that Zoe had employed”.

In her note to me, the Minister on leave said that she wanted to catch some of the platforms we are talking about with outsized influence. In my reply, I said that those sites on which people are encouraged to take their own lives or become radicalised and therefore take the harms they are seeing online into the real world undoubtedly exercise influence and should be tackled.

It is also perfectly possible for us to have large but safe platforms. I know that my noble friend Lord Moylan may want to discuss this in relation to sites that he has talked about already on this Bill. The risk of the current drafting is a flight of users from these large platforms, newly categorised as category 1, to the small, non-category 1 platforms. What if a platform becomes extremely harmful very quickly? How will it be recategorised speedily but fairly and involving parliamentary oversight?

The Government have run a variety of arguments as to why the “and” in the Bill should not become an “or”. They say that it creates legal uncertainty. Every Bill creates legal uncertainty; that is why we have an army of extremely highly paid lawyers, not just in this country but around the world. They say that what we are talking about is broader than illegal content or content related to children’s safety, but they have already accepted an earlier amendment on safety by design and, in subsections (10) to (12) of Clause 12, that specific extra protections should be available for content related to

“suicide or an act of deliberate self-injury, or … an eating disorder or behaviours associated with an eating disorder”

or abusive content relating to race, religion, sex, sexual orientation, disability or gender reassignment and that:

“Content is within this subsection if it incites hatred against people”.


The Government have already breached some of their own limits on content that is not just illegal or relates to child safety duties. In fact, they have agreed that that content should have enhanced triple-shield protection.

The Government have also said that they want to avoid burdens on small but low-harm platforms. I agree with that, but with an “or” it would be perfectly possible for Ofcom to decide by looking at size or functionality and to exclude those smaller platforms that do not present the harm we all care about. The Minister may also offer me a review of categorisation; however, it is a review of the tiers of categorisation and not the sites within the categories, which I think many of us will have views on over the years.

I come to what we should do on this final day of Report. I am very thankful to those who have had many conversations on this, but there is a fundamental difference of opinion in this House on these matters. We will talk about functionality shortly and I am mindful of the pre-legislative scrutiny committee’s recommendation that this legislation should adopt

“a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”.

There should be other factors. Ofcom should have the ability to decide whether it takes one factor or another, and not have a series of all the thresholds to be passed, to give it the maximum flexibility. I will listen very carefully to what my noble friend the Minister and other noble Lords say, but at this moment I intend to test the opinion of the House on this amendment.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

My Lords, I have good news and bad news for the Minister. The good news is that we have no problem with his amendments. The bad news, for him, is that we strongly support Amendment 245 from the noble Baroness, Lady Morgan of Cotes, which, as others have said, we think is a no-brainer.

The beauty of the simple amendment has been demonstrated; it just changes the single word “and” to “or”. It is of course right to give Ofcom leeway—or flexibility, as the noble Baroness, Lady Finlay, described it—in the categorisation and to bring providers into the safety regime. What the noble Baroness, Lady Morgan, said about the smaller platforms, the breadcrumbing relating to the Jake Davison case and the functionality around bombarding Zoe Lyalle with those emails told the story that we needed to hear.

As it stands, the Bill requires Ofcom to always be mindful of size. We need to be more nuanced. From listening to the noble Lord, Lord Allan of Hallam—with his, as ever, more detailed analysis of how things work in practice—my concern is that in the end, if it is all about size, Ofcom will end up having to have a much larger number in scope on the categorisation of size in order to cover all the platforms that it is worried about. If we could give flexibility around size or functionality, that would make the job considerably easier.

We on this side think categorisation should happen through a proportionate, risk-based approach. We think the flexibility should be there, and the Minister is reasonable—come on, what’s not to like?

Lord Parkinson of Whitley Bay (Con)

My Lords, I shall explain why the simple change of one word is not as simple as it may at first seem. My noble friend’s Amendment 245 seeks to amend the rule that a service must meet both a number-of-users threshold and a functionality threshold to be designated as category 1 or 2B. It would instead allow the Secretary of State by regulation to require a service to have to meet only one or other of the two requirements. That would mean that smaller user-to-user services could be so categorised by meeting only a functionality threshold.

In practical terms, that would open up the possibility of a future Secretary of State setting only a threshold condition about the number of users, or alternatively about functionality, in isolation. That would create the risk that services with a high number of users but limited functionality would be caught in scope of category 1. That could be of particular concern to large websites that operate with limited functionality for public interest reasons, and I am sure my noble friend Lord Moylan can think of one that fits that bill. On the other hand, it could capture a vast array of low-risk smaller services merely because they have a specific functionality—for instance, local community fora that have livestreaming capabilities. So we share the concerns of the noble Lord, Lord Allan, but come at it from a different perspective from him.

My noble friend Lady Morgan mentioned the speed of designation. The Bill’s approach to the pace of designation for the category 1 watchlist and register is flexible—deliberately so, to allow Ofcom to act as quickly as is proportionate to each emerging service. Ofcom will have a duty proactively to identify, monitor and evaluate emerging services, which will afford it early visibility when a service is approaching the category 1 threshold. It will therefore be ready to act accordingly to add services to the register should the need arise.

The approach set out in my noble friend’s Amendment 245 would not allow the Secretary of State to designate individual services as category 1 if they met one of the threshold conditions. Services can be designated as category 1 only if they meet all the relevant threshold conditions set out in the regulations made by the Secretary of State. That is the case regardless of whether the regulations set out one condition or a combination of several conditions.

The noble Baroness, Lady Finlay, suggested that the amendment would assist Ofcom in its work. Ofcom itself has raised concerns that amendments such as this—to introduce greater flexibility—could increase the risk of legal challenges to categorisation. My noble friend Lady Morgan was part of the army of lawyers before she came to Parliament, and I am conscious that the noble Lord, Lord Clement-Jones, is one as well. I hope they will heed the words of the regulator; this is not a risk that noble Lords should take lightly.

I will say more clearly that small companies can pose significant harm to users—I have said it before and I am happy to say it again—which is why there is no exemption for small companies. The very sad examples that my noble friend Lady Morgan gave in her speech related to illegal activity. All services, regardless of size, will be required to take action against illegal content, and to protect children if they are likely to be accessed by children. This is a proportionate regime that seeks to protect small but excellent platforms from overbearing regulation. However, I want to be clear that a small platform that is a font of illegal content cannot use its size as an excuse for not dealing with it.

Category 1 services are those services that have a major influence over our public discourse online. Again, I want to be clear that designation as a category 1 service is not based only on size. The thresholds for category 1 services will be based on the functionalities of a service as well as the size of the user base. The thresholds can also incorporate other characteristics that the Secretary of State deems relevant, which could include factors such as a service’s business model or its governance. Crucially, Ofcom has been clear that it will prioritise engagement with high-risk or high-impact services, irrespective of their categorisation, to understand their existing safety systems and how they plan to improve them.

--- Later in debate ---
Moved by
238B: Clause 201, page 169, line 6, leave out “74(3)(b)” and insert “(“Regulations by OFCOM about qualifying worldwide revenue etc”)(2)”
Member’s explanatory statement
This amendment provides that regulations made by OFCOM about supporting evidence to be supplied by providers for the purposes of Part 6 of the Bill (fees) are subject to the negative Parliamentary procedure.
--- Later in debate ---
Lord Kamall (Con)

I associate myself with the comments of my noble friend Lady Stowell on this whole issue, and I refer to my register of interests. One question we should be asking, which goes wider than this Bill, is: who regulates the regulators? It is a standard problem in political science and often known as principal agent theory, whereby the principals delegate powers to the agents for many reasons, and you see agency slack, whereby they develop their own powers beyond what was perhaps originally intended. For that reason, I completely associate myself with my noble friend Lady Stowell’s comments—and not because she chairs a committee on which I sit and I hope to get a favour of more speaking time on that committee. It is simply because, on its merit, we should all be asking who regulates the regulators and making sure that they are accountable. We are asking the same question of the Secretary of State, and quite rightly, the Secretary of State should be accountable for any measures they propose, but we should also be asking it of regulators.

Lord Parkinson of Whitley Bay (Con)

My Lords, I have always felt rather sorry for the first Viscount Addison, because what we refer to as the Salisbury convention is really the Salisbury-Addison convention. So while I am grateful to the noble Lord, Lord Stevenson, for his flattering speech, I shall insist on calling it the “Parkinson-Stevenson rule”, not least in the hope that that mouthful will encourage people to forget its name more swiftly.

I am grateful to the noble Lord for his attention to this matter and the useful discussions that we have had. His Amendment 239 would go beyond the existing legislative process for the delegated powers in the Bill by providing for parliamentary committees to be, in effect, inserted into the secondary legislative process. The delegated powers in the Bill are crucial for implementing the regime effectively and for ensuring that it keeps pace with changes in technology. Regulation-making powers are an established part of our legislative practice, and it would not be appropriate to deviate from existing processes.

However, I agree that ongoing parliamentary scrutiny of the regime will be crucial in helping to provide noble Lords and Members in another place with the reassurance that the implementation of the regime is as we intended. As the noble Lord noted, the establishment of the Science, Innovation and Technology Select Committee in another place means that there is a new dedicated committee looking at this important area of public policy. That provides an opportunity for cross-party scrutiny of the online safety regime and broader issues. While it will be, as he said, for respective committees to decide their priorities, we welcome any focus on online safety, and certainly welcome committees in both Houses co-operating effectively on this matter. I am certain that the Communications and Digital Committee of your Lordships’ House will continue to play a vital role in the scrutiny of the online safety regime.

We would fully expect these committees to look closely at the codes of practice, the uses of regulation-making powers and the powers of direction in a way that allows them to focus on key issues of interest. To support that, I can commit that the Government will do two things. First, where the Bill places a consultation requirement on the Government, we will ensure that the relevant committees have every chance to play a part in that consultation by informing them that the process is open. Secondly, while we do not wish to see the implementation process delayed, we will, where possible, share draft statutory instruments directly with the relevant committees ahead of the formal laying process. These timelines will be on a case-by-case basis, considering what is appropriate and reasonably practical. It will be for the committees to decide how they wish to engage with the information that we provide, but it will not create an additional approval process to avoid delaying implementation. I am grateful to my noble friend Lady Stowell of Beeston for her words of caution and wisdom on that point as both chairman of your Lordships’ committee and a former Leader of your Lordships’ House.

I hope that the noble Lord will be satisfied by what I have set out and will be willing to withdraw his amendment so that our rule might enter into constitutional history more swiftly.

Lord Stevenson of Balmacara (Lab)

I am very grateful to everyone who has contributed to the debate, despite my injunction that no one was to speak other than those key persons—but it was nice to hear views around the House in support of this proposal, with caution. The noble Baroness, Lady Stowell, was right to be clear that we have to be focused on where we are going on this; there is quite a lot at stake here, and it is a much bigger issue than simply this Bill and these particular issues. Her willingness to take this on in a wider context is most welcome, and I look forward to hearing how that goes. I am also very grateful for the unexpected but very welcome support from the noble Baroness, Lady Fox. It was nice that she finally agreed to meet on one piece of territory, if we cannot agree on some of the others. The noble Lord, Lord Kamall, is right to say that we need to pick up the much broader question about who regulates those who regulate us. This is not the answer, but it certainly gets us a step in the right direction.

I was grateful to the Minister for suggesting that the “Parkinson rule” could take flight, but I shall continue to call it by a single name—double-barrelled names are not appropriate here. We will see the results of that in the consultation; the things that already have to be consulted about will be offered to the committees, and it is up to them to respond on that, but it is a very good start. The idea that drafts and issues that are being prepared for future regulation will be shown ahead of the formal process is exactly where I wanted to be on this, so I am very grateful for that. I withdraw the amendment.

--- Later in debate ---
Moved by
239B: Clause 74, page 70, line 3, leave out from “information” to end of line 5 and insert “as required by regulations made by OFCOM under section (“Regulations by OFCOM about qualifying worldwide revenue etc”).”
Member’s explanatory statement
This amendment omits a reference to regulations made by the Secretary of State. Details about supporting evidence etc to accompany providers’ notifications for the purposes of the fees regime are now to be contained in regulations made by OFCOM (see the new Clause 76 proposed in my name).
--- Later in debate ---
Moved by
239F: Clause 76, leave out Clause 76 and insert the following new Clause—
“Regulations by OFCOM about qualifying worldwide revenue etc
(1) For the purposes of this Part, OFCOM may by regulations make provision—
(a) about how the qualifying worldwide revenue of a provider of a regulated service is to be determined, and
(b) defining the “qualifying period” in relation to a charging year.
(2) OFCOM may by regulations also make provision specifying or describing evidence, documents or other information that providers must supply to OFCOM for the purposes of section 74 (see subsection (3)(b) of that section), including provision about the way in which providers must supply the evidence, documents or information.
(3) Regulations under subsection (1)(a) may provide that the qualifying worldwide revenue of a provider of a regulated service (P) who is a member of a group during any part of a qualifying period is to include the qualifying worldwide revenue of any entity that—
(a) is a group undertaking in relation to P for all or part of that period, and
(b) receives or is due to receive, during that period, any amount referable (to any degree) to a regulated service provided by P.
(4) Regulations under subsection (1)(a) may, in particular—
(a) make provision about circumstances in which amounts do, or do not, count as being referable (to any degree) to a regulated service for the purposes of the determination of the qualifying worldwide revenue of the provider of the service or of an entity that is a group undertaking in relation to the provider;
(b) provide for cases or circumstances in which amounts that—
(i) are of a kind specified or described in the regulations, and
(ii) are not referable to a regulated service,
are to be brought into account in determining the qualifying worldwide revenue of the provider of the service or of an entity that is a group undertaking in relation to the provider.
(5) Regulations which make provision of a kind mentioned in subsection (3) may include provision that, in the case of an entity that is a group undertaking in relation to a provider for part (not all) of a qualifying period, only amounts relating to the part of the qualifying period for which the entity was a group undertaking may be brought into account in determining the entity’s qualifying worldwide revenue.
(6) Regulations under subsection (1)(a) may make provision corresponding to paragraph 5(8) of Schedule 13.
(7) Before making regulations under subsection (1) OFCOM must consult—
(a) the Secretary of State,
(b) the Treasury, and
(c) such other persons as OFCOM consider appropriate.
(8) Before making regulations under subsection (2) OFCOM must consult the Secretary of State.
(9) Regulations under this section may make provision subject to such exemptions and exceptions as OFCOM consider appropriate.
(10) In this section—
“group” means a parent undertaking and its subsidiary undertakings, reading those terms in accordance with section 1162 of the Companies Act 2006;
“group undertaking” has the meaning given by section 1161(5) of that Act.”
Member’s explanatory statement
This amendment substitutes Clause 76, which is about what is meant by “qualifying worldwide revenue”. The new Clause provides for OFCOM to make regulations about this and related matters for the purposes of the fees regime, and allows the regulations (among other things) to provide that revenue arising to certain entities in the same group as a provider of a regulated service is to be brought into account.
--- Later in debate ---
Moved by
239G: Clause 77, page 72, line 2, leave out from “must” to “the” in line 3 and insert “make regulations specifying”
Member’s explanatory statement
This amendment provides that the Secretary of State must specify the threshold figure in regulations (rather than in a published statement).
--- Later in debate ---
Moved by
239N: Clause 79, page 73, line 18, leave out from “period”” to end of line 19 and insert “for the purposes of this Part, and”
Member’s explanatory statement
This amendment is consequential on the new Clause 76 proposed in my name.
--- Later in debate ---
Baroness Harding of Winscombe (Con)

My Lords, if I may, I shall speak very briefly, in the absence of my noble friend Lady Kidron, and because I am one of the signatories of this amendment, alongside the noble Lord, Lord Stevenson, and the right reverend Prelate the Bishop of Oxford. Amendment 240, together with a number of amendments that we will be debating today, turns on a fundamental issue that we have not yet resolved.

I came in this morning being told that we would be voting on this amendment and that other amendments later today would be consequential—I am a novice at this level of parliamentary procedure, so forgive me if I have got myself confused during the day—but I now understand that my noble friend considers this amendment to be consequential but, strangely, the amendments right at the end of the day are not. I just wanted to flag to the House that they all cover the same fundamental issue of whether harms can be unrelated to content, whether the harms of the online world can be to do with functionality—the systems and processes that drive the addiction that causes so much harm to our children.

It is a fundamental disagreement. I pay tribute to the amount of time the department, the Secretary of State and my noble friend have spent on it, but it is not yet resolved and, although I understand that I should now say that I beg leave to move the amendment formally, I just wanted to mark, with apologies, the necessity, most likely, of having to bring the same issue back to vote on later today.

Lord Parkinson of Whitley Bay (Con)

My Lords, His Majesty’s Government indeed agree that this is consequential on the other amendments, including Amendment 35, which the noble Baroness, Lady Kidron, moved previously on Report. We disagreed with them, but we lost that vote; this is consequential, and we will not force a Division on it.

We will have further opportunity to debate the fundamental issues that lie behind it, to which my noble friend Lady Harding just referred. Some of the amendments on which we may divide later, the noble Baroness, Lady Kidron, tabled after defeating the Government the other day, so we cannot treat them as consequential. We look forward to debating them; I will urge noble Lords not to vote for them, but we will have opportunity to discuss them later.

Amendment 240 agreed.
Moved by
241: Clause 82, page 74, line 31, leave out “or 3” and insert “, 3 or 3A”
Member’s explanatory statement
Clause 82 is about OFCOM’s general duties. This amendment and the next amendment in my name insert a reference to Chapter 3A, which is the new Chapter containing the new duties imposed by the Clause proposed after Clause 67 in my name.
--- Later in debate ---
Moved by
243: Clause 82, page 75, line 2, leave out “or 3” and insert “, 3 or 3A”
Member’s explanatory statement
See the explanatory statement for the preceding amendment in my name.
--- Later in debate ---
Moved by
246: Clause 91, page 83, line 14, leave out “(an “information notice”)”
Member’s explanatory statement
This technical amendment is needed because the new notice requiring information in connection with an investigation into the death of a child (see the new Clause proposed after Clause 91 in my name) is also a form of information notice.
--- Later in debate ---
Moved by
247A: Clause 91, page 83, line 19, at end insert—
“(2A) The power conferred by subsection (1) also includes power to require a person within any of paragraphs (a) to (d) of subsection (4) to take steps so that OFCOM are able to remotely access the service provided by the person, or remotely access equipment used by the service provided by the person, in order to view, in particular—
(a) information demonstrating in real time the operation of systems, processes or features, including functionalities and algorithms, used by the service;
(b) information generated in real time by the performance of a test or demonstration of a kind required by a notice under subsection (1).”
Member’s explanatory statement
This amendment makes it clear that OFCOM have the power by notice to require a provider of a regulated service (among others) to take steps to allow OFCOM to remotely access the service so that they can view the operation in real time of systems, processes, functionalities and algorithms, and tests and demonstrations.
Lord Parkinson of Whitley Bay (Con)

I beg to move Amendment 247A.

Amendment 247B (to Amendment 247A) not moved.
--- Later in debate ---
Moved by
248: Clause 91, page 84, line 2, at end insert—
“(iva) any duty set out in section (Disclosure of information about use of service by deceased child users) (deceased child users),”
Member’s explanatory statement
This amendment mentions the new duties imposed by the Clause proposed after Clause 67 in my name in the Clause that sets out the purposes for which OFCOM may require people to provide information.
--- Later in debate ---
Moved by
249: After Clause 91, insert the following new Clause—
“Information in connection with an investigation into the death of a child
(1) OFCOM may by notice under this subsection require a relevant person to provide them with information for the purpose of—
(a) responding to a notice given by a senior coroner under paragraph 1(2) of Schedule 5 to the Coroners and Justice Act 2009 in connection with an investigation into the death of a child, or preparing a report under section (OFCOM’s report in connection with investigation into a death) in connection with such an investigation;
(b) responding to a request for information in connection with the investigation of a procurator fiscal into, or an inquiry held or to be held in relation to, the death of a child, or preparing a report under section (OFCOM’s report in connection with investigation into a death) in connection with such an inquiry;
(c) responding to a notice given by a coroner under section 17A(2) of the Coroners Act (Northern Ireland) 1959 (c. 15 (N.I.)) in connection with—
(i) an investigation to determine whether an inquest into the death of a child is necessary, or
(ii) an inquest in relation to the death of a child,
or preparing a report under section (OFCOM’s report in connection with investigation into a death) in connection with such an investigation or inquest.
(2) The power conferred by subsection (1) includes power to require a relevant person to provide OFCOM with information about the use of a regulated service by the child whose death is under investigation, including, in particular—
(a) content encountered by the child by means of the service,
(b) how the content came to be encountered by the child (including the role of algorithms or particular functionalities),
(c) how the child interacted with the content (for example, by viewing, sharing or storing it or enlarging or pausing on it), and
(d) content generated, uploaded or shared by the child.
(3) The power conferred by subsection (1) includes power to require a relevant person to obtain or generate information.
(4) The power conferred by subsection (1) must be exercised in a way that is proportionate to the purpose mentioned in that subsection.
(5) The power conferred by subsection (1) does not include power to require the provision of information in respect of which a claim to legal professional privilege, or (in Scotland) to confidentiality of communications, could be maintained in legal proceedings.
(6) Nothing in this section limits the power conferred on OFCOM by section 91.
(7) In this section—
“inquiry” means an inquiry held, or to be held, under the Inquiries into Fatal Accidents and Sudden Deaths etc. (Scotland) Act 2016 (asp 2);
“information” includes documents, and any reference to providing information includes a reference to producing a document (and see also section 92(9));
“relevant person” means a person within any of paragraphs (a) to (e) of section 91(4).”
Member’s explanatory statement
This amendment makes it clear that OFCOM have the power to obtain information for the purposes of responding to a notice given to them by a coroner or, in Scotland, a request from a procurator fiscal, in connection with the death of a child, including a power to obtain information from providers about the use of a service by the deceased child.
--- Later in debate ---
Moved by
250: Clause 92, page 85, line 3, at end insert—
“(A1) A notice given under section 91(1) or (Information in connection with an investigation into the death of a child)(1) is referred to in this Act as an information notice.”
Member’s explanatory statement
This amendment provides that a notice under the new Clause proposed in my name concerning OFCOM’s power to obtain information in connection with an investigation into the death of a child is called an “information notice” (as well as a notice under Clause 91). This ensures that provisions of the Bill that relate to information notices also apply to a notice given under that Clause.
--- Later in debate ---
Moved by
250B: Clause 94, page 86, line 26, leave out “any” and insert “either”
Member’s explanatory statement
This amendment is consequential on the next amendment of Clause 94 in my name.
Lord Parkinson of Whitley Bay (Con)

My Lords, these amendments are concerned with Ofcom’s powers under Clause 111 to issue notices to deal with terrorism content and child sexual exploitation and abuse content.

I acknowledge the concerns which have been aired about how these powers work with encrypted services. I want to make it clear that the Bill does not require companies to break or weaken encryption, and we have built in strong safeguards to ensure that users’ privacy is protected. Encryption plays an important role online, and the UK supports its responsible use. I also want to make it clear that we are not introducing a blanket requirement for companies to monitor all content for all harms, at all times. That would not be proportionate.

However, given the serious risk of harm to children from sexual abuse and exploitation online, the regulator must have appropriate, tightly targeted powers to compel companies to take the most effective action to tackle such reprehensible illegal activity which is taking place on their services. We must ask companies to do all that is technically feasible to keep children safe, subject to stringent legal safeguards.

The powers in the Bill are predicated on risk assessments. If companies are managing the risks on their platform appropriately, Ofcom will not need to use its powers. As a last resort, however, where there is clear evidence of child sexual abuse taking place on a platform, Ofcom will be able to direct companies either to use, or to make best efforts to develop or source, accredited and accurate technology to identify and remove this illegal content. To be clear, these powers will not enable Ofcom or our law enforcement agencies to obtain any automatic access to the content detected. It is simply a matter of making private companies take effective action to prevent child sexual abuse on their services.

Ofcom must consider a wide range of matters when deciding whether a notice is necessary and proportionate, including the impacts on privacy and freedom of expression of using a particular technology on a particular service. Ofcom will only be able to require the use of technology accredited as highly accurate in detecting illegal child sexual abuse or terrorism content, vastly minimising the risk that content is wrongly identified.

In addition to these safeguards, as a public body, Ofcom is bound through the Human Rights Act 1998 by the European Convention on Human Rights, including Articles 8 and 10. Ofcom has an obligation not to act in a way which unduly interferes with the right to privacy and freedom of expression when carrying out its duties, for which it is held to account.

If appropriate technology does not exist which meets these requirements, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to make best endeavours to develop or source a solution. It is right that we can require technology companies to use their considerable resources and expertise to develop the best possible protections for children in encrypted environments.

Despite the breadth of the existing safeguards, we recognise that concerns remain about these powers, and we have listened to the points that noble Lords raised in Committee about privacy and technical feasibility. That is why we are introducing additional safeguards. I am grateful for the constructive engagement I have had with noble Lords across your Lordships’ House on this issue, and I hope that the government amendments alleviate their concerns.

I turn first to our Amendments 250B, 250C, 250D, 255A, 256A, 257A, 257B, 257C and 258A, which require that Ofcom obtain a skilled person’s report before issuing a warning notice and exercising its powers under Clause 111. This independent expert scrutiny will supplement Ofcom’s own expertise to ensure that it has a full understanding of relevant technical issues to inform its decision-making. That will include issues specific to the service in question, such as its design and relevant factors relating to privacy.

--- Later in debate ---
I am very grateful to those who have suggested that our amendments are the right way to go. As I have said, I will not be pushing them—the reasons being that I think they go a little too far, but a little more of that would not be a bad thing. The Government are almost there with that, but I think a bit more time, effort and concern about some of the suggestions would probably get us to a better place than we are at the moment. I particularly think that about those from the noble Baroness, Lady Harding, about taking the lessons from what has happened in other places and trying to systematise that so it is clear that there are external persons and we know who they are, what their backgrounds are and what their roles will be. I look forward to hearing from the Minister when he comes to respond, but, just for confirmation, I do not think this is the appropriate place to vote, and should a vote be called, we will be abstaining.
Lord Parkinson of Whitley Bay (Con)

I am grateful to noble Lords for their further scrutiny of this important but complex area, and for the engagement that we have had in the days running up to it as well. We know how child sexual exploitation and abuse offenders sadly exploit private channels, and the great danger that this poses, and we know how crucial these channels are for secure communication. That is why, where necessary and proportionate, and where all the safeguards are met, it is right that Ofcom can require companies to take all technically feasible measures to remove this vile and illegal content.

The government amendments in this group will go further to ensure that a notice is well informed and targeted and does not unduly restrict users’ rights. Privacy and safety are not mutually exclusive—we can and must have both. The safety of our children depends on it.

I make it clear again that the Bill does not require companies to break or weaken end-to-end encryption on their services. Ofcom can require the use of technology on an end-to-end encrypted service only when it is technically feasible and has been assessed as meeting minimum standards of accuracy. When deciding whether to issue a notice, Ofcom will engage in continual dialogue with the company and identify reasonable, technically feasible solutions to the issues identified. As I said in opening, it is right that we require technology companies to use their considerable resources and expertise to develop the best possible protections to keep children safe in encrypted environments. They are well placed to innovate to find solutions that protect both the privacy of users and the safety of children.

Baroness Stowell of Beeston (Con)

Just to be clear, am I right to understand my noble friend as saying that there is currently no technology that would be technically acceptable for tech companies to do what is being asked of them? Did he say that tech companies should be looking to develop the technology to do what may be required of them but that it is not currently available to them?

Lord Moylan (Con)

For clarification, if the answer to that is that the technology does not exist—which I believe is correct, although there are various snake oil salespeople out there claiming that it does, as the noble Baroness, Lady Fox of Buckley, said—my noble friend seems to be saying that the providers and services should develop it. This seems rather circular, as the Bill says that they must adopt an approved technology, which suggests a technology that has been imposed on them. What if they cannot and still get such a notice? Is it possible that these powers will never be capable of being used, especially if they do not co-operate?

Lord Parkinson of Whitley Bay (Con)

To answer my noble friend Lady Stowell first, it depends on the type of service. It is difficult to give a short answer that covers the range of services that we want to ensure are covered here, but we are seeking to keep this and all other parts of the Bill technology neutral so that, as services develop, technology changes and criminals, unfortunately, seek to exploit that, technology companies can continue to innovate to keep children safe while protecting the privacy of their users. That is a long-winded answer to my noble friend’s short question, but necessarily so. Ofcom will need to make its assessments on a case-by-case basis and can require a company to use its best endeavours to innovate if no effective and accurate technology is currently available.

While I am directing my remarks towards my noble friend, I will also answer a question she raised earlier on general monitoring. General monitoring is not a legally defined concept in UK law; it is a term in European Union law that refers to the generalised monitoring of user activity online, although its parameters are not clearly defined. The use of automated technologies is already fundamental to how many companies protect their users from the most abhorrent harms, including child sexual abuse. It is therefore important that we empower Ofcom to require the use of such technology where it is necessary and proportionate and ensure that the use of these tools is transparent and properly regulated, with clear and appropriate safeguards in place for users’ rights. The UK’s existing intermediary liability regime remains in place.

Amendment 255 from my noble friend Lord Moylan seeks to prevent Ofcom imposing any requirement in a notice that would weaken or remove end-to-end encryption. He is right that end-to-end encryption should not be weakened or removed. The powers in the Bill will not do that. These powers are underpinned by proportionality and technical feasibility; if it is not proportionate or technically feasible for companies to identify child sexual exploitation and abuse content on their platform while upholding users’ right to privacy, Ofcom cannot require it.

I agree with my noble friend and the noble Baroness, Lady Fox, that encryption is a very important and popular feature today. However, with technology evolving at a rapid rate, we cannot accept amendments that would risk this legislation quickly becoming out of date. Naming encryption in the Bill would risk that happening. We firmly believe that the best approach is to focus on strong safeguards for upholding users’ rights and ensuring that measures are proportionate to the specific situation, rather than on general features such as encryption.

The Bill already requires Ofcom to consider the risk that technology could result in a breach of any statutory provision or rule of law concerning privacy and whether any alternative measures would significantly reduce the amount of illegal content on a service. As I have said in previous debates, Ofcom is also bound by the Human Rights Act not to act inconsistently with users’ rights.

Baroness Fox of Buckley (Non-Afl)

Will the Minister write to noble Lords who have been here in Committee and on Report in response to the fact that it is not just encryption companies saying that the demands of this clause will lead to the breaching of encryption, even though the Minister and the Government keep saying that it will not? As I have indicated, a wide range of scientists and technologists are saying that, whatever is said, demanding that Ofcom insists that technology notices are used in this way will inadvertently lead to the breaking of encryption. It would be useful if the Government at least explained scientifically and technologically why those experts are wrong and they are right.

Lord Parkinson of Whitley Bay (Con)

I am very happy to put in writing what I have said from the Dispatch Box. The noble Baroness may find that it is the same, but I will happily set it out in further detail.

I should make it clear that the Bill does not permit law enforcement agencies to access information held on platforms, including access to private channels. The National Crime Agency will be responsible for receiving reports from in-scope services via secure transmission, processing these reports and, where appropriate, disseminating them to other UK law enforcement bodies and our international counterparts. The National Crime Agency will process only information provided to it by the company; where it determines that the content is child sexual abuse content and meets the threshold for criminality, it can request further information from the company using existing powers.

I am glad to hear that my noble friend Lord Moylan does not intend to divide on his amendment. The restrictions it sets out are not ones we should impose on the Bill.

Amendments 256, 257 and 259 in the name of the noble Lord, Lord Stevenson of Balmacara, require a notice to be approved by a judicial commissioner appointed under the Investigatory Powers Act 2016 and remove Ofcom’s power to require companies to make best endeavours to develop or source new technology to address child sexual exploitation and abuse content.

--- Later in debate ---
Lord Allan of Hallam (LD)

I appreciate the tone of the Minister’s comments very much, but they are not entirely reassuring me. There is a debate going on out there: there are people saying, “We’ve got these fabulous technologies that we would like Ofcom to order companies to install” and there are companies saying, “That would be disastrous and break encryption if we had to install them”. That is a dualistic situation where there is a contest going on. My amendment seeks to make sure the conflict can be properly resolved. I do not think Ofcom on its own can ever do that, because Ofcom will always be defending what it is doing and saying “This is fine”. So, there has to be some other mechanism whereby people can say it is not fine and contest that. As I say, in this debate we are ignoring the fact that they are already out there: people saying “We think you should deploy this” and companies saying “It would be disastrous if we did”. We cannot resolve that by just saying “Trust Ofcom”.

Lord Parkinson of Whitley Bay (Con)

To meet the expectation the noble Lord voiced earlier, I will indeed point out that Ofcom can consult the ICO as a skilled person if it wishes to. It is important that we square the circle and look at these issues. The ICO will be able to be involved in the way I have set out as a skilled person.

Before I conclude, I want to address my noble friend Lady Harding’s questions on skilled persons. Given that notices will be issued on a case-by-case basis, and Ofcom will need to look at specific service design and existing systems of a provider to work out how a particular technology would interact with that design system, a skilled person’s report better fits this process by requiring Ofcom to obtain tailored advice rather than general technical advice from an advisory board. The skilled person’s report will be largely focused on the technical side of Ofcom’s assessment: that is to say, how the technology would interact with the service’s design and existing systems. In this way, it offers something similar to but more tailored than a technical advisory board. Ofcom already has a large and expert technology group, whose role it is to advise policy teams on new and existing technologies, to anticipate the impact of technologies and so on. It already has strong links with academia and with external researchers. A technical advisory board would duplicate that function. I hope that reassures my noble friend that the points she raised have been taken into account.

So I hope the noble Lord, Lord Allan, will not feel the need to divide—

Lord Stevenson of Balmacara (Lab)

Before the Minister finishes, I posed the question about whether, given the debate and issues raised, he felt completely satisfied that we had arrived at the right solution, and whether there was a case for withdrawing the amendment at this stage and bringing it back at Third Reading, having had further discussions and debate where we could all agree. I take it his answer is “no”.

Lord Parkinson of Whitley Bay (Con)

I am afraid it is “no”, and if the noble Lord, Lord Allan, does seek to divide, we will oppose his amendment. I commend the amendments standing in my name in this group to the House.

Amendment 250B agreed.
Moved by
250C: Clause 94, page 86, line 34, leave out paragraph (c)
Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted in my name after Clause 111. It omits words in Clause 94 (skilled person’s reports) because that new Clause now requires OFCOM to obtain a skilled person’s report before giving a provider a notice under Clause 111.
--- Later in debate ---
Moved by
252: Clause 94, page 88, line 2, at end insert—
“(xiia) section (Disclosure of information about use of service by deceased child users) (deceased child users);”
Member’s explanatory statement
This amendment has the effect that OFCOM may require a skilled person’s report in relation to compliance with the new duties imposed by the Clause proposed after Clause 67 in my name.
--- Later in debate ---
Moved by
252A: Schedule 12, page 228, line 4, at end insert—
“(4A) The power to observe the carrying on of the regulated service at the premises includes the power to view, using equipment or a device on the premises, information generated in real time by the performance of a test or demonstration required by a notice given under paragraph 3.”
Member’s explanatory statement
This amendment ensures that during an inspection of a service, OFCOM have the power to observe a test or demonstration of which notice has been given.
--- Later in debate ---
Moved by
254: Clause 105, page 94, line 33, at end insert—
“(3A) In subsection (3), after paragraph (h) insert—
“(ha) a person appointed under—
(i) paragraph 1 of Schedule 3 to the Coroners and Justice Act 2009, or
(ii) section 2 of the Coroners Act (Northern Ireland) 1959 (c. 15 (N.I.));
(hb) the procurator fiscal, within the meaning of the enactment mentioned in subsection (5)(s);”.
(3B) In subsection (5)—
(a) before paragraph (d) insert—
“(ca) the Coroners Act (Northern Ireland) 1959;”,
(b) after paragraph (na) insert—
“(nb) Part 1 of the Coroners and Justice Act 2009;”, and
(c) after paragraph (r) insert—
“(s) the Inquiries into Fatal Accidents and Sudden Deaths etc. (Scotland) Act 2016 (asp 2).”.”
Member’s explanatory statement
This amendment ensures that it is not necessary for OFCOM to obtain the consent of providers of internet services before disclosing information to a coroner or, in Scotland, procurator fiscal, who is investigating a person’s death.
--- Later in debate ---
Moved by
254A: Clause 107, page 95, line 20, leave out “(2)” and insert “(3)”
Member’s explanatory statement
This is a technical drafting change needed because section 24B of the Communications Act 2003 has been amended after this Bill was introduced.
--- Later in debate ---
Moved by
255A: Clause 111, page 98, line 8, at end insert—
“(za) section (Requirement to obtain skilled person’s report), which requires OFCOM to obtain a skilled person’s report before giving a notice under subsection (1),”
Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted in my name after Clause 111. It inserts a signpost to the requirement in that new Clause to obtain a skilled person’s report before giving a provider a notice under Clause 111.
--- Later in debate ---
Moved by
256A: After Clause 111, insert the following new Clause—
“Requirement to obtain skilled person’s report
(1) OFCOM may give a notice under section 111(1) to a provider only after obtaining a report from a skilled person appointed by OFCOM under section 94(3).
(2) The purpose of the report is to assist OFCOM in deciding whether to give a notice under section 111(1), and to advise about the requirements that might be imposed by such a notice if it were to be given.”
Member’s explanatory statement
This amendment requires OFCOM to obtain a skilled person’s report under Clause 94 before giving a notice to a provider under Clause 111.
--- Later in debate ---
Moved by
257A: Clause 112, page 98, line 24, at end insert—
“(za) contain a summary of the report obtained by OFCOM under section (Requirement to obtain skilled person’s report),”
Member’s explanatory statement
This amendment requires a warning notice given to a provider to contain a summary of the skilled person’s report obtained by OFCOM under the new Clause proposed to be inserted in my name after Clause 111.
--- Later in debate ---
Moved by
257C: Clause 113, page 99, line 32, at end insert—
“(ga) the contents of the skilled person’s report obtained as required by section (Requirement to obtain skilled person’s report);”
Member’s explanatory statement
This amendment requires OFCOM to consider the contents of the skilled person’s report obtained as required by the new Clause proposed to be inserted in my name after Clause 111, as part of OFCOM’s decision about whether it is necessary and proportionate to give a notice to a provider under Clause 111.
--- Later in debate ---
Moved by
258A: Clause 115, page 102, line 24, leave out “Section 112 (warning notices) does” and insert “Sections (Requirement to obtain skilled person’s report)(skilled person’s report) and 112 (warning notices) do”
Member’s explanatory statement
This amendment provides that, if OFCOM propose to issue a further notice under Clause 111, it is not necessary to obtain a further skilled person’s report under the new Clause proposed to be inserted in my name after Clause 111.
--- Later in debate ---
Moved by
260: Page 105, line 4, at end insert—

“Section (Assessment duties: user empowerment)
Assessments related to duty in section 12(2)”

Member’s explanatory statement
This amendment ensures that OFCOM are able to use their enforcement powers in Chapter 6 of Part 7 in relation to a breach of any of the new duties imposed by the Clause proposed after Clause 11 in my name.
--- Later in debate ---
Moved by
262: Clause 122, page 107, line 7, leave out “for constraints on” and insert “in relation to”
Member’s explanatory statement
This amendment is consequential on the amendments of Clause 125 in my name.
--- Later in debate ---
Moved by
262A: Clause 122, page 107, line 17, at end insert—
“(ba) specify which of those requirements (if any) have been designated as CSEA requirements (see subsections (5A) and (5B)),”
Member’s explanatory statement
This amendment is consequential on the next amendment to this Clause in my name.
Lord Parkinson of Whitley Bay (Con)

My Lords, in moving Amendment 262A, I will speak also to the other government amendments in the group. These amendments address the Bill’s enforcement powers. Government Amendments 262A, 262B, 262C, 264A and 266A, Amendments 265, 266 and 267, tabled by my noble friend Lord Bethell, and Amendment 268 tabled by the noble Lord, Lord Stevenson of Balmacara, relate to senior management liability. Amendment 268C from the noble Lord, Lord Weir of Ballyholme, addresses interim service restriction orders.

In Committee, we amended the Bill to create an offence of non-compliance with steps set out in confirmation decisions that relate to specific children’s online safety duties, to ensure that providers and individuals can be held to account where their non-compliance risks serious harm to children. Since then, we have listened to concerns raised by noble Lords and others, in particular that the confirmation decision offence would not tackle child sexual exploitation and abuse. That is why the government amendments in this group will create a new offence of a failure to comply with a child sexual exploitation and abuse requirement imposed by a confirmation decision. This will mean that providers and senior managers can be held liable if they fail to comply with requirements to take specific steps as set out in Ofcom’s confirmation decision in relation to child sexual exploitation and abuse on their service.

Ofcom must designate a step in a confirmation decision as a child sexual exploitation and abuse requirement, where that step relates, whether or not exclusively, to a failure to comply with specific safety duties in respect of child sexual exploitation and abuse content. Failure to comply with such a requirement will be an offence. This approach is necessary, given that steps may relate to multiple or specific kinds of illegal content, or systems and process failures more generally. This approach will ensure that services know from the confirmation decision when they risk criminal liability, while providing sufficient legal certainty via the specified steps to ensure that the offence can be prosecuted effectively.

The penalty for this offence is up to two years in prison, a fine or both. Through Clause 182, where an offence is committed with the consent or connivance of a senior manager, or attributable to his or her neglect, the senior manager, as well as the entity, will have committed the offence and can face up to two years in prison, a fine or both.

I thank my noble friend Lord Bethell, as well as our honourable friends Miriam Cates and Sir William Cash in another place, for their important work in raising this issue and their collaborative approach as we have worked to strengthen the Bill in this area. I am glad that we have reached a position that will help to keep children safe online and drive a change in culture in technology companies. I hope this amendment reassures them and noble Lords that the confirmation decision offence will tackle harms to children effectively by ensuring that technology executives take the necessary steps to keep children safe online. I beg to move.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

My Lords, I will briefly comment positively on the Minister’s explanation of how these offences might work, particularly the association of the liability with the failure to enforce a confirmation decision, which seems entirely sensible. In an earlier stage of the debate, there was a sense that we might associate liability with more general failures to enforce a duty of care. That would have been problematic, because the duty of care is very broad and requires a lot of pieces to be put in place. Associating the offences with the confirmation decision makes absolute sense. Having been in that position, if, as an executive in a tech company, I received a confirmation decision that said, “You must do these things”, and I chose wilfully to ignore that decision, it would be entirely reasonable for me to be held potentially criminally liable for that. That association is a good step forward.

--- Later in debate ---
Baroness Merron Portrait Baroness Merron (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, we welcome the government amendments in this group to bring child sexual exploitation and abuse failures into the scope of the senior manager liability and enforcement regime but consider that they do not go far enough. On the government amendments, I have a question for the Minister about whether, through Clause 122, it would be possible to require a company that was subject to action to do some media literacy as part of its harm reduction; in other words, would it be possible for Ofcom to use its media literacy powers as part of the enforcement process? I offer that as a helpful suggestion.

We share the concerns expressed previously by the noble Lord, Lord Bethell, about the scope of the senior manager liability regime, which does not cover all the child safety duties in the Bill. We consider that Amendment 268, in the name of my noble friend Lord Stevenson, would provide greater flexibility, giving the possibility of expanding the list of duties covered in the future. I have a couple of brief questions to add to my first question. Will the Minister comment on how the operation of the senior manager liability regime will be kept under review? This has, of course, been something of a contentious issue in the other place, so could the Minister perhaps tell your Lordships’ House how confident he is that the current position is supported there? I look forward to hearing from the Minister.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

I did not quite finish writing down the noble Baroness’s questions. I will do my best to answer them, but I may need to follow up in writing because she asked a number at the end, which is perfectly reasonable. On her question about whether confirmation decision steps could include media literacy, yes, that is a good idea; they could.

Amendment 268, tabled by the noble Lord, Lord Stevenson of Balmacara, seeks to enable the Secretary of State, through regulation, to add to the list of duties which are linked to the confirmation decision offence. We are very concerned at the prospect of allowing an unconstrained expansion of the confirmation decision offence. In particular, as I have already set out, we would be concerned about expanding the offence to cover duties related to search services. There is also concern about the unconstrained addition of any other duties related to user-to-user services.

We have chosen specific duties which will tackle effectively key issues related to child safety online and tackling child abuse while ensuring that the confirmation decision offence remains targeted. Non-compliance with a requirement imposed by a confirmation decision in relation to such duties warrants the prospect of criminal enforcement on top of Ofcom’s extensive civil enforcement powers. Making excessive changes to the offence risks shifting the regime towards a more punitive and disproportionate enforcement model, which would represent a significant change to the framework as a whole. Furthermore, expansion of the confirmation decision offence could lead to services taking an excessively cautious approach to content moderation to avoid the prospect of criminal liability. We are also concerned that such excessive expansion could significantly increase the burden on Ofcom.

I am grateful to the noble Lord, Lord Weir of Ballyholme, and the noble Baroness, Lady Benjamin, for the way they set out their Amendment 268C. We are concerned about this proposal because it is important that Ofcom can respond to issues on a case-by-case basis: it may not always be appropriate or proportionate to use a specific enforcement power in response to a suspected breach. Interim service restriction orders are some of the strongest enforcement powers in the Bill and will have a significant impact on the service in question. Their use may be disproportionate in cases where there is only a minor breach, or where a service is taking steps to deal with a breach following a provisional notice of contravention.

--- Later in debate ---
Moved by
262B: Clause 122, page 107, line 35, at end insert—
“(5A) If the condition in subsection (5B) is met in relation to a requirement imposed by a confirmation decision which is of a kind described in subsection (1), OFCOM must designate the requirement as a “CSEA requirement” for the purposes of section 127(2A) (offence of failure to comply with confirmation decision).(5B) The condition referred to in subsection (5A) is that the requirement is imposed (whether or not exclusively) in relation to either or both of the following—(a) a failure to comply with section 9(2)(a) or (3)(a) in respect of CSEA content, or in respect of priority illegal content which includes CSEA content; (b) a failure to comply with section 9(2)(b) in respect of an offence specified in Schedule 6 (CSEA offences), or in respect of priority offences which include such an offence.”
Member’s explanatory statement
This amendment provides that where a confirmation decision imposes a requirement to take steps in relation to a failure to comply with a duty under Clause 9(2)(a), (2)(b) or (3)(a) in respect of CSEA content or an offence under Schedule 6, OFCOM must designate the requirement as a CSEA requirement with the result that failure to comply with it is an offence (see the amendment inserting subsection (2A) into Clause 127 in my name).
--- Later in debate ---
Moved by
263: Clause 125, page 109, line 27, leave out “constraints on OFCOM’s power” and insert “what powers OFCOM have”
Member’s explanatory statement
This amendment is consequential on the next amendment in my name.
--- Later in debate ---
Moved by
264A: Clause 127, page 112, line 22, leave out “relates (whether or not exclusively) to” and insert “is imposed (whether or not exclusively) in relation to a failure to comply with”
Member’s explanatory statement
This is a technical amendment which adjusts the language of this provision.
--- Later in debate ---
Moved by
266A: Clause 127, page 112, line 27, at end insert—
“(2A) A person to whom a confirmation decision is given commits an offence if, without reasonable excuse, the person fails to comply with a CSEA requirement imposed by the decision (see section 122 (5A) and (5B)).”
Member’s explanatory statement
This amendment provides that a person commits an offence if the person fails to comply, without reasonable excuse, with a CSEA requirement imposed by a confirmation decision given to the person (see the amendment inserting new subsections (5A) and (5B) into Clause 122 in my name.)
--- Later in debate ---
Moved by
268A: Schedule 13, page 236, line 12, leave out sub-paragraph (9) and insert—
“(9) Regulations made by OFCOM under section (Regulations by OFCOM about qualifying worldwide revenue etc)(1)(a) (including regulations making provision of a kind mentioned in section (Regulations by OFCOM about qualifying worldwide revenue etc)(3), (4) or (5)) apply for the purpose of determining the qualifying worldwide revenue of a provider of a regulated service for an accounting period as mentioned in this paragraph as they apply for the purpose of determining the qualifying worldwide revenue of a provider of a regulated service for a qualifying period for the purposes of Part 6.”
Member’s explanatory statement
This amendment provides that regulations under the new Clause 76 proposed in my name about “qualifying worldwide revenue” for the purposes of Part 6 of the Bill (fees) also applies for the purposes of financial penalties under paragraph 4 of Schedule 13.
--- Later in debate ---
Moved by
269B: Clause 141, page 128, line 19, leave out “duty” and insert “duties”
Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted after Clause 149 in my name expanding OFCOM’s duties to promote media literacy in relation to regulated user-to-user and search services.
--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I think the upshot of this brief debate is that the noble Lord, Lord Knight—how he was tracked down in a Pret A Manger, I have no idea; he is normally too fast-moving for that—in his usual constructive and creative way is asking the Government to engage constructively to find a solution, which he discussed in that Pret A Manger, involving a national helpline, the NSPCC and the Children’s Commissioner, for the very reasons that he and my noble friend Lord Allan have put forward. In no way would this be some kind of quango, in the words of the noble Baroness, Lady Fox.

This is really important stuff. It could be quite a game-changer in the way that the NSPCC and the Children’s Commissioner collaborate on tackling the issues around social media, the impact of the new rights under the Bill and so on. I very much hope that the Government will be able to engage positively on this and help to bring the parties together to, in a sense, deliver something which is not in the Bill but could be of huge importance.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, first, I reassure noble Lords that the Government are fully committed to making sure that the interests of children are both represented and protected. We believe, however, that this is already achieved through the provisions in the Bill.

Rather than creating a single advocacy body to research harms to children and advocate on their behalf, as the noble Lord’s amendment suggests, the Bill achieves the same effect through a combination of Ofcom’s research functions, the consultation requirements and the super-complaints provisions. Ofcom will be fully resourced with the capacity and technological ability to assess and understand emerging harms and will be required to research children’s experiences online on an ongoing basis.

For the first time, there will be a statutory body in place charged with protecting children from harm online. As well as its enforcement functions, Ofcom’s research will ensure that the framework remains up to date and that Ofcom itself has the latest, in-depth information to aid its decision-making. This will ensure that new harms are not just identified in retrospect when children are already affected by them and complaints are made; instead, the regulator will be looking out for new issues and working proactively to understand concerns as they develop.

Children’s perspectives will play a central role in the development of the framework, as Ofcom will build on its strong track record of qualitative research to ensure that children are directly engaged. For example, Ofcom’s ongoing programme, Children’s Media Lives, involves engaging closely with children and tracking their views and experiences year on year.

Alongside its own research functions, super-complaints will ensure that eligible bodies can make complaints on systemic issues, keeping the regulator up to date with issues as they emerge. This means that if Ofcom does not identify a systemic issue affecting children for any reason, it can be raised and then dealt with appropriately. Ofcom will be required to respond to the super-complaint, ensuring that its subsequent decisions are understood and can be scrutinised. Complaints by users will also play a vital role in Ofcom’s horizon scanning and information gathering, providing a key means by which new issues can be raised.

The extensive requirements for Ofcom to consult on codes of practice and guidance will further ensure that it consistently engages with groups focused on the interests of children as the codes and guidance are developed and revised. Children’s interests are embedded in the implementation and delivery of this framework.

The Children’s Commissioner will play a key and ongoing role. She will be consulted on codes of practice and any further changes to those codes. The Government are confident that she will use her statutory duties and powers effectively to understand children’s experiences of the digital world. Her primary function as Children’s Commissioner for England is to promote and protect the rights of children in England and to promote and protect the rights of children across the United Kingdom where those rights are or may be affected by reserved matters. As the codes of practice and the wider Bill relate to a reserved area of law—namely, internet services—the Children’s Commissioner for England will be able to represent the interests of children from England, Scotland, Wales and Northern Ireland when she is consulted on the preparation of codes of practice. That will ensure that children’s voices are represented right across the UK. The Children’s Commissioner for England and her office also regularly speak to the other commissioners about ongoing work on devolved and reserved matters. Whether she does that in branches of Pret A Manger, I do not know, but she certainly works with her counterparts across the UK.

I am very happy to take back the idea that the noble Lord has raised and discuss it with the commissioner. There are many means by which she can carry out her duties, so I am very happy to take that forward. I cannot necessarily commit to putting it in legislation, but I shall certainly commit to discussing it with her. On the proposals in the noble Lord’s amendment, we are concerned that a separate child user advocacy body would duplicate the functions that she already has, so I hope with that commitment he will be happy to withdraw.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I am grateful to those who have spoken in this quick debate and for the support from the noble Lord, Lord Allan of Hallam, and the noble Baroness, Lady Fox, about children’s voices being heard. I think that we are getting to the point when there will not be a quango or indeed a minefield, so that makes us all happy. The Minister almost derailed me, because so much of his speaking note was about the interests of children and I am more interested in the voice of children being heard directly rather than people acting on their behalf and representing their interests, but his final comments around being happy to take the idea forward means that I am very happy to withdraw my amendment.

--- Later in debate ---
Moved by
271: After Clause 145, insert the following new Clause—
“OFCOM’s reports about use of age assurance
(1) OFCOM must produce and publish a report assessing—(a) how providers of regulated services have used age assurance for the purpose of compliance with their duties set out in this Act,(b) how effective the use of age assurance has been for that purpose, and(c) whether there are factors that have prevented or hindered the effective use of age assurance, or a particular kind of age assurance, for that purpose,(and in this section, references to a report are to a report described in this subsection).(2) A report must, in particular, consider whether the following have prevented or hindered the effective use of age assurance—(a) the costs to providers of using it, and(b) the need to protect users from a breach of any statutory provision or rule of law concerning privacy that is relevant to the use or operation of a regulated service (including, but not limited to, any such provision or rule concerning the processing of personal data).(3) Unless the Secretary of State requires the production of a further report (see subsection (6)), the requirement in subsection (1) is met by producing and publishing one report within the period of 18 months beginning with the day on which sections 11 and 72(2) come into force (or if those provisions come into force on different days, the period of 18 months beginning with the later of those days).(4) In preparing a report, OFCOM must consult—(a) the Information Commissioner, and(b) such other persons as OFCOM consider appropriate.(5) OFCOM must send a copy of a report to the Secretary of State, and the Secretary of State must lay it before Parliament.(6) The Secretary of State may require OFCOM to produce and publish a further report in response to—(a) the development of age assurance technology, or(b) evidence of the reduced effectiveness of such technology.(7) But such a requirement may not be imposed—(a) within the period of three years beginning with the date on which the first report is published, or(b) more frequently than once every three years.(8) For further provision about reports under this section, see section 149.(9) In this section “age assurance” means age verification or age estimation.”
Member’s explanatory statement
This new Clause requires OFCOM to produce and publish a report about the use of age assurance by providers of regulated services.
--- Later in debate ---
Moved by
272A: After Clause 147, insert the following new Clause—
“OFCOM’s report about use of app stores by children
(1) OFCOM must produce a report about the use of app stores by children.(2) In particular, the report must—(a) assess what role app stores play in children encountering content that is harmful to children, search content that is harmful to children or regulated provider pornographic content by means of regulated apps which the app stores make available,(b) assess the extent to which age assurance is currently used by providers of app stores, and how effective it is, and(c) explore whether children’s online safety would be better protected by the greater use of age assurance or particular kinds of age assurance by such providers, or by other measures.(3) OFCOM must publish the report during the period beginning two years, and ending three years, after the day on which sections 11 and 25 come into force (or if those sections come into force on different days, the later of those days).(4) For further provision about the report under this section, see section 149.(5) In this section—“app” includes an app for use on any kind of device, and “app store” is to be read accordingly;“content that is harmful to children” has the same meaning as in Part 3 (see section 54);“regulated app” means an app for a regulated service;“regulated provider pornographic content” has the same meaning as in Part 5 (see section 70);“search content” has the same meaning as in Part 3 (see section 51).(6) In this section references to children are to children in the United Kingdom.”
Member’s explanatory statement
This amendment requires OFCOM to produce a report about the use of app stores by children, including consideration of whether children would be better protected by greater use of age assurance.
--- Later in debate ---
Moved by
272B: Clause 148, page 132, line 11, leave out “two years” and insert “18 months”
Member’s explanatory statement
This amendment provides that the report that OFCOM must publish under Clause 148 (report about researchers’ access to information) must be published within 18 months of Clause 148 coming into force (rather than two years).
--- Later in debate ---
Moved by
272C: Clause 148, page 132, line 16, leave out “Following the publication of the report, OFCOM may” and insert “OFCOM must”
Member’s explanatory statement
This amendment provides that OFCOM must (rather than may) produce guidance about matters dealt with by the report published under Clause 148.
--- Later in debate ---
Moved by
273: After Clause 148, insert the following new Clause—
“OFCOM’s report in connection with investigation into a death
(1) Subsection (2) applies if OFCOM receive—(a) a notice from a senior coroner under paragraph 1(2) of Schedule 5 to the Coroners and Justice Act 2009 in connection with an investigation into the death of a person;(b) a request for information in connection with the investigation of a procurator fiscal into, or an inquiry held or to be held in relation to, the death of a person;(c) a notice from a coroner under section 17A(2) of the Coroners Act (Northern Ireland) 1959 (c. 15 (N.I.)) in connection with—(i) an investigation to determine whether an inquest into the death of a person is necessary, or(ii) an inquest in relation to the death of a person.(2) OFCOM may produce a report for use by the coroner or procurator fiscal, dealing with any matters that they consider may be relevant.(3) In subsection (1)(b) “inquiry” means an inquiry held, or to be held, under the Inquiries into Fatal Accidents and Sudden Deaths etc. (Scotland) Act 2016 (asp 2).”
Member’s explanatory statement
This amendment makes it clear that OFCOM may produce a report in connection with a person’s death, if the coroner gives OFCOM a notice or, in Scotland, the procurator fiscal requests information, for that purpose.
--- Later in debate ---
Moved by
274: Clause 149, page 132, line 41, at end insert—
“(aa) a report under section (OFCOM’s reports about use of age assurance) (report about use of age assurance),”
Member’s explanatory statement
This amendment is consequential on the new Clause to be inserted after Clause 145 in my name. It ensures that the usual confidentiality provisions apply to matters contained in OFCOM’s report about the use of age assurance.
--- Later in debate ---
Moved by
274B: After Clause 149, insert the following new Clause—
“CHAPTER 8
MEDIA LITERACY
Media literacy
(1) Section 11 of the Communications Act is amended in accordance with subsections (2) to (5).(2) Before subsection (1) insert—“(A1) In this section—(a) subsection (1) imposes duties on OFCOM which apply in relation to material published by means of the electronic media (including by means of regulated services), and(b) subsections (1A) to (1E) expand on those duties, and impose further duties on OFCOM, in relation to regulated services only.”(3) After subsection (1) insert— “(1A) OFCOM must take such steps, and enter into such arrangements, as they consider most likely to be effective in heightening the public’s awareness and understanding of ways in which they can protect themselves and others when using regulated services, in particular by helping them to—(a) understand the nature and impact of harmful content and the harmful ways in which regulated services may be used, especially content and activity disproportionately affecting particular groups, including women and girls;(b) reduce their and others’ exposure to harmful content and to the use of regulated services in harmful ways, especially content and activity disproportionately affecting particular groups, including women and girls;(c) use or apply—(i) features included in a regulated service, including features mentioned in section 12(2) of the Online Safety Act 2023, and(ii) tools or apps, including tools such as browser extensions,so as to mitigate the harms mentioned in paragraph (b);(d) establish the reliability, accuracy and authenticity of content;(e) understand the nature and impact of disinformation and misinformation, and reduce their and others’ exposure to it;(f) understand how their personal information may be protected.(1B) OFCOM must take such steps, and enter into such arrangements, as they consider most likely to encourage the development and use of technologies and systems for supporting users of regulated services to protect themselves and others as mentioned in paragraph (a), (b), (c), (d) or (e) of subsection (1A), including technologies and systems which—(a) provide further context to users about content they encounter;(b) help users to identify, and provide further context about, content of democratic importance present on regulated user-to-user services;(c) signpost users to resources, tools or information raising awareness about how to use regulated services so as to mitigate the harms mentioned in subsection (1A)(b).(1C) OFCOM’s duty under subsection (1A) is to be performed in the following ways (among others)—(a) pursuing activities and initiatives,(b) commissioning others to pursue activities and initiatives,(c) taking steps designed to encourage others to pursue activities and initiatives, and(d) making arrangements for the carrying out of research (see section 14(6)(a)).(1D) OFCOM must draw up, and from time to time review and revise, a statement recommending ways in which others, including providers of regulated services, might develop, pursue and evaluate activities or initiatives relevant to media literacy in relation to regulated services.(1E) OFCOM must publish the statement and any revised statement in such manner as they consider appropriate for bringing it to the attention of the persons who, in their opinion, are likely to be affected by it.”(4) After subsection (2) insert— “(3) In this section and in section 11A,“regulated service” means—(a) a regulated user-to-user service, or(b) a regulated search service.“Regulated user-to-user service” and “regulated search service” have the same meaning 
as in the Online Safety Act 2023 (see section 3 of that Act).(4) In this section—(a) “content”, in relation to regulated services, means regulated user-generated content, search content or fraudulent advertisements;(b) the following terms have the same meaning as in the Online Safety Act 2023—“content of democratic importance” (see section 13 of that Act);“fraudulent advertisement” (see sections 33 and 34 of that Act);“harm” (see section 209 of that Act) (and “harmful” is to be interpreted consistently with that section);“provider” (see section 202 of that Act);“regulated user-generated content” (see section 49 of that Act);“search content” (see section 51 of that Act).”(5) In the heading, for “Duty” substitute “Duties”.(6) In section 14 of the Communications Act (consumer research), in subsection (6)(a), after “11(1)” insert “, (1A) and (1B)”.”
Member’s explanatory statement
This amendment inserts provisions into section 11 of the Communications Act 2003 (OFCOM’s duties to promote media literacy). The new provisions expand on the existing duties so far as they relate to regulated user-to-user and search services, and impose new duties on OFCOM aimed at enhancing users’ media literacy.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I beg to move Amendment 274B.

Amendments 274BA and 274BB (to Amendment 274B) not moved.
--- Later in debate ---
Moved by
274C: After Clause 149, insert the following new Clause—
“Media literacy strategy and media literacy statement
After section 11 of the Communications Act insert—“11A Regulated services: media literacy strategy and media literacy statement(1) OFCOM must prepare and publish a media literacy strategy within the period of one year beginning with the day on which the Online Safety Act 2023 is passed.(2) A media literacy strategy is a plan setting out how OFCOM propose to exercise their functions under section 11 in the period covered by the plan, which must be not more than three years.(3) In particular, a media literacy strategy must state OFCOM’s objectives and priorities for the period it covers.(4) Before the end of the period covered by a media literacy strategy, OFCOM must prepare and publish a media literacy strategy for a further period, ensuring that each successive strategy covers a period beginning immediately after the end of the last one. (5) In preparing or revising a media literacy strategy, OFCOM must consult such persons as they consider appropriate.(6) OFCOM’s annual report must contain a media literacy statement.(7) A media literacy statement is a statement by OFCOM—(a) summarising what they have done in the financial year to which the report relates in the exercise of their functions under section 11, and(b) assessing what progress has been made towards achieving the objectives and priorities set out in their media literacy strategy in that year.(8) A media literacy statement must include a summary and an evaluation of the activities and initiatives pursued or commissioned by OFCOM in the exercise of their functions under section 11 in the financial year to which the report relates.(9) The first annual report that is required to contain a media literacy statement is the report for the financial year during which OFCOM’s first media literacy strategy is published, and that first statement is to relate to the period from publication day until the end of that financial year.(10) But if OFCOM’s first media literacy strategy is published during the second half of a financial year—(a) the first annual report that is required to contain a media literacy statement is the report for the next financial year, and(b) that first statement is to relate to the period from publication day until the end of that financial year.(11) References in this section to OFCOM’s functions under section 11 are to those functions so far as they relate to regulated services.(12) In this section—“annual report” means OFCOM’s annual report under paragraph 12 of the Schedule to the Office of Communications Act 2002;“financial year” means a year ending with 31 March.””
Member’s explanatory statement
This amendment requires OFCOM to produce a media literacy strategy every three years (or more frequently), and to include, in their annual report, a statement summarising and evaluating their media literacy activities, so far as they relate to regulated services, during the year.
--- Later in debate ---
Moved by
276: Clause 202, page 171, line 2, at end insert—
“(15) For the purposes of subsections (8) and (9), a person who makes available on a service an automated tool or algorithm by means of which content is generated is to be regarded as having control over content so generated.”
Member’s explanatory statement
This amendment is about who counts as the provider of a service (other than a user-to-user or search service) that hosts provider pornographic content for the purposes of the Bill. The amendment makes it clear that a person who controls a generative tool on the service, such as a generative AI bot, is regarded as controlling the content generated by that tool.
--- Later in debate ---
Moved by
277: After Clause 205, insert the following new Clause—
““Age verification” and “age estimation”
(1) This section applies for the purposes of this Act.(2) “Age verification” means any measure designed to verify the exact age of users of a regulated service.(3) “Age estimation” means any measure designed to estimate the age or age-range of users of a regulated service.(4) A measure which requires a user to self-declare their age (without more) is not to be regarded as age verification or age estimation.”
Member’s explanatory statement
This new Clause defines age verification and age estimation, and makes it clear that mere self-declaration of age does not count as either.
--- Later in debate ---
Moved by
278: Clause 206, page 172, line 34, leave out “assessing or establishing” and insert “verifying or estimating”
Member’s explanatory statement
This amendment is made to ensure consistency of language in the Bill when referring to age verification and age estimation.
--- Later in debate ---
Baroness Merron Portrait Baroness Merron (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, as we have heard, the noble Baroness, Lady Harding, made a very clear case in support of these amendments, tabled in the name of the noble Baroness, Lady Kidron, and supported by noble Lords from across the House. The noble Baroness, Lady Morgan, gave wise counsel to the Minister, as did the noble Lord, Lord Clement-Jones, that it is worth stepping back and seeing where we are in order to ensure that the Bill is in the right place. I urge the Minister to find the time and the energy that I know he has—he certainly has the energy and I am sure he will match it with the time—to speak to noble Lords over the coming Recess to agree a way to incorporate systems and functionality into the Bill, for all the reasons we have heard.

On Monday, my noble friend Lord Knight spoke of the need for a review about loot boxes and video games. When we checked Hansard, we saw the Minister had promised that such a review would be offered in the coming months. In an unusual turn of events, the Minister exceeded the timescale. We did not have to hear the words “shortly”, “in the summer” or “spring” or anything like that, because it was announced the very next day that the department would keep legislative options under review.

I make that point simply to thank the Minister for the immediate response to my noble friend Lord Knight. But, if we are to have such a review, does this not point very much to the fact that functionality and systems should be included in the Bill? The Minister has a very nice hook to hang this on and I hope that he will do so.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, this is not just a content Bill. The Government have always been clear that the way in which a service is designed and operated, including its features and functionalities, can have a significant impact on the risk of harm to a user. That is why the Bill already explicitly requires providers to ensure their services are safe by design and to address the risks that arise from features and functionalities.

The Government have recognised the concerns which noble Lords have voiced throughout our scrutiny of the Bill, and those which predated the scrutiny of it. We have tabled a number of amendments to make it even more explicit that these elements are covered by the Bill. We have tabled the new introductory Clause 1, which makes it clear that duties on providers are aimed at ensuring that services are safe by design. It also highlights that obligations on services extend to the design and operation of the service. These obligations ensure that the consideration of risks associated with the business model of a service is a fundamental aspect of the Bill.

My noble friend Baroness Harding of Winscombe worried that we had made the Bill worse by adding this. The new clause was a collaborative one, which we have inserted while the Bill has been before your Lordships’ House. Let me reassure her and other noble Lords as we conclude Report that we have not made it worse by so doing. The Bill will require services to take a safety by design approach to the design and operation of their services. We have always been clear that this will be crucial to compliance with the legislation. The new introductory Clause 1 makes this explicit as an overarching objective of the Bill. The introductory clause does not introduce any new concepts; it is an accurate summary of the key provisions and objectives of the Bill and, to that end, the framework and introductory statement are entirely compatible.

We also tabled amendments—which we debated last Monday—to Clause 209. These make it clear that functionalities contribute to the risk of harm to users, and that combinations of functionality may cumulatively drive up the level of risk. Amendment 281BA would amend the meaning of “functionality” within the Bill, so that it includes any system or process which affects users. This presents a number of concerns. First, such a broad interpretation would mean that any service in scope of the Bill would need to consider the risk of any feature or functionality, including ones that are positive for users’ online experience. That could include, for example, processes designed for optimising the interface depending on the user’s device and language settings. The amendment would increase the burden on service providers under the existing illegal content and child safety duties and would dilute their focus on genuinely risky functionality and design.

Second, by duplicating the reference to systems, processes and algorithms elsewhere in the Bill, it implies that the existing references in the Bill to the design of a service or to algorithms must be intended to capture matters not covered by the proposed new definition of “functionality”. This would suggest that references to systems and processes, and algorithms, mentioned elsewhere in the Bill, cover only systems, processes or algorithms which do not have an impact on users. That risks undermining the effectiveness of the existing duties and the protections for users, including children.

Amendment 268A introduces a further interpretation of features and functionality in the general interpretation clause. This duplicates the overarching interpretation of functionality in Clause 208 and, in so doing, introduces legal and regulatory uncertainty, which in turn risks weakening the existing duties. I hope that sets out for my noble friend Lady Harding and others our legal concerns here.

Amendment 281FA seeks to add to the interpretation of harm in Clause 209 by clarifying the scenarios in which harm may arise, specifically from services, systems and processes. This has a number of concerning effects. First, it states that harm can arise solely from a system and process, but a design choice does not in isolation harm a user. For example, the decision to use algorithms, or even the algorithm itself, is not what causes harm to a user—it is the fact that harmful content may be pushed to a user, or content pushed in such a manner that is harmful, for example repeatedly and in volume. That is already addressed comprehensively in the Bill, including in the child safety risk assessment duties.

Secondly, noble Lords should be aware that the drafting of the amendment has the effect of saying that harm can arise from proposed new paragraphs (a), (b) and (c)—

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - - - Excerpts

Can I just double-check what my noble friend has just said? I was lulled into a possibly false sense of security until we got to the point where he said “harmful” and then the dreaded word “content”. Does he accept that there can be harm without there needing to be content?

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

This is the philosophical question on which we still disagree. Features and functionality can be harmful but, for that harm to manifest, there must be some content which the feature or functionality presents to the user. We therefore keep talking about content, even when we are talking about features and functionality. A feature on its own which has no content is not what the noble Baroness, Lady Kidron, my noble friend Lady Harding and others are envisaging, but, to follow the logic of the point they are making, it requires some content for the feature or functionality to cause its harm.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

But the content may not be harmful.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Yes, even if the content is not harmful. We keep saying “content” because it is the way the content is disseminated, as the Bill sets out, but the features and functionalities can increase the risks of harm as well. We have addressed this through looking at the cumulative effects and in other ways.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

This is the key question. For example, let us take a feature that is pushing something at you constantly; if it was pushing poison at you then it would obviously be harmful, but if it was pushing marshmallows then they would be singularly not harmful but cumulatively harmful. Is the Minister saying that the second scenario is still a problem and that the surfeit of marshmallows is problematic and will still be captured, even if each individual marshmallow is not harmful?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Yes, because the cumulative harm—the accumulation of marshmallows in that example—has been addressed.

Noble Lords should also be aware that the drafting of Amendment 281FA has the effect of saying that harm can arise from proposed new paragraphs (a), (b) and (c)—for example, from the

“age or characteristics of the likely user group”.

In effect, being a child or possessing a particular characteristic may be harmful. This may not be the intention of the noble Baronesses who tabled the amendment, but it highlights the important distinction between something being a risk factor that influences the risk of harm occurring and something being harmful.

The Government are clear that these aspects should properly be treated as risk factors. Other parts of the Bill already make it clear that the ways in which a service is designed and used may impact on the risk of harm suffered by users. I point again to paragraphs (e) to (h) of Clause 10(6); paragraph (e) talks about the level of risk of functionalities of the service, paragraph (f) talks about the different ways in which the service is used, and so on.

We have addressed these points in the Bill, though clearly not to the satisfaction of my noble friend, the noble Baroness, Lady Kidron, and others. As we conclude Report, I recognise that we have not yet convinced everyone that our approach achieves what we all seek, though I am grateful for my noble friend’s recognition that we all share the same aim in this endeavour. As I explained to the noble Baroness, Lady Kidron, on her Amendment 35, I was asking her not to press it because, if she did, the matter would have been dealt with on Report and we would not be able to return to it at Third Reading.

As the Bill heads towards another place with this philosophical disagreement still bubbling away, I am very happy to commit to continuing to talk to your Lordships—particularly when the Bill is in another place, so that noble Lords can follow the debates there. I am conscious that my right honourable friend Michelle Donelan, who has had a busy maternity leave and has spoken to a number of your Lordships while on leave, returns tomorrow in preparation for the Bill heading to her House. I am sure she will be very happy to speak even more when she is back fully at work, but we will both be happy to continue to do so.

I think it is appropriate, in some ways, that we end on this issue, which remains an area of difference. With that promise to continue these discussions as the Bill moves towards another place, I hope that my noble friend will be content not to press these amendments, recognising particularly that the noble Baroness, Lady Kidron, has already inserted this thinking into the Bill for consideration in the other House.

--- Later in debate ---
Moved by
281C: Clause 209, page 175, line 17, leave out from “dissemination” to end of line 18
Member’s explanatory statement
This amendment is consequential on the next amendment to this Clause in my name.
--- Later in debate ---
Moved by
281G: Clause 209, page 175, line 33, leave out “and (4)” and insert “to (4)”
Member’s explanatory statement
This amendment is consequential on the amendment in my name inserting new subsection (3A) into this Clause.
--- Later in debate ---
Moved by
281H: Clause 210, page 176, line 12, leave out “section 11 (duty” and insert “sections 11 and 11A (duties”
Member’s explanatory statement
This amendment provides that the term “online safety functions” includes OFCOM’s functions under section 11A of the Communications Act 2003 (inserted by the new Clause proposed to be inserted after Clause 149 in my name) regarding OFCOM’s media literacy strategy (as well as OFCOM’s functions under section 11 of that Act).
--- Later in debate ---
Moved by
284: Clause 211, page 176, leave out lines 27 and 28
Member’s explanatory statement
This amendment removes a definition of “age assurance” from Clause 211 as that term is now defined separately where used.
--- Later in debate ---
Moved by
287: Clause 211, page 177, line 10, after “91(1)” insert “or (Information in connection with an investigation into the death of a child)(1)”
Member’s explanatory statement
This amendment revises the definition of “information notice” so that it includes a notice under the new Clause proposed in my name concerning OFCOM’s power to obtain information in connection with an investigation into the death of a child.
--- Later in debate ---
Moved by
291: Clause 212, page 179, leave out line 3
Member’s explanatory statement
This amendment removes the entry for “age assurance” in the index of defined terms as that term is now defined separately where used.
--- Later in debate ---
Moved by
295: Clause 212, page 180, line 17, leave out “(in Part 5)”
Member’s explanatory statement
This amendment updates the entry for pornographic content consequential on the amendment to Clause 211 which inserts a definition of that term into that Clause which applies for the purposes of the whole Bill.
--- Later in debate ---
Moved by
299: Clause 214, page 182, line 9, at end insert—
“(aa) section (Sharing or threatening to share intimate photograph or film);(ab) section 171(2);(ac) section (Repeals in connection with offences under section (Sharing or threatening to share intimate photograph or film));”
Member’s explanatory statement
This amendment revises the extent Clause so that the provisions mentioned extend to England and Wales only.
--- Later in debate ---
Moved by
300: Clause 215, page 182, line 37, leave out subsection (1)
Member’s explanatory statement
Clause 215(1) specifies which provisions of the Bill come into force on Royal Assent. This amendment omits subsection (1), but only because it is being moved further down in the section and replaced (see the amendment in my name below).

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - -

My Lords, I will make a brief statement on the devolution status of the Bill. I am pleased to inform your Lordships’ House that both the Scottish Parliament and Senedd Cymru have voted to grant consent for all the relevant provisions. For Scotland, these provisions are the power to amend the list of exempt educational institutions, the power to amend the list of child sexual exploitation and abuse offences and the new offence of encouraging or assisting serious self-harm. For Wales, the provisions are the power to amend the list of exempt educational institutions, the false communications offence, the threatening communications offence, the flashing images offences and the offence of encouraging or assisting serious self-harm.

As noble Lords will be aware, because the Northern Ireland Assembly is adjourned the usual process for seeking legislative consent in relation to Northern Ireland has not been possible. In the absence of legislative consent from the Northern Ireland Assembly, officials from the relevant UK and Northern Ireland departments have worked together to ensure that the Bill considers and reflects the relevant aspects of devolved legislation so that we may extend the following provisions to Northern Ireland: the power to amend the list of exempt educational institutions, the false communications offence, the threatening communications offence and the offence of encouraging or assisting serious self-harm. His Majesty’s Government have received confirmation in writing from the relevant Permanent Secretaries in Northern Ireland that they are content that nothing has been identified which would cause any practical difficulty in terms of the existing policy and legislative landscape. Historically, this area of legislation in Northern Ireland has mirrored that in Great Britain, and we believe that legislating without the consent of the Northern Ireland Assembly is justified in these exceptional circumstances and mitigates the risk of leaving Northern Ireland without the benefit of the Bill’s important reforms and legislative parity.

We remain committed to ensuring sustained engagement on the Bill with all three devolved Administrations as it progresses through Parliament. I beg to move that the Bill be read a third time.

Clause 44: Secretary of State’s powers of direction

Amendment 1

Moved by
1: Clause 44, page 45, line 30, leave out from “must” to end of line 31 and insert “, as soon as reasonably practicable, be published and laid before Parliament.”
Member’s explanatory statement
This amendment provides that, in addition to publishing a direction under this Clause, the Secretary of State must also lay it before Parliament. Additionally the Secretary of State is required to do these things as soon as reasonably practicable. There is an exemption in certain circumstances (as to which see the next amendment to this Clause in my name).
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, His Majesty’s Government have listened carefully to the views expressed in Committee and on Report and have tabled amendments to the Bill to address concerns raised by noble Lords. Let me first again express my gratitude to my noble friend Lady Stowell of Beeston for her constructive engagement on the Secretary of State’s powers of direction. As I said during our previous debate on this topic, I am happy to support her Amendments 139 and 140 from Report. The Government are therefore bringing forward two amendments to that effect today.

Noble Lords will recall that, whenever directing Ofcom about a code, the Secretary of State must publish that direction. Amendment 1 means that, alongside this, in most cases a direction will now need to be laid before Parliament. There may be some cases where it is appropriate for the Secretary of State to withhold information from a laid direction: for example, if she thinks that publishing it would be against the interests of national security. In these cases, Amendment 2 will instead require the Secretary of State to lay a statement before Parliament setting out that a direction has been given, the kind of code to which the direction relates and the reasons for not publishing it. Taken together, these amendments will ensure that your Lordships and Members of another place are always made aware as soon as a direction has been made and, wherever possible, understand the contents of that direction. I hope noble Lords will agree that, after the series of debates we have had, we have reached a sensible and proportionate position on these clauses and one which satisfies your Lordships’ House.

I am also grateful to the noble Baroness, Lady Kennedy of The Shaws, for her determined and collaborative work on the issue of threatening communications. Following the commitment I made to her on Report, I have tabled an amendment to make it explicit that the threatening communications offence captures threats where the recipient fears that someone other than the person sending the message will carry out the threat. I want to make it clear that the threatening communications offence, like other existing offences related to threats, already captures threats that could be carried out by third parties. This amendment does not change the scope of the offence, but the Government understand the desire of the noble Baroness and others to make this explicit in the Bill, and I am grateful to her for her collaboration.

Regarding Ofcom’s power of remote access, I am grateful to noble Lords, Lord Knight of Weymouth and Lord Allan of Hallam, my noble friend Lord Moylan and the noble Baroness, Lady Fox of Buckley, who unavoidably cannot be with us today, for raising their concerns about the perceived breadth of the power and the desire for further safeguards to ensure that it is used appropriately by the regulator.

I am also grateful to technology companies for the constructive engagement they have had with officials over the summer. As I set out on Report, the intention of our policy is to ensure clarity about Ofcom’s ability to observe empirical tests, which are a standard method for understanding algorithms and consequently for assessing companies’ compliance with the duties in the Bill. They involve taking a test data set, running it through an algorithmic system and observing the output.
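
For readers less familiar with this method, the following is a minimal illustrative sketch of such an empirical test, assuming a generic recommender system and a prepared test data set. None of the names below are drawn from the Bill, from Ofcom's tooling or from any provider's systems; all of them are hypothetical.

```python
# Illustrative sketch only: a generic "empirical test" of an algorithmic system,
# in the sense described above: take a test data set, run it through the system,
# and observe the output. All names are hypothetical; this is not Ofcom tooling
# or any provider's actual API.

from typing import Callable, Iterable


def run_empirical_test(
    recommend: Callable[[dict], list[str]],  # the system under observation (hypothetical)
    test_profiles: Iterable[dict],           # the prepared test data set
    flagged_items: set[str],                 # item IDs the tester treats as of concern
) -> dict:
    """Feed each test profile to the system and record how often flagged items appear."""
    total = 0
    flagged = 0
    for profile in test_profiles:
        output = recommend(profile)          # observe the output for this input
        total += len(output)
        flagged += sum(1 for item in output if item in flagged_items)
    rate = flagged / total if total else 0.0
    return {"items_served": total, "flagged_items_served": flagged, "flagged_rate": rate}


if __name__ == "__main__":
    # Toy stand-in for a provider's recommender, purely for demonstration.
    def toy_recommender(profile: dict) -> list[str]:
        return ["itemA", "itemB"] if profile.get("age", 18) < 18 else ["itemC"]

    report = run_empirical_test(
        toy_recommender,
        test_profiles=[{"age": 15}, {"age": 30}],
        flagged_items={"itemB"},
    )
    print(report)  # summarises how often flagged items appeared in the observed output
```

The sketch simply makes concrete the three steps described above: supply test inputs, run them through the algorithmic system, and observe and summarise the outputs.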

--- Later in debate ---
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I do not know how everyone has spent their summer, but this feels a bit like we have been working on a mammoth jigsaw puzzle and we are now putting in the final pieces. At times, through the course of this Bill, it has felt like doing a puzzle in the metaverse, where we have been trying to control an unreliable avatar that is actually assembling the jigsaw—but that would be an unfair description of the Minister. He has done really well in reflecting on what we have said, influencing his ministerial colleagues in a masterclass of managing upwards, and coming up with reasonable resolutions to previously intractable issues.

We are trusting that some of the outcome of that work will be attended to in the Commons, as the noble Baroness, Lady Morgan, has said, particularly the issues that she raised on risk, that the noble Baroness, Lady Kidron, raised on children’s safety by design, and that my noble friend Lady Merron raised on animal cruelty. We are delighted at where we think these issues have got to.

For today, I am pleased that the concerns of the noble Baroness, Lady Stowell, on Secretary of State powers, which we supported, have been addressed. I also associate myself with her comments on parliamentary scrutiny of the work of the regulator. Equally, we are delighted that the Minister has answered the concerns of my noble friend Lady Kennedy and that he has secured the legislative consent orders which he informed us of at the outset today. We would be grateful if the Minister could write to us answering the points of my noble friend Lord Rooker, which were well made by him and by the Delegated Powers Committee.

I am especially pleased to see that the issues which we raised at Report on remote access have been addressed. I feel smug, as I had to press quite hard for the Minister to leave the door open to come back at this stage on this. I am delighted that he is now walking through the door. Like the noble Lord, Lord Allan, I have just a few things that I would like clarification on—the proportional use of the powers, Ofcom taking into account user privacy, especially regarding live user data, and that the duration of the powers be time-limited.

Finally, I thank parliamentarians on all sides for an exemplary team effort. With so much seemingly falling apart around us, it is encouraging that, when we have common purpose, we can achieve a lot, as we have with this Bill.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, let me first address the points made by the noble Lord, Lord Rooker. I am afraid that, like my noble friend Lady Stowell of Beeston, I was not aware of the report of your Lordships’ committee. Unlike her, I should have been. I have checked with my private office and we have not received a letter from the committee, but I will ask them to contact the clerk to the committee immediately and will respond to this today. I am very sorry that this was not brought to my attention, particularly since the members of the committee met during the Recess to look at this issue. I have corresponded with my noble friend Lord McLoughlin, who chairs the committee, on each of its previous reports. Where we have disagreed, we have done so explicitly and set out our reasons. We have agreed with most of its previous recommendations. I am very sorry that I was not aware of this report and have not had the opportunity to provide answers for your Lordships’ House ahead of the debate.

Lord Rooker Portrait Lord Rooker (Lab)
- Hansard - - - Excerpts

The report was published on 31 August. It so happens that the committee has been forced to meet in an emergency session tomorrow morning because of government amendments that have been tabled to the levelling-up Bill, which will be debated next Wednesday, that require a report on the delegated powers, so we will have the opportunity to see what the Minister has said. I am very grateful for his approach.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

The committee will have a reply from me before it meets tomorrow. Again, I apologise. It should not be up to the committee to let the Minister know; I ought to have known about it.

I am very grateful to noble Lords for their support of the amendments that we have tabled in this group, which reflect the collaborative nature of the work that we have done and the thought which has been put into this by my ministerial colleagues and me, and by the Bill team, over the summer. I will have a bit more to say on that when I move that the Bill do now pass in a moment, but I am very grateful to those noble Lords who have spoken at this stage for highlighting the model of collaborative working that the Bill has shown.

The noble Baroness, Lady Ritchie of Downpatrick, asked for an update on timetables. Some of the implementation timetables which Ofcom has assessed depend a little on issues which may still change when the Bill moves to another place. If she will permit it, once they have been resolved I will write with the latest assessments from Ofcom, and, if appropriate, from us, on the implementation timelines. They are being recalculated in the light of amendments that have been made to the Bill and which may yet further change. However, everybody shares the desire to implement the Bill as swiftly as possible, and I am grateful that your Lordships’ work has helped us do our scrutiny with that in mind.

The noble Lord, Lord Allan, asked some questions about the remote viewing power. On proportionality, Ofcom will have a legal duty to exercise its power to view information remotely in a way that is proportionate, ensuring, as I said, that undue burdens are not placed on businesses. In assessing proportionality in line with this requirement, Ofcom would need to consider the size and resource capacity of a service when choosing the most appropriate way of gathering information. To comply with this requirement, Ofcom would also need to consider whether there was a less onerous method of obtaining the necessary information.

On the points regarding that and intrusion, Ofcom expects to engage with providers as appropriate about how to obtain the information it needs to carry out its functions. Because of the requirement on Ofcom to exercise its information-gathering powers proportionately, it would need to consider less onerous methods. As I said, that might include an audit or a skilled persons report, but we anticipate that, for smaller services in particular, those options could be more burdensome than Ofcom remotely viewing information.

--- Later in debate ---
Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

Will my noble friend draw attention to the part of Clause 122 that says that Ofcom cannot issue a requirement which is not technically feasible, as he has just said? That does not appear in the text of the clause, and it creates a potential conflict. Even if the requirement is not technically feasible—or, at least, if the platform claims that it is not—Ofcom’s power to require it is not mitigated by the clause. It still has the power, which it can exercise, and it can presumably take some form of enforcement action if it decides that the company is not being wholly open or honest. The technical feasibility is not built into the clause, but my noble friend has just added it, as with quite a lot of other stuff in the Bill.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

It has to meet minimum standards of accuracy and must have privacy safeguards in place. The clause talks about those in a positive sense, which sets out the expectation. I am happy to make clear, as I have, what that means: if the appropriate technology does not exist that meets these requirements, then Ofcom will not be able to use Clause 122 to require its use. I hope that that satisfies my noble friend.

Amendment 1 agreed.
--- Later in debate ---
Moved by
2: Clause 44, page 45, line 31, at end insert—
“(7A) If the Secretary of State considers that publishing and laying before Parliament a direction given under this section would be against the interests of national security, public safety or relations with the government of a country outside the United Kingdom—
(a) subsection (7)(c) does not apply in relation to the direction, and
(b) the Secretary of State must, as soon as reasonably practicable, publish and lay before Parliament a document stating—
(i) that a direction has been given,
(ii) the kind of code of practice to which it relates, and
(iii) the reasons for not publishing it.”
Member’s explanatory statement
This amendment provides that in the circumstances mentioned in the amendment the Secretary of State is not required to publish and lay before Parliament a direction given under this Clause but must instead publish and lay before Parliament a document stating that a direction has been given, the code of practice to which it relates and the reasons for not publishing it.
--- Later in debate ---
Moved by
4: Clause 52, page 52, line 12, leave out “subsection (9) of those sections” and insert “section 23(10) or 34(9)”
Member’s explanatory statement
This is a technical amendment which substitutes the correct cross-references into this provision.
--- Later in debate ---
Moved by
5: Clause 95, page 85, line 12, at end insert—
“(za) references to a service meeting the Category 1, Category 2A or Category 2B threshold conditions are to a service meeting those conditions in a way specified in regulations under paragraph 1 of Schedule 11 (see paragraph 1(4) of that Schedule);”
Member’s explanatory statement
This amendment improves the drafting to clarify that a service “meets the Category 1 threshold conditions” (for example) if the service meets them in a way set out in regulations under Schedule 11.
--- Later in debate ---
Moved by
6: Clause 98, page 88, line 19, after “which” insert “does not meet the Category 1 threshold conditions and which”
Member’s explanatory statement
This amendment improves the drafting to clarify that services which are already Category 1 services, or which meet the conditions to be a Category 1 service, do not need to be assessed by OFCOM to see if they should be included in the list which is provided for by Clause 98.
--- Later in debate ---
Moved by
7: Clause 101, page 91, line 23, leave out from “that” to end of line 26 and insert “a person authorised by OFCOM is able to view remotely—”
Member’s explanatory statement
This amendment changes the wording of one of OFCOM’s information powers. The power now refers to viewing information remotely, rather than remotely accessing a service; the power is exercisable by a person authorised by OFCOM; and the power may only be exercised in relation to information as mentioned in Clause 101(3)(a) and (b).
--- Later in debate ---
Moved by
11: Clause 103, page 94, line 27, at end insert—
“(4A) An information notice requiring a person to take steps of a kind mentioned in section 101(3) must give the person at least seven days’ notice before the steps are required to be taken.”
Member’s explanatory statement
This amendment has the effect that if a person receives a notice from OFCOM requiring them to allow OFCOM to remotely view information, they must be given at least 7 days to comply with the notice.
--- Later in debate ---
Moved by
12: Clause 121, page 105, line 32, after “101” insert “, 102”
Member’s explanatory statement
Clause 121 is about the admissibility of statements in criminal proceedings. This amendment adds Clause 102 to the list of relevant information powers (information in connection with an investigation into the death of a child).
--- Later in debate ---
Moved by
15: Clause 162, page 144, line 29, at end insert—
““age assurance” means age verification or age estimation;”
Member’s explanatory statement
This amendment adds a definition of “age assurance” into this Clause.
--- Later in debate ---
Moved by
16: Clause 182, page 159, line 29, after “out” insert “(whether or not by the person sending the message)”
Member’s explanatory statement
This amendment makes it clear that the threatening communications offence in Clause 182 may be committed by a person who sends a threatening message regardless of who might carry out the threat.
--- Later in debate ---
Moved by
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay
- Hansard - -

That the Bill do now pass.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, in begging to move that the Bill do now pass, I add my words of thanks to all noble Lords who have been involved over many years and many iterations of the Bill, particularly during my time as the Minister and in the diligent scrutiny we have given it in recent months. The Bill will establish a vital legislative framework, making the internet safer for all, particularly for children. We are now closer than ever to achieving that important goal. In a matter of months from Royal Assent, companies will be required to put in place protections to tackle illegal content on their services or face huge fines. I am very grateful to noble Lords for the dedication, attention and time they have given to the Bill while it has been before your Lordships’ House.

The Bill will mark a significant change in children’s safety online. Last month, data from UK police forces showed that 6,350 offences relating to sexual communications with a child were recorded last year alone. These are horrifying statistics which underline the importance of the Bill in building a protective shield for our children online. We cannot let perpetrators of such abhorrent crimes stalk children online and hide behind their screens, nor let companies continue to turn a blind eye to the harm being done to children on their services. We are working closely with Ofcom to make sure that the protections for children established by the Bill are enforced as soon as possible, and we have been clear that companies should not wait for the legislation to come into force before taking action.

The aim of keeping children safe online is woven throughout the Bill, and the changes that we have made throughout its passage in your Lordships’ House have further bolstered it. In order to provide early and clear guidance to companies and Ofcom regarding the content from which children must be protected, the categories of primary priority and priority content which is harmful to children will now be set out in the Bill, rather than being addressed later via secondary legislation.

Following another amendment made during your Lordships’ scrutiny, providers of the largest services will also be required to publish summaries of their risk assessments for illegal content and content which is harmful to children. Further changes to the Bill have made sure that technology executives must take more responsibility for the safety of those who use their websites. Senior managers will face criminal liability if they fail to comply with steps set by Ofcom following enforcement action to keep children safe on their platforms, with the offence punishable by up to two years in prison.

Noble Lords have rightly raised concerns about what the fast-changing technological landscape will mean for children. The Bill faces the future and is designed to keep pace with emerging technological changes such as AI-generated pornography.

Child sexual exploitation and abuse content generated by AI is illegal, regardless of whether it depicts a real child or not, and the Bill makes it clear that technology companies will be required to identify this content proactively and remove it. Whatever the future holds, the Bill will ensure that guard rails are in place to allow our children to explore it safely online.

I have also had the pleasure of collaborating with noble Lords from across your Lordships’ House who have championed the important cause of strengthening protections for women and girls online, who we know disproportionately bear the brunt of abhorrent behaviour on the internet. Following changes made earlier to the Bill, Ofcom will be required to produce and publish guidance which summarises in one clear place measures that should be taken to reduce the risk of harm to women and girls online. The amendment will also oblige Ofcom to consult when producing the guidance, ensuring that it reflects the voices of women and girls as well as the views of experts on this important issue.

The Bill strikes a careful balance: it tackles criminal activity online and protects our children while enshrining freedom of expression in its legislative framework. A series of changes to the Bill has ensured that adults are provided with greater control over their online experience. All adult users of the largest services will have access to tools which, if they choose to use them, will allow them to filter out content from non-verified users and to reduce the likelihood of encountering abusive content. These amendments, which have undergone careful consideration and consultation, will ensure that the Bill remains proportionate, clear and future-proof.

I am very grateful to noble Lords who have helped us make those improvements and many more. I am conscious that a great number of noble Lords who have taken part in our debates were part of the pre-legislative scrutiny some years ago. They know the Bill very well and they know the issues well, which has helped our debates be well informed and focused. It has helped the scrutiny of His Majesty’s Government, and I hope that we have risen to that.

I am very grateful to all noble Lords who have made representations on behalf of families who have suffered bereavements because of the many terrible experiences online of their children and other loved ones. There are too many for me to name now, and many more who have not campaigned publicly but who I know have been following the progress of the Bill carefully, and we remember them all today.

Again, there are too many noble Lords for me to single out all those who have been so vigilant on this issue. I thank my colleagues on the Front Bench, my noble friends Lord Camrose and Lord Harlech, and on the Front Bench opposite the noble Lords, Lord Knight and Lord Stevenson, and the noble Baroness, Lady Merron. On the Liberal Democrat Benches, I thank the noble Lords, Lord Clement-Jones and Lord Allan of Hallam—who has been partly on the Front Bench and partly behind—who have been working very hard on this.

I also thank the noble Baroness, Lady Kidron, whom I consider a Front-Bencher for the Cross Benches on this issue. She was at the vanguard of many of these issues long before the Bill came to your Lordships’ House and will continue to be long after. We are all hugely impressed by her energy and personal commitment, following the debates not only in our own legislature but in other jurisdictions. I am grateful to her for the collaborative nature of her work with us.

I will not single out other noble Lords, but I am very grateful to them from all corners of the House. They have kicked the tyres of the Bill and asked important questions; they have given lots of time and energy to it and it is a better Bill for that.

I put on record my thanks to the huge team in my department and the Department for Science, Innovation and Technology, who, through years of work, expertise and determination, have brought the Bill to this point. I am grateful to the staff of your Lordships’ House and to colleagues from the Office of the Parliamentary Counsel, in particular Maria White and Neil Shah, and, at the Department for Science, Innovation and Technology, Sarah Connolly, Orla MacRae, Caroline Bowman and Emma Hindley as well as their huge teams, including those who have worked on the Bill over the years but are not currently working on it. They have worked extremely hard and been generous with their time to noble Lords throughout our work.

The Bill will make a vital difference to people’s safety online, especially children’s safety. It has been a privilege to play a part in it. I was working as a special adviser at the Home Office when this area of work was first mooted. I remember that, when this Bill was suggested in the 2017 manifesto, people suggested that regulating the internet was a crazy idea. The biggest criticism now is that we have not done it sooner. I am very grateful to noble Lords for doing their scrutiny diligently but speedily, and I hope to see the Bill on the statute book very soon. I beg to move that the Bill do now pass.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I am grateful to the Minister for his very kind words to everybody, particularly my Front Bench and me. I also wish him a speedy recovery from his recent illness, although I was less sympathetic when I discovered how much he has been “managing upwards”—in the words of my noble friend Lord Knight—and achieving for us in the last few days. He has obviously been recovering and I am grateful for that. The noble Lord has steered the Bill through your Lordships’ House with great skill and largely single-handedly. It has been a pleasure to work with him, even when he was turning down our proposals and suggestions for change, which he did in the nicest possible way but absolutely firmly.

--- Later in debate ---
Lord Bethell Portrait Lord Bethell (Con)
- View Speech - Hansard - - - Excerpts

I rise briefly to raise the question of access to data by academics and research organisations. Before I do so, I want to express profound thanks to noble Lords who have worked so collaboratively to create a terrific Bill that will completely transform and hold to account those involved in the internet, and make it a safer place. That was our mission and we should be very proud of that. I cannot single out noble Peers, with the exception of the noble Baroness, Lady Kidron, with whom I worked collaboratively both on age assurance and on harms. It was a partnership I valued enormously and hope to take forward. Others from all four corners of the House contributed to the parts of the Bill that I was particularly interested in. As I look around, I see so many friends who stuck their necks out and spoke so movingly, for which I am enormously grateful.

The question of data access is one of the loose ends that did not quite make it into the Bill. I appreciate the efforts of my noble friend the Minister, the Secretary of State and the Bill team in this matter and their efforts to try and wangle it in; I accept that it did not quite make it. I would like to hear reassurance from my noble friend that this is something that the Government are prepared to look at in future legislation. If he could provide any detail on how and in which legislation it could be revisited, I would be enormously grateful.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, I will be brief and restrict myself to responding to the questions which have been raised. I will hold to my rule of not trying to thank all noble Lords who have played their part in this scrutiny, because the list is indeed very long. I agree with what the noble Lord, Lord Clement-Jones, said about this being a Back-Bench-driven Bill, and there are many noble Lords from all corners of the House and the Back Benches who have played a significant part in it. I add my thanks to the noble Baroness, Lady Benjamin, not just for her kind words, but for her years of campaigning on this, and to my noble friend Lord Bethell who has worked with her—and others—closely on the issues which she holds dear.

I also thank my noble friend Lord Moylan who has often swum against the tide of debate, but very helpfully so, and on important matters. In answer to his question about Wikipedia, I do not have much to add to the words that I have said a few times now about the categorisation, but on his concerns about the parliamentary scrutiny for this I stress that it is the Secretary of State who will set the categorisation thresholds. She is, of course, a Member of Parliament, and accountable to it. Ofcom will designate services based on those thresholds, so the decision-making can be scrutinised in Parliament, even if not in the way he would have wished.

I agree that we should all be grateful to the noble Lord, Lord Allan of Hallam, because he addressed some of the questions raised by my noble friend Lady Stowell of Beeston. In brief, the provision is flexible for where the technological solutions do not currently exist, because Ofcom can require services to develop or source new solutions.

This close to the gracious Speech, I will not point to a particular piece of legislation in which we might revisit the issue of researchers’ access, as raised by my noble friend Lord Bethell, but I am happy to say that we will certainly look at that again, and I know that he will take the opportunity to raise it.

Noble Lords on the Front Benches opposite alluded to the discussions which are continuing—as I committed on Report to ensure that noble Lords are able to be part of discussions as the Bill heads to another place—on functionalities and on the amendment of my noble friend Lady Morgan on category 1 services. She is one of a cavalcade of former Secretaries of State who have been so helpful in scrutinising the Bill. It is for another place to debate them, but I am grateful to noble Lords who have given their time this week to have the discussions which I committed to have and will continue to have as the Bill heads there, so that we can follow those issues hopefully to a happy resolution.

I thank my noble friend Lady Harding of Winscombe for the concessions that she wrought on Report, and for the part that she has played in discussions. She has also given a great deal of time outside the Chamber.

We should all be very grateful to the noble Lord, Lord Grade of Yarmouth, who has sat quietly throughout most of our debates—understandably, in his capacity as chairman of Ofcom—but he has followed them closely and taken those points to the regulator. Dame Melanie Dawes and all the team there stand ready to implement this work and we should be grateful to the noble Lord, Lord Grade of Yarmouth, and to all those at Ofcom who are ready to put it into action.

Bill passed and returned to the Commons with amendments.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Moved by
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay
- View Speech - Hansard - -

That this House do not insist on its Amendment 17 and do agree with the Commons in their Amendments 17A and 17B in lieu.

17A: Clause 10, page 9, line 30, leave out paragraph (e) and insert—
“(e) the extent to which the design of the service, in particular its functionalities, affects the level of risk of harm that might be suffered by children, identifying and assessing those functionalities that present higher levels of risk, including functionalities—
(i) enabling adults to search for other users of the service (including children), or
(ii) enabling adults to contact other users (including children) by means of the service;”
17B: Clause 10, page 9, line 38, after “used,” insert “including functionalities or other features of the service that affect how much children use the service (for example a feature that enables content to play automatically),”
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- Hansard - -

My Lords, I beg to move Motion A and, with the leave of the House, I shall also speak to Motions B to H.

I am pleased to say that the amendments made in your Lordships’ House to strengthen the Bill’s provisions were accepted in another place. His Majesty’s Government presented a number of amendments in lieu of changes proposed by noble Lords, which are before your Lordships today.

I am grateful to my noble friend Lady Morgan of Cotes for her continued engagement on the issue of small but high-risk platforms. The Government were happy to accept her proposed changes to the rules for determining the conditions that establish which services will be designated as category 1 or 2B services. In making the regulations, the Secretary of State will now have the discretion to decide whether to set a threshold based on either the number of users or the functionalities offered, or on both factors. Previously, the threshold had to be based on a combination of both.

It remains the expectation that services will be designated as category 1 services only where it is appropriate to do so, to ensure that the regime remains proportionate. We do not, for example, expect to apply these duties to large companies with very limited functionalities. This change, however, provides greater flexibility to bring smaller services with particular functionalities into scope of category 1 duties, should it be necessary to do so. As a result of this amendment, we have also made a small change to Clause 98—the emerging services list—to ensure that it makes operational sense. Before my noble friend’s amendment, a service would be placed on the emerging services list if it met the functionality condition and 75% of the user number threshold. Under the clause as amended, a service could be designated as category 1 without meeting both a functionality and a user condition. Without this change, Ofcom would, in such an instance, be required to list only services which meet the 75% condition.

We have heard from both Houses about the importance of ensuring that technology platforms are held to account for the impact of their design choices on children’s safety. We agree and the amendments we proposed in another place make it absolutely clear that providers must assess the impact of their design choices on the risk of harm to children, and that they deliver robust protections for children on all areas of their service. I thank in particular the noble Baroness, Lady Kidron, the noble Lords, Lord Stevenson of Balmacara and Lord Clement-Jones, my noble friend Lady Harding of Winscombe and the right reverend Prelate the Bishop of Oxford for their hard work to find an acceptable way forward. I also thank Sir Jeremy Wright MP for his helpful contributions to this endeavour.

Noble Lords will remember that an amendment from the noble Baroness, Lady Merron, sought to require the Secretary of State to review certain offences relating to animals and, depending on the outcome of that review, to list these as priority offences. To accelerate protections in this important area, the Government have tabled an amendment in lieu listing Section 4(1) of the Animal Welfare Act 2006 as a priority offence. This will mean that users can be protected from animal torture material more swiftly. Officials at the Department for Environment, Food and Rural Affairs have worked closely with the RSPCA and are confident that the Section 4 offence, unnecessary suffering of an animal, will capture a broad swathe of illegal activity. Adding this offence to Schedule 7 will also mean that linked inchoate offences, such as encouraging or assisting this behaviour, are captured by the illegal content duties. I am grateful to the noble Baroness for raising this matter, for her discussions on them with my noble friend Lord Camrose and for her support for the amendment we are making in lieu.

To ensure the speedy implementation of the Bill’s regime, we have added Clauses 116 to 118, which relate to the disclosure of information by Ofcom, and Clauses 170 and 171, which relate to super-complaints, to the provisions to be commenced immediately on Royal Assent. These changes will allow Ofcom and the Government to hold the necessary consultations as quickly as possible after Royal Assent. As noble Lords know, the intention of the Bill is to make the UK the safest place in the world to be online, particularly for children. I firmly believe that the Bill before your Lordships today will do that, strengthened by the changes made in this House and by the collaborative approach that has been shown, not just in all quarters of this Chamber but between both Houses of Parliament. I beg to move.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I thank the Minister very warmly for his introduction today. I shall speak in support of Motions A to H inclusive. Yes, I am very glad that we have agreement at this final milestone of the Bill before Royal Assent. I pay tribute to the Minister and his colleagues, to the Secretary of State, to the noble Baronesses, Lady Morgan, Lady Kidron and Lady Merron, who have brought us to this point with their persistence over issues such as functionalities, categorisation and animal cruelty.

This is not the time for rehearsing any reservations about the Bill. The Bill must succeed and implementation must take place swiftly. So, with many thanks to the very many, both inside and outside this House, who have worked so hard on the Bill for such a long period, we on these Benches wish the Bill every possible success. He is in his place, so I can say that it is over to the noble Lord, Lord Grade, and his colleagues at Ofcom, in whom we all have a great deal of confidence.

--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I too thank the Minister for his swift and concise introduction, which very carefully covered the ground without raising any issues that we have to respond to directly. I am grateful for that as well.

The noble Lord, Lord Clement-Jones, was his usual self. The only thing that I missed, of course, was the quotation that I was sure he was going to give from the pre-legislative scrutiny report on the Bill, which has been his constant prompt. I also think that the noble Baroness, Lady Finlay, was very right to remind us of those outside the House who we must remember as we reach the end of this stage.

Strangely, although we are at the momentous point of allowing this Bill to go forward for Royal Assent, I find that there is actually very little that needs to be said. In fact, everything has been said by many people over the period; trying to make any additional points would be meretricious persiflage. So I will make two brief points to wind up this debate.

First, is it not odd to reflect on the fact that this historic Parliament, with all our archaic rules and traditions, has the capacity to deal with a Bill that is regulating a technology which most of us have difficulty in comprehending, let alone keeping up with? However, we have done a very good job and, as a result, I echo the words that have already been said; I think the internet will now be a much safer place for children to enjoy and explore, and the public interest will be well served by this Bill, even though we accept that it is likely to only be the first of a number of Bills that will be needed in the years to come.

Secondly, I have been reflecting on the offer I made to the Government at Second Reading, challenging them to work together with the whole House to get the best Bill that we could out of what the Commons had presented to us. That of course could have turned out to be a slightly pointless gesture if nobody had responded positively—but they did. I particularly thank the Minister and the Bill team for rising to the challenge. There were problems initially, but we got there in the end.

More widely, there was, I know, a worry that committing to working together would actually stifle debate and somehow limit our crucial role of scrutiny. But actually I think it had the opposite effect. Some of the debates we had in Committee, from across the House, were of the highest standard, and opened up issues which needed to be resolved. People listened to each other and responded as the debate progressed. The discussion extended to the other place. It is very good to see Sir Jeremy Wright here; he has played a considerable role in resolving the final points.

It will not work for all Bills, but if the politics can be ignored, or at least put aside, it seems to make it easier to get at the issues that need to be debated in the round. In suggesting this approach, I think we may have found a way of getting the best out of our House—something that does not always occur. I hope that lesson can be listened to by all groups and parties.

For myself, participating in this Bill and the pre-legislative scrutiny committee which preceded it has been a terrific experience. Sadly, a lot of people who contributed to our discussions over that period cannot be here today, but I hope they read this speech in Hansard, because I want to end by thanking them, and those here today, for being part of this whole process. We support the amendments before the House today and wish good luck to the noble Lord, Lord Grade.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, I am very conscious that this is not the end of the road. As noble Lords have rightly pointed out in wishing the Bill well, attention now moves very swiftly to Ofcom, under the able chairmanship of the noble Lord, Lord Grade of Yarmouth, who has participated, albeit silently, in our proceedings before, and to the team of officials who stand ready to implement this swiftly. The Bill benefited from pre-legislative scrutiny. A number of noble Lords who have spoken throughout our deliberations took part in the Joint Committee of both Houses which did that. It will also benefit from post-legislative scrutiny, through the Secretary of State’s review, which will take place between two and five years after Royal Assent. I know that the noble Lords who have worked so hard on this Bill for many years will be watching it closely as it becomes an Act of Parliament, to ensure that it delivers what we all want it to.

The noble Lord, Lord Stevenson, reminded us of the challenge he set us at Second Reading: to minimise the votes in dissent and to deliver this Bill without pushing anything to ping-pong. I think I was not the only one in the Chamber who was sceptical about our ability to do so, but it is thanks to the collaborative approach and the tone that he has set that we have been able to do that. That is a credit to everybody involved.

--- Later in debate ---
Moved by
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay
- Hansard - -

That this House do not insist on its Amendment 20, to which the Commons have disagreed for their Reason 20A.

20A: Because the Bill already makes sufficient provision requiring providers of user-to-user services to mitigate the impact of harm to children online.
Moved by
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay
- Hansard - -

That this House do not insist on its Amendment 22, to which the Commons have disagreed for their Reason 22A.

22A: Because the Bill already makes sufficient provision requiring providers of user-to-user services to mitigate the impact of harm to children online.
Moved by
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay
- Hansard - -

That this House do not insist on its Amendment 81 and do agree with the Commons in their Amendments 81A, 81B and 81C in lieu.

81A: Clause 25, page 26, line 31, leave out paragraph (c) and insert—
“(c) the extent to which the design of the service, in particular its functionalities, affects the level of risk of harm that might be suffered by children, identifying and assessing those functionalities that present higher levels of risk, including a functionality that makes suggestions relating to users’ search requests (predictive search functionality);”
--- Later in debate ---
Moved by
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay
- Hansard - -

That this House do not insist on its Amendment 148 and do agree with the Commons in their Amendment 148A in lieu.

148A: Page 205, line 36, at end insert—
“Animal welfare
32A An offence under section 4(1) of the Animal Welfare Act 2006 (unnecessary suffering of an animal).”
Moved by
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay
- Hansard - -

That this House do agree with the Commons in their Amendment 182A.

182A (as an amendment to Amendment 182): Line 1, leave out ““presented by content”” and insert ““content on””
Moved by
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay
- Hansard - -

That this House do agree with the Commons in their Amendments 349A and 349B.

349A (as an amendment to Amendment 349): Line 20, at end insert—
“(qa) sections 104 to 106;”
--- Later in debate ---
Moved by
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay
- Hansard - -

That this House do agree with the Commons in their Amendments 391A and 391B.

391A (as an amendment to Amendment 391): Line 1, after ““and”” insert “at least one specified condition about”