The Chair

With this it will be convenient to discuss amendment 28, in clause 10, page 9, line 18, at end insert—

“(ba) matters relating to CSEA content including—

(i) the level of illegal images blocked at the upload stage and number and rates of livestreams of CSEA in public and private channels terminated; and

(ii) the number and rates of images and videos detected and removed by different tools, strategies and/or interventions.”

This amendment requires the children’s risk assessment to consider matters relating to CSEA content.

Barbara Keeley (Worsley and Eccles South) (Lab)

As this is the first time I have spoken in the Committee, may I say that it is a pleasure to serve with you in the Chair, Ms Rees? I agree with my hon. Friend the Member for Pontypridd that we are committed to improving the Bill, despite the fact that we have some reservations, which we share with many organisations, about some of the structure of the Bill and some of its provisions. As my hon. Friend has detailed, there are particular improvements to be made to strengthen the protection of children online, and I think the Committee’s debate on this section is proving fruitful.

Amendment 28 is a good example of where we must go further if we are to achieve the goal of the Bill and protect children from harm online. The amendment seeks to require regulated services to assess their level of risk based, in part, on the frequency with which they are blocking, detecting and removing child sexual exploitation and abuse content from their platforms. By doing so, we will be able to ascertain the reality of their overall risk and the effectiveness of their existing response.

The addition of livestreamed child sexual exploitation and abuse content not only acknowledges first-generation CSEA content, but recognises that livestreamed CSEA content happens on both public and private channels, and that they require different methods of detection.

Furthermore, amendment 28 details the practical information needed to assess whether the action being taken by a regulated service is adequate in countering the production and dissemination of CSEA content, in particular first-generation CSEA content. Separating the rates of terminated livestreams of CSEA in public and private channels is important, because those rates may vary widely depending on how CSEA content is generated. By specifying tools, strategies and interventions, the amendment would ensure that the systems in place to detect and report CSEA are adequate, and that is why we would like it to be part of the Bill.

Chris Philp

The Government support the spirit of amendments 17 and 28, which seek to achieve critical objectives, but the Bill as drafted delivers those objectives. In relation to amendment 17 and cross-platform risk, clause 8 already sets out harms and risks—including CSEA risks—that arise by means of the service. That means through the service to other services, as well as on the service itself, so that is covered.

Amendment 28 calls for the risk assessments expressly to cover illegal child sexual exploitation content, but clause 8 already requires that to happen. Clause 8(5) states that the risk assessment must cover the

“risk of individuals who are users of the service encountering…each kind of priority illegal content”.

If we follow through the definition of priority illegal content, we find all those CSEA offences listed in schedule 6. The objective of amendment 28 is categorically delivered by clause 8(5)(b), referencing onwards to schedule 6.

--- Later in debate ---
Children’s risk assessment duties
Barbara Keeley

I beg to move amendment 15, in clause 10, page 8, line 41, at end insert—

“(4A) A duty for the children’s risk assessment to be approved by either—

(a) the board of the entity; or, if the organisation does not have a board structure,

(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the children’s risk assessment duties, and reports directly into the most senior employee of the entity.”

This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for children’s risk assessments.

The Chair

With this it will be convenient to discuss the following:

Amendment 11, in clause 10, page 9, line 2, at end insert—

“(5A) A duty to publish the children’s risk assessment and proactively supply this to OFCOM.”

This amendment creates a duty to publish the children’s risk assessment and supply it to Ofcom.

Amendment 27, in clause 10, page 9, line 25, after “facilitating” insert “the production of illegal content and”

This amendment requires the children’s risk assessment to consider the production of illegal content.

Clause 10 stand part.

Amendment 16, in clause 25, page 25, line 10, at end insert—

“(3A) A duty for the children’s risk assessment to be approved by either—

(a) the board of the entity; or, if the organisation does not have a board structure,

(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the children’s risk assessment duties, and reports directly into the most senior employee of the entity.”

This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for children’s risk assessments.

Amendment 13, in clause 25, page 25, line 13, at end insert—

“(4A) A duty to publish the children’s risk assessment and proactively supply this to OFCOM.”

This amendment creates a duty to publish the children’s risk assessment and supply it to Ofcom.

Amendment 32, in clause 25, page 25, line 31, after “facilitating” insert “the production of illegal content and”

This amendment requires the children’s risk assessment to consider risks relating to the production of illegal content.

Clause 25 stand part.

Barbara Keeley

I will speak to other amendments in this group as well as amendment 15. The success of the Bill’s regulatory framework relies on regulated companies carefully risk-assessing their platforms. Once risks have been identified, the platform can concentrate on developing and implementing appropriate mitigations. However, up to now, boards and top executives have not taken the risk to children seriously. Services have either not considered producing risk assessments or, if they have done so, they have been of limited efficacy and failed to identify and respond to harms to children.

In evidence to the Joint Committee, Frances Haugen explained that many of the corporate structures involved are flat, and accountability for decision making can be obscure. At Meta, that means teams will focus only on delivering against key commercial metrics, not on safety. Children’s charities have also noted that corporate structures in the large technology platforms reward employees who move fast and break things. Those companies place incentives on increasing return on investment rather than child safety. An effective risk assessment and risk mitigation plan can impact on profit, which is why we have seen so little movement from companies to take the measures themselves without the duty being placed on them by legislation.

It is welcome that clause 10 introduces a duty to risk-assess user-to-user services that are likely to be accessed by children. But, as my hon. Friend the Member for Pontypridd said this morning, it will become an empty, tick-box exercise if the Bill does not also introduce the requirement for boards to review and approve the risk assessments.

The Joint Committee scrutinising the draft Bill recommended that the risk assessment be approved at board level. The Government rejected that recommendation on the grounds that Ofcom could include that in its guidance on producing risk assessments. As with much of the Bill, it is difficult to blindly accept promised safeguards when we have not seen the various codes of practice and guidance materials. The amendments would make sure that decisions about and awareness of child safety went right to the top of regulated companies. The requirement to have the board or a senior manager approve the risk assessment will hardwire the safety duties into decision making and create accountability and responsibility at the most senior level of the organisation. That should trickle down the organisation and help embed a culture of compliance across it. Unless there is a commitment to child safety at the highest level of the organisation, we will not see the shift in attitude that is urgently needed to keep children safe, and which I believe every member of the Committee subscribes to.

On amendments 11 and 13, it is welcome that we have risk assessments for children included in the Bill, but the effectiveness of that duty will be undermined unless the risk assessments can be available for scrutiny by the public and charities. In the current version of the Bill, risk assessments will only be made available to the regulator, which we debated on an earlier clause. Companies will be incentivised to play down the likelihood of currently emerging risks because of the implications of having to mitigate against them, which may run counter to their business interests. Unless the risk assessments are published, there will be no way to hold regulated companies to account, nor will there be any way for companies to learn from one another’s best practice, which is a very desirable aim.

The current situation shows that companies are unwilling to share risk assessments even when requested. In October 2021, following the whistleblower disclosures made by Frances Haugen, the National Society for the Prevention of Cruelty to Children led a global coalition of 60 child protection organisations that urged Meta to publish its risk assessments, including its data privacy impact assessments, which are a legal requirement under data protection law. Meta refused to share any of its risk assessments, even in relation to child sexual abuse and grooming. The company argued that risk assessments were live documents and it would not be appropriate for it to share them with any organisation other than the Information Commissioner’s Office, to whom it has a legal duty to disclose. As a result, civil society organisations and the charities that I talked about continue to be in the dark about whether and how Meta has appropriately identified online risk to children.

Making risk assessments public would support the smooth running of the regime and ensure its broader effectiveness. Civil society and other interested groups would be able to assess and identify any areas where a company might not be meeting its safety duties and make full, effective use of the proposed super-complaints mechanism. It would also help civil society organisations to hold the regulated companies and the regulator, Ofcom, to account.

As we have seen from evidence sessions, civil society organisations are often at the forefront of understanding and monitoring the harms that are occurring to users. They have an in-depth understanding of what mitigations may be appropriate and they may be able to support the regulator to identify any obvious omissions. The success of the systemic risk assessment process will be significantly underpinned by and reliant upon the regulator’s being able to rapidly and effectively identify new and emerging harms, and it is highly likely that the regulator will want to draw on civil society expertise to ensure that it has highly effective early warning functions in place.

However, civil society organisations will be hampered in that role if they remain unable to determine what, if anything, companies are doing to respond to online threats. If Ofcom is unable to rapidly identify new and emerging harms, the resulting delays could mean entire regulatory cycles where harms were not captured in risk profiles or company risk assessments, and an inevitable lag between harms being identified and companies being required to act upon them. It is therefore clear that there is a significant public value to publishing risk assessments.

Amendments 27 and 32 are almost identical to the suggested amendments to clause 8 that we discussed earlier. As my hon. Friend the Member for Pontypridd said in our discussion about amendments 25, 26 and 30, the duty to carry out a suitable and sufficient risk assessment could be significantly strengthened by preventing the creation of illegal content, not only preventing individuals from encountering it. I know the Minister responded to that point, but the Opposition did not think that response was fully satisfactory. This is just as important for children’s risk assessments as it is for illegal content risk assessments.

Online platforms are not just where abusive material is published. Sex offenders use mainstream web platforms and services as tools to commit child sexual abuse. This can be seen particularly in the livestreaming of child sexual exploitation. Sex offenders pay to direct and watch child sexual abuse in real time. The Philippines is a known hotspot for such abuse and the UK has been identified by police leads as the third-largest consumer of livestreamed abuse in the world. What a very sad statistic that our society is the third-largest consumer of livestreamed abuse in the world.

Ruby, not her real name, is a survivor of online sexual exploitation in the Philippines; she recently addressed a group of MPs about her experiences. She told Members how she was trafficked into sexual exploitation aged 16 after being tricked and lied to about the employment opportunities she thought she would be getting. She was forced to perform for paying customers online. Her story is harrowing. She said:

“I blamed myself for being trapped. I felt disgusted by every action I was forced to do, just to satisfy customers online. I lost my self-esteem and I felt very weak. I became so desperate to escape that I would shout whenever I heard a police siren go by, hoping somebody would hear me. One time after I did this, a woman in the house threatened me with a knife.”

Eventually, Ruby was found by the Philippine authorities and, after a four-year trial, the people who imprisoned her and five other girls were convicted. She said it took many years to heal from the experience, and at one point she nearly took her own life.

It should be obvious that if we are to truly improve child protection online we need to address the production of new child abuse material. In the Bill, we have a chance to address not only what illegal content is seen online, but how online platforms are used to perpetrate abuse. It should not be a case of waiting until the harm is done before taking action.

Chris Philp

As the hon. Lady said, we discussed in the groupings for clauses 8 and 9 quite a few of the broad principles relating to children, but I will none the less touch on some of those points again because they are important.

On amendment 27, under clause 8 there is already an obligation on platforms to put in place systems and processes to reduce the risk that their services will be used to facilitate the presence of illegal content. As that includes the risk of illegal content being present, including that produced via the service’s functionality, the terrible example that the hon. Lady gave is already covered by the Bill. She is quite right to raise that example, because it is terrible when such content involving children is produced, but such cases are expressly covered in the Bill as drafted, particularly in clause 8.

Amendment 32 covers a similar point in relation to search. As I said for the previous grouping, search does not facilitate the production of content; it helps people to find it. Clearly, there is already an obligation on search firms to stop people using search engines to find illegal content, so the relevant functionality in search is already covered by the Bill.

Amendments 15 and 16 would expressly require board member sign-off for risk assessments. I have two points to make on that. First, the duties set out in clause 10(6)(h) in relation to children’s risk assessments already require the governance structures to be properly considered, so governance is directly addressed. Secondly, subsection (2) states that the risk assessment has to be “suitable and sufficient”, so it cannot be done in a perfunctory or slipshod way. Again, Ofcom must be satisfied that those governance arrangements are appropriate. We could invent all the governance arrangements in the world, but the outcome needs to be delivered and, in this case, to protect children.

Beyond governance, the most important things are the sanctions and enforcement powers that Ofcom can use if those companies do not protect children. As the hon. Lady said in her speech, we know that those companies are not doing enough to protect children and are allowing all kinds of terrible things to happen. If those companies continue to allow those things to happen, the enforcement powers will be engaged, and they will be fined up to 10% of their global revenue. If they do not sort it out, they will find that their services are disconnected. Those are the real teeth that will ensure that those companies comply.

Barbara Keeley

I know that the Minister listened to Frances Haugen and to the members of charities. The charities and civil society organisations that are so concerned about this point do not accept that the Bill addresses it. I cannot see how his point addresses what I said about board-level acceptance of that role in children’s risk assessments. We need to change the culture of those organisations so that they become different from how they were described to us. He, like us, was sat there when we heard from the big platform providers, and they are not doing enough. He has had meetings with Frances Haugen; he knows what they are doing. It is good and welcome that the regulator will have the powers that he mentions, but that is just not enough.

Chris Philp

I agree with the hon. Lady that, as I said a second ago, those platforms are not doing enough to protect children. There is no question about that at all, and I think there is unanimity across the House that they are not doing enough to protect children.

I do not think the governance point is a panacea. Frankly, I think the boards of these companies are aware of what is going on. When these big questions arise, they go all the way up to Mark Zuckerberg. It is not as if Mark Zuckerberg and the directors of companies such as Meta are unaware of these risks; they are extremely aware of them, as Frances Haugen’s testimony made clear.

We do address the governance point. As I say, the risk assessments do need to explain how governance matters are deployed to consider these things—that is in clause 10(6)(h). But for me, it is the sanctions—the powers that Ofcom will have to fine these companies billions of pounds and ultimately to disconnect their service if they do not protect our children—that will deliver the result that we need.

Barbara Keeley

The Minister is talking about companies of such scale that even fines of billions will not hurt them. I refer him to the following wording in the amendments:

“a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the children’s risk assessment duties”.

That is the minimum we should be asking. We should be asking these platforms, which are doing so much damage and have had to be dragged to the table to do anything at all, to be prepared to appoint somebody who is responsible. The Minister tries to gloss over things by saying, “Oh well, they must be aware of it.” The named individual would have to be aware of it. I hope he understands the importance of his role and the Committee’s role in making this happen. We could make this happen.

Chris Philp

As I say, clause 10 already references the governance arrangements, but my strong view is that the only thing that will make these companies sit up and take notice—the only thing that will make them actually protect children in a way they are currently not doing—is the threat of billions of pounds of fines and, if they do not comply even after being fined at that level, the threat of their service being disconnected. Ultimately, that is the sanction that will make these companies protect our children.

--- Later in debate ---
The Chair

Barbara Keeley, do you have anything to add?

Barbara Keeley

All I have to add is the obvious point—I am sure that we are going to keep running into this—that people should not have to look to a transcript to see what the Minister’s and Parliament’s intention was. It is clear what the Opposition’s intention is—to protect children. I cannot see why the Minister will not specify who in an organisation should be responsible. It should not be a question of ploughing through transcripts of what we have talked about here in Committee; it should be obvious. We have the chance here to do something different and better. The regulator could specify a senior level.

Chris Philp

Clearly, we are legislating here to cover, as I think we said this morning, 25,000 different companies. They all have different organisational structures, different personnel and so on. To anticipate the appropriate level of decision making in each of those companies and put it in the Bill in black and white, in a very prescriptive manner, might not adequately reflect the range of people involved.

--- Later in debate ---

Division 10

Ayes: 7 (Labour: 5, Scottish National Party: 2)

Noes: 9 (Conservative: 9)

Barbara Keeley

I beg to move amendment 72, in clause 10, page 9, line 24, after “characteristic” insert “or characteristics”.

The Chair

With this it will be convenient to discuss the following:

Amendment 73, in clause 10, page 9, line 24, after “group” insert “or groups”.

Amendment 85, in clause 12, page 12, line 22, leave out subsection (d) and insert—

“(d) the level of risk of harm to adults presented by priority content that is harmful to adults which particularly affects individuals with certain characteristics or members of certain groups;”.

This amendment would recognise the intersectionality of harms.

Amendment 74, in clause 12, page 12, line 24, after “characteristic” insert “or characteristics”.

Amendment 75, in clause 12, page 12, line 24, after “group” insert “or groups”.

Amendment 71, in clause 83, page 72, line 12, at end insert—

“(1A) For each of the above risks, OFCOM shall identify and assess the level of risk of harm which particularly affects people with certain characteristics or membership of a group or groups.”

This amendment requires Ofcom as part of its risk register to assess risks of harm particularly affecting people with certain characteristics or membership of a group or groups.

Barbara Keeley

May I say—this might be a point of order—how my constituency name is pronounced? I get a million different versions, but it is Worsley, as in “worse”. It is an unfortunate name for a great place.

I will speak to all the amendments in the group together, because they relate to how levels of risk are assessed in relation to certain characteristics. The amendments are important because small changes to the descriptions of risk assessment will help to close a significant gap in protection.

Clauses 10 and 12 introduce a duty on regulated companies to assess harms to adults and children who might have an innate vulnerability arising from being a member of a particular group or having a certain characteristic. However, Ofcom is not required to assess harms to people other than children who have that increased innate vulnerability. Amendment 71 would require Ofcom to assess risks of harm particularly affecting people with certain characteristics or membership of a group or groups as part of its risk register. That would reduce the regulatory burden if companies had Ofcom’s risk assessment to base their work on.

Getting this right is important. The risk management regime introduced by the Bill should not assume that all people are at the same risk of harm—they are clearly not. Differences in innate vulnerability increase the incidence and impact of harm, such as by increasing the likelihood of encountering content or of that content being harmful, or heightening the impact of the harm.

It is right that the Bill emphasises the vulnerability of children, but there are other, larger groups with innate vulnerability to online harm. As we know, that often reflects structural inequalities in society.

For example, women will be harmed in circumstances where men might not be, and some harms may have a more serious impact on women than they would on men. A similar point can be made for people with other characteristics. Vulnerability is then compounded by intersectional issues—people might belong to more than one high-risk group—and I will come to that in a moment.

The initial Ofcom risk assessment introduced by clause 83 is not required to consider the heightened risks to different groups of people, but companies are required to assess that risk in their own risk assessments for children and adults. They need to be given direction by an assessment by Ofcom, which amendment 71 would require.

Amendments 72 to 75 address the lack of recognition in these clauses of intersectionality issues. They are small amendments in the spirit of the Bill’s risk management regime. As drafted, the Bill refers to a singular “group” or “characteristic” for companies to assess for risk. However, some people are subject to increased risks of harm arising from being members of more than one group. Companies’ risk assessments for children and adults should reflect intersectionality, and not just characteristics taken individually. Including the plural of “group” and “characteristic” in appropriate places would achieve that.

Kirsty Blackman

I will first speak to our amendment 85, which, like the Labour amendment, seeks to ensure that the Bill is crystal clear in addressing intersectionality. We need only consider the abuse faced by groups of MPs to understand why that is necessary. Female MPs are attacked online much more regularly than male MPs, and the situation is compounded if they have another minority characteristic. For instance, if they are gay or black, they are even more likely to be attacked. In fact, the MP who is most likely to be attacked is black and female. There are very few black female MPs, so it is not because of sheer numbers that they are at such increased risk of attack. Those with a minority characteristic are at higher risk of online harm, but the risk facing those with more than one minority characteristic is substantially higher, and that is what the amendment seeks to address.

I have spoken specifically about people being attacked on Twitter, Facebook and other social media platforms, but people in certain groups face an additional significant risk. If a young gay woman does not have a community around her, or if a young trans person does not know anybody else who is trans, they are much more likely to use the internet to reach out, to try to find people who are like them, to try to understand. If they are not accepted by their family, school or workplace, they are much more likely to go online to find a community and support—to find what is out there in terms of assistance—but using the internet as a vulnerable, at-risk person puts them at much more significant risk. This goes back to my earlier arguments about people requiring anonymity to protect themselves when using the internet to find their way through a difficult situation in which they have no role models.

It should not be difficult for the Government to accept this amendment. They should consider it carefully and understand that all of us on the Opposition Benches are making a really reasonable proposal. This is not about saying that someone with only one protected characteristic is not at risk; it is about recognising the intersectionality of risk and the fact that the risk faced by those who fit into more than one minority group is much higher than that faced by those who fit into just one. This is not about taking anything away from the Bill; it is about strengthening it and ensuring that organisations listen.

We have heard that a number of companies are not providing the protection that Members across the House would like them to provide against child sexual abuse. The governing structures, risk assessments, rules and moderation at those sites are better at ensuring that the providers make money than they are at providing protection. When regulated providers assess risk, it is not too much to ask them to consider not just people with one protected characteristic but those with multiple protected characteristics.

As MPs, we work on that basis every day. Across Scotland and the UK, we support our constituents as individuals and as groups. When protected characteristics intersect, we find ourselves standing in Parliament, shouting strongly on behalf of those affected and giving them our strongest backing, because we know that that intersection of harms is the point at which people are most vulnerable, in both the real and the online world. Will the Minister consider widening the provision so that it takes intersectionality into account and not only covers people with one protected characteristic but includes an over and above duty? I genuinely do not think it is too much for us to ask providers, particularly the biggest ones, to make this change.

The Chair

Barbara Keeley?

Barbara Keeley

I have nothing to add. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Clause 10 ordered to stand part of the Bill.

Clause 11

Safety duties protecting children

The Chair

We now come to amendment 95, tabled by the hon. Member for Upper Bann, who is not on the Committee. Does anyone wish to move the amendment? If not, we will move on.

Barbara Keeley

I beg to move amendment 29, in clause 11, page 10, line 20, at end insert—

“(c) prevent the sexual or physical abuse of a child by means of that service.”

This amendment establishes a duty to prevent the sexual or physical abuse of a child by means of a service.

The Chair

With this it will be convenient to discuss amendment 33, in clause 26, page 26, line 18, at end insert—

“(c) prevent the sexual or physical abuse of a child by means of that service.”

This amendment establishes a duty to prevent the sexual or physical abuse of a child by means of a service.

Barbara Keeley

The purpose of this clause is to ensure that children at risk of online harms are given protections from harmful, age-inappropriate content through specific children’s safety duties for user-to-user services likely to be accessed by children.

It is welcome that the Bill contains strong provisions to ensure that service providers act upon and mitigate the risks identified in the required risk assessment, and to introduce protective systems and processes to address what children encounter. This amendment aims to ensure that online platforms are proactive in their attempts to mitigate the opportunity for sex offenders to abuse children.

As we have argued with other amendments, there are missed opportunities in the Bill to be preventive in tackling the harm that is created. The sad reality is that online platforms create an opportunity for offenders to identify, contact and abuse children, and to do so in real time through livestreaming. We know there has been a significant increase in online sexual exploitation during the pandemic. With sex offenders unable to travel or have physical contact with children, online abuse increased significantly.

In 2021, UK law enforcement received a record 97,727 industry reports relating to online child abuse, a 29% increase on the previous year, which is shocking. An NSPCC freedom of information request to police forces in England and Wales last year showed that online grooming offences reached record levels in 2020-21, with the number of sexual communications with a child offences in England and Wales increasing by almost 70% in three years. There has been a deeply troubling trend in internet-facilitated abuse towards more serious sexual offences against children, and the average age of children in child abuse images, particularly girls, is trending to younger ages.

In-person contact abuse moved online because of the opportunity there for sex offenders to continue exploiting children. Sadly, they can do so with little fear of the consequences, because detection and disruption of livestreamed abuse is so low. The duty to protect children from sexual offenders abusing them in real time and livestreaming their exploitation cannot be limited to one part of the internet and tech sector. While much of the abuse might take place on the user-to-user services, it is vital that protections against such abuse are strengthened across the board, including in the search services, as set out in clause 26.

At the moment there is no list of harms in the Bill that must be prioritised by regulated companies. The NSPCC and others have suggested including a new schedule, similar to schedule 7, setting out what the primary priority harms should be. It would be beneficial for the purposes of parliamentary scrutiny for us to consider the types of priority harm that the Government intend the Bill to cover, rather than leaving that to secondary legislation. I hope the Minister will consider that and say why it has not yet been included.

To conclude, while we all hope the Bill will tackle the appalling abuse of children currently taking place online, this cannot be achieved without tackling the conditions in which these harms can take place. It is only by requiring that steps be taken across online platforms to limit the opportunities for sex offenders to abuse children that we can see the prevalence of this crime reduced.

Dame Maria Miller

I rise, hopefully to speak to clause 11 more generally—or will that be a separate stand part debate, Ms Rees?

--- Later in debate ---
Chris Philp

The Government obviously support the objective of these amendments, which is to prevent children from suffering the appalling sexual and physical abuse that the hon. Member for Worsley and Eccles South outlined in her powerful speech. It is shocking that these incidents have risen in the way that she described.

To be clear, that sort of appalling sexual abuse is covered in clause 9—which we have debated already—which covers illegal content. As Members would expect, child sexual abuse is defined as one of the items of priority illegal content, which are listed in more detail in schedule 6, where the offences that relate to sexual abuse are enumerated. As child sexual exploitation is a priority offence, services are already obliged through clause 9 to be “proactive” in preventing it from happening. As such, as Members would expect, the requirements contained in these amendments are already delivered through clause 9.

The hon. Member for Worsley and Eccles South also asked when we are going to hear what the primary priority harms to children might be. To be clear, those will not include the sexual exploitation offences, because, as Members would also expect, those are already in the Bill as priority illegal offences. The primary priority harms might include material promoting eating disorders and that kind of thing, which is not covered by the criminal matters—the illegal matters. I have heard the hon. Lady’s point that publishing that list, or at least a draft of it, would assist Parliament in scrutinising the Bill. I will take that point away and see whether there is anything we can do in that area. I am not making a commitment; I am just registering that I have heard the point and will take it away.

Barbara Keeley

I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss clause 26 stand part.